WO2023210702A1 - Information processing program, information processing device, and information processing method - Google Patents

Information processing program, information processing device, and information processing method

Info

Publication number
WO2023210702A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
underwater
brightness value
image
distance
Prior art date
Application number
PCT/JP2023/016503
Other languages
French (fr)
Japanese (ja)
Inventor
Yuko Ishiwaka
Tomohiro Yoshida
Original Assignee
SoftBank Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SoftBank Corp.
Publication of WO2023210702A1 publication Critical patent/WO2023210702A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated

Definitions

  • the present invention relates to an information processing program, an information processing device, and an information processing method.
  • Image recognition technology for identifying objects in images (still images or moving images) is conventionally known.
  • the present application has been made in view of the above, and aims to provide an information processing program, an information processing device, and an information processing method that can accurately estimate information regarding underwater objects from images.
  • The information processing program acquires reference information indicating the relationship between a reference brightness value, which is the brightness value in the object area occupied by the reference object in a reference image that includes an underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is the distance from the imaging device to the reference object; it also acquires a target image that includes the underwater target irradiated with light from the light source and is captured by the underwater imaging device. It then estimates the target distance, which is the distance from the imaging device to the target, based on a comparison between the target brightness value in the target object area and the reference information.
  • FIG. 1 is a diagram for explaining the color attenuation rate in water.
  • FIG. 2 is a diagram for explaining a reference information acquisition method according to the embodiment.
  • FIG. 3 is a diagram illustrating an example of reference information according to the embodiment.
  • FIG. 4 is a diagram for explaining the shape of a reference object used to obtain reference information according to the embodiment.
  • FIG. 5 is a diagram illustrating a configuration example of an information processing apparatus according to an embodiment.
  • FIG. 6 is a diagram illustrating an example of a target image according to the embodiment.
  • FIG. 7 is a diagram showing an overview of information processing according to a modification.
  • FIG. 8 is a diagram showing an overview of information processing according to a modification.
  • FIG. 9 is a hardware configuration diagram showing an example of a computer that implements the functions of the information processing device.
  • As a specific example, there is a method in which a machine learning model for image recognition is trained to estimate the number and size of fish from images.
  • As training data for the machine learning model, sets of an image of underwater fish (such as a school of fish in a cage) and correct data indicating the position and size of each fish in the image are used.
  • High-quality training data means that the correct data indicating the position and size of each fish in the image is accurate.
  • FIG. 1 is a diagram for explaining the rate of color attenuation in water.
  • The graph shown in FIG. 1 shows the relationship between the intensity of light at wavelengths corresponding to red, green, and blue (hereinafter also referred to as red light, green light, and blue light) in water and the distance from the measuring instrument that measures the light intensity to the light source.
  • The intensity of red light, green light, and blue light decreases as the distance from the measuring instrument to the light source increases. In particular, red light, which has a long wavelength, attenuates most rapidly.
  • For example, a method can be considered in which the position of a fish in the depth direction is detected using a sensor other than the imaging device, and the depth-direction position information of the fish in the water is thereby supplemented.
  • However, most ranging techniques that are useful in air cannot be used underwater.
  • For example, light from a LiDAR (Light Detection And Ranging) sensor travels a far shorter distance underwater than in air.
  • Ultrasonic sensors are suitable for finding schools of fish in the water, but their resolution is too low to count the number of fish in a school.
  • The information processing device 100 acquires reference information indicating the relationship between the brightness value (hereinafter also referred to as the reference brightness value) in the object area in which an object imitating the surface of a fish's body (hereinafter also referred to as the reference object) located underwater is imaged, and the distance from the imaging device to the reference object when the image was captured (hereinafter also referred to as the reference distance).
  • the information processing device 100 also acquires an image of a fish in the water (hereinafter also referred to as a target image).
  • The information processing device 100 then compares the brightness value (hereinafter also referred to as the target brightness value) in the area where the fish is imaged (hereinafter also referred to as the target object area) in the acquired target image with the acquired reference information, and based on the comparison, estimates the distance from the imaging device to the fish when the target image was captured (hereinafter also referred to as the target distance). Specifically, the information processing device 100 identifies, based on the reference information, the reference distance corresponding to the same reference brightness value as the target brightness value, and estimates the identified reference distance as the target distance.
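The following is a minimal sketch of that lookup, assuming the reference information is a sampled brightness-versus-distance curve as in FIG. 3; the curve values, the choice of the R channel, and the helper name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

# Reference information (FIG. 3 style): brightness of the reference object
# decays as the reference distance z grows. The values here are made up.
ref_distance = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])  # reference distances [m]
ref_brightness = np.array([230, 160, 95, 48, 20, 5])      # reference brightness values (R)

def estimate_target_distance(target_brightness: float) -> float:
    """Identify the reference distance whose reference brightness value
    matches the target brightness value (linear interpolation)."""
    # np.interp needs increasing x, so flip the decaying curve.
    return float(np.interp(target_brightness,
                           ref_brightness[::-1], ref_distance[::-1]))

# Mean brightness of the target object area occupied by one fish:
print(estimate_target_distance(120.0))  # ~1.3 m on this made-up curve
```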
  • Thereby, the information processing device 100 can accurately estimate the distance from the imaging device to a fish in the water. Further, since it can accurately estimate this distance, the information processing device 100 can, for example, provide highly accurate correct data regarding the position and size of each fish in an image. That is, the information processing device 100 can obtain high-quality training data to which high-quality correct data is added. Furthermore, since such high-quality training data can be used to train the image-recognition machine learning model that estimates information about fish in the water from images, learning accuracy can be improved. Therefore, the information processing device 100 can accurately estimate information regarding fish in the water from images.
  • the imaging target of the target image (hereinafter also referred to as a target object) is not limited to a fish in the water.
  • the object may be a living thing other than a fish.
  • When the target object is a living creature other than a fish, the reference object is replaced with an object that mimics the surface of that creature's body instead of an object that mimics the surface of a fish's body.
  • Moreover, the target object is not limited to living things; it may be a non-living object.
  • In that case, the reference object is replaced with an object that imitates the surface of the non-living object instead of an object that imitates a biological body surface.
  • FIG. 2 is a diagram for explaining a reference information acquisition method according to the embodiment.
  • FIG. 2 shows a state in which a reference object O1, a camera C1 (an example of an imaging device), and a light source L1 are located in water such as a fish cage.
  • the light source L1 irradiates light onto the underwater reference object O1.
  • Camera C1 captures an image (reference image) of the underwater reference object O1 irradiated with light from the light source L1. Letting z be the distance from the camera C1 to the underwater reference object O1 when the reference image is captured, the camera C1 images the reference object O1 while changing the value of z. That is, the camera C1 captures reference images of the underwater reference object O1 irradiated with light from the light source L1 while moving in a direction away from the reference object O1.
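As a rough sketch of turning this capture procedure into reference information, each reference image yields one sample pairing the reference distance z with the mean brightness of the object area; the OpenCV call and the mask source are assumptions for illustration.

```python
import cv2
import numpy as np

def reference_sample(image_path: str, mask: np.ndarray, z_m: float):
    """One reference-information sample: the reference distance z and the
    mean (R, G, B) reference brightness values inside the object area
    occupied by O1, given here as a binary mask."""
    img = cv2.imread(image_path)  # BGR image of the reference object O1
    b, g, r = (float(img[..., c][mask > 0].mean()) for c in range(3))
    return z_m, (r, g, b)
```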
  • the reference object O1 is planar (also referred to as plate-like), and the planar reference object O1 is shown from the side.
  • The direction in which the camera C1 faces the reference object O1 coincides with the normal direction of the reference object O1. That is, the surface of the reference object O1 is located directly in front of the camera C1.
  • The light source L1 is provided at the bottom of the housing of the camera C1. Note that the light source L1 may instead be provided at the top of the housing. The light source L1 is installed so that the direction in which it irradiates light matches the direction in which the camera C1 faces the reference object O1.
  • the light source L1 emits light of a wavelength corresponding to a color depending on the type of target fish.
  • the light source L1 may be a light with various color filters such as red, green, and blue.
  • FIG. 3 is a diagram showing an example of reference information according to the embodiment.
  • In the example of FIG. 3, a white light (white light), a light with a red filter (red light), a light with a blue filter (blue light), and a light with a violet filter (purple light) are used as the light source L1 shown in FIG. 2.
  • each graph shown in FIG. 3 corresponds to each reference information obtained by the method shown in FIG. 2.
  • the horizontal axis of the graph indicates the distance (z) from the camera C1 to the underwater reference object O1 when the reference image was captured.
  • The vertical axis of each graph indicates the brightness value of the reference image. More specifically, the solid line represents the brightness value of R (red), the broken line the brightness value of G (green), and the thick line the brightness value of B (blue).
  • From each graph shown in FIG. 3, the information processing device 100 can estimate, for example, that when the R brightness value of the target brightness value in the target object region where a fish is photographed in the target image is close to zero, the target distance from the imaging device to the fish when the target image was captured is 3 m or more. Furthermore, if that target distance is between 1 m and 2 m, the information processing device 100 can estimate the target distance based on, for example, a comparison with the R (red) brightness value shown by the solid line in the graph for the light with a red filter (red light).
  • FIG. 4 is a diagram for explaining the shape of a reference object used to obtain reference information according to the embodiment.
  • The actual reference object O1 has the shape of a truncated quadrangular pyramid without a bottom surface, as shown in FIG. 4. More specifically, the actual reference object O1 is composed of five surfaces, surface #1 to surface #5. The four surfaces #2 to #5 are provided so as to surround surface #1.
  • To the surface of each side, a material corresponding to the target fish species is pasted, such as fish scales (scale models or real scales), pseudo skin (aurora bald skin), color charts, or checkerboards.
  • surface #1 is a surface located directly in front of the camera C1.
  • the four surfaces #2 to #5 are installed at an angle with respect to surface #1. That is, the four surfaces #2 to #5 each have a different angle from the surface #1, which is located directly in front of the camera C1.
  • The reflectance of light differs depending on the angle of the surface on which the light is incident. For example, the reflectance from the surface of a fish's body is highest when the body is oriented directly sideways to the imaging device. By using the reference object O1 with the truncated quadrangular pyramid shape without a bottom surface shown in FIG. 4 when acquiring reference information, reference brightness values can be obtained for surfaces at several different angles.
  • Thereby, even if a fish's body is oriented in a direction other than directly sideways relative to the imaging device, the information processing device 100 can estimate the target distance from the imaging device to the fish based on the reference information obtained from the four surfaces #2 to #5.
  • FIG. 5 is a diagram illustrating a configuration example of the information processing device 100 according to the embodiment.
  • the information processing device 100 includes a communication section 110, a storage section 120, an input section 130, an output section 140, and a control section 150.
  • the communication unit 110 is realized by, for example, a NIC (Network Interface Card).
  • the communication unit 110 is connected to the network by wire or wirelessly, and transmits and receives information to and from a terminal device used by, for example, a fish manager.
  • the storage unit 120 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the storage unit 120 stores an information processing program according to the embodiment.
  • the input unit 130 receives various operations from the user.
  • the input unit 130 may receive various operations from the user via a display screen (for example, the output unit 140) using a touch panel function.
  • the input unit 130 may accept various operations from buttons provided on the information processing device 100 or a keyboard or mouse connected to the information processing device 100.
  • the output unit 140 is a display screen realized by, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display, and is a display device for displaying various information.
  • the output unit 140 displays various information under the control of the control unit 150. Note that when a touch panel is employed in the information processing apparatus 100, the input section 130 and the output section 140 are integrated. Furthermore, in the following description, the output unit 140 may be referred to as a screen.
  • The control unit 150 is a controller realized by, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like executing various programs stored in a storage device inside the information processing device 100 (corresponding to an example of the information processing program) with the RAM as a work area. The control unit 150 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the control unit 150 has an acquisition unit 151, an estimation unit 152, and an output control unit 153 as functional units, and may realize or execute the information processing operation described below.
  • the internal configuration of the control unit 150 is not limited to the configuration shown in FIG. 5, and may be any other configuration as long as it performs information processing to be described later.
  • each functional unit indicates a function of the control unit 150, and does not necessarily have to be physically distinct.
  • The acquisition unit 151 acquires reference information indicating the relationship between a reference brightness value, which is the brightness value in the object area occupied by the reference object in a reference image that includes an underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is the distance from the imaging device to the reference object.
  • Specifically, the acquisition unit 151 acquires reference information indicating the relationship between the reference brightness value and the reference distance in a reference image including a reference object irradiated with light of a wavelength corresponding to a color that differs for each fish species. For example, the acquisition unit 151 acquires the reference information shown by each graph in FIG. 3.
  • the acquisition unit 151 acquires reference information from a terminal device used by a user who has acquired the reference information through an experiment.
  • the acquisition unit 151 acquires a target image that includes an underwater fish irradiated with light from a light source and that is captured by an underwater imaging device. Specifically, the acquisition unit 151 acquires a target image including a fish irradiated with light of a wavelength corresponding to a color corresponding to a different fish species. For example, the acquisition unit 151 acquires a target image including a plurality of fish.
  • FIG. 6 is a diagram illustrating an example of a target image according to the embodiment. For example, the acquisition unit 151 acquires a target image G1 including a plurality of fish, as shown in FIG. 6. For example, the acquisition unit 151 acquires the target image from the imaging device that captured it.
  • The estimation unit 152 estimates the target distance, which is the distance from the imaging device to the fish, based on a comparison between the target brightness value, which is the brightness value in the target object area occupied by the fish in the target image, and the reference information. Specifically, the estimation unit 152 identifies, based on the reference information, the reference distance corresponding to the same reference brightness value as the target brightness value, and estimates the identified reference distance as the target distance.
  • The estimation unit 152 also estimates the size of the fish according to the estimated target distance. For example, suppose there is an image that includes two target object areas of approximately the same size but with different brightness values. At this time, since a larger (smaller) brightness value in a target object area means a shorter (longer) distance from the imaging device to the fish, the estimation unit 152 estimates that the fish corresponding to the area with the smaller brightness value is larger than the fish corresponding to the area with the larger brightness value.
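A minimal sketch of this size-from-distance step, assuming a simple pinhole-camera model with a focal length in pixels; the function name, focal length, and example values are illustrative assumptions, not the patent's stated formula.

```python
def estimate_fish_length(pixel_length: float,
                         target_distance_m: float,
                         focal_length_px: float = 1400.0) -> float:
    """Convert the apparent length of a target object area (in pixels)
    into a physical length (in meters) using the estimated distance."""
    return pixel_length * target_distance_m / focal_length_px

# Two fish with the same apparent size: the darker (more distant) one is larger.
print(estimate_fish_length(350, 1.0))  # near fish -> 0.25 m
print(estimate_fish_length(350, 2.0))  # far fish  -> 0.50 m
```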
  • The estimation unit 152 estimates the target distance from the imaging device to each of a plurality of fish based on a comparison between the target brightness value in each of the plurality of target object areas occupied by the plurality of fish in the target image and the reference information, and identifies each of the plurality of fish according to the estimated target distance from the imaging device to each fish. For example, assume a target image in which a plurality of different fish are photographed overlapping each other and which includes two target object areas with different brightness values.
  • At this time, since a larger (smaller) brightness value in a target object area means a shorter (longer) distance from the imaging device to the fish, the estimation unit 152 can estimate that the fish corresponding to the target object area with the larger brightness value is located in front of the fish corresponding to the target object area with the smaller brightness value. In other words, the estimation unit 152 can estimate that the fish corresponding to the area with the smaller brightness value is located further back. Furthermore, for fish whose shapes and sizes are known to some extent (for example, fish in a cage), it is obvious that different positions in three-dimensional space indicate different fish. Therefore, the estimation unit 152 can accurately distinguish an object near the imaging device (for example, a predetermined fish) from an object located far from the imaging device (for example, another fish different from the predetermined fish).
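A rough sketch of this separation, reusing estimate_target_distance from the earlier sketch; the per-fish regions and their brightness values are illustrative assumptions (a real system would first segment each target object area).

```python
regions = [
    {"id": 1, "mean_brightness": 180.0},  # brighter -> closer to the camera
    {"id": 2, "mean_brightness": 60.0},   # darker   -> farther away
]
for r in regions:
    r["distance_m"] = estimate_target_distance(r["mean_brightness"])

# Different estimated distances imply different individuals, even when the
# target object areas overlap in the 2-D image.
regions.sort(key=lambda r: r["distance_m"])
print(regions)  # id 1 (~0.86 m) listed before id 2 (~1.87 m)
```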
  • the output control unit 153 controls the estimation result estimated by the estimation unit 152 to be displayed on the screen.
  • the output control unit 153 may control the screen to display a numerical value indicating the target distance estimated by the estimating unit 152 in a superimposed manner on the target object area.
  • the information processing device 100 according to the embodiment described above may be implemented in various different forms other than the embodiment described above. Therefore, other embodiments of the information processing device 100 will be described below. Note that the same parts as those in the embodiment are given the same reference numerals and the description thereof will be omitted.
  • [4-1. Machine learning model] In the embodiment described above, a case has been described in which the estimation unit 152 estimates the distance from the imaging device to the fish based on the comparison between the brightness value in the target object area occupied by the fish in the target image and the reference information.
  • In the following modification, a case will be explained in which the estimation unit 152 uses a machine learning model to estimate the size and number of fish (that is, the fish count) included in the target image based on the brightness values in the target object areas occupied by the fish in the target image.
  • FIG. 7 is a diagram showing an overview of information processing according to a modification.
  • For example, the estimation unit 152 acquires a machine learning model M1 trained so that, when an image including fish is input, it outputs position information of each fish, information indicating its size, and each of the (R, G, B) brightness values. Next, the estimation unit 152 uses the machine learning model M1 to estimate the size and number of fish (that is, the fish count) included in the target image.
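As a sketch of how M1's outputs could feed the brightness-based distance lookup, the snippet below reuses estimate_target_distance and estimate_fish_length from the earlier sketches; FishDetection and all values are illustrative assumptions, and M1 itself (a trained detection network) is out of scope here.

```python
from dataclasses import dataclass

@dataclass
class FishDetection:
    x: float        # position of the fish in the image (pixels)
    y: float
    size_px: float  # apparent length of the target object area (pixels)
    r: float        # R brightness value of the area
    g: float        # G brightness value
    b: float        # B brightness value

# Hypothetical M1 outputs for one target image:
detections = [FishDetection(120, 80, 350, 160.0, 90.0, 40.0),
              FishDetection(400, 210, 350, 48.0, 30.0, 12.0)]

for d in detections:
    dist = estimate_target_distance(d.r)            # earlier sketch
    length = estimate_fish_length(d.size_px, dist)  # earlier sketch
    print(f"fish at ({d.x}, {d.y}): ~{dist:.2f} m away, ~{length:.2f} m long")
```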
  • FIG. 8 is a diagram showing an overview of information processing according to a modification.
  • For example, the estimation unit 152 acquires a machine learning model M2 trained so that, when an image including fish is input, it outputs position information of each fish, information indicating its size, each of the (R, G, B) brightness values, and position information of each point in a point cloud. Subsequently, the estimation unit 152 uses the machine learning model M2 to estimate the size and number of fish (that is, the fish count) included in the target image.
  • The acquisition unit 151 acquires a plurality of different pieces of reference information, each indicating the relationship between the reference brightness value and the reference distance in one of a plurality of different reference images, each of which includes a reference object irradiated with light of one of a plurality of different wavelengths corresponding to a plurality of different colors. Further, the acquisition unit 151 acquires a plurality of different target images, each including a target object irradiated with light of one of the plurality of different wavelengths at a different time within a predetermined period. The estimation unit 152 estimates the target distance based on comparisons between the target brightness value in each of the plurality of different target images and each of the plurality of different pieces of reference information.
  • Visibility in this specification refers to the distance that can be seen underwater in the horizontal direction, parallel to the bottom surface.
  • a white planar (also referred to as a plate-shaped) object (hereinafter referred to as a white board) is submerged at a predetermined depth (for example, 1 m depth) underwater.
  • the white board may be a disc with a diameter of 30 cm. Then, the distance between the eyes of a person located at a predetermined depth underwater (for example, 1 m depth, etc.) and the white board is gradually increased underwater in the horizontal direction with respect to the bottom surface. Then, the distance between the human eye and the white board when the white board is no longer visible to the human eye is defined as the underwater visibility.
  • Transparency refers to the distance that can be seen vertically when looking into the water from above the water surface. Specifically, the white board mentioned above is submerged in water. Then, the distance between the water surface and the white board is gradually increased in the direction perpendicular to the water surface. The distance between the water surface and the white board when the white board is no longer visible to the human eye is defined as the underwater transparency.
  • the reference object O1 shown in FIG. 2 is the above-mentioned white board.
  • A measuring device is prepared in which the distance between the white board, which is the reference object O1 shown in FIG. 2, and the light source L1 and camera C1 is fixed at a measurement distance (for example, 1 m). The measuring device is submerged underwater at various locations in each season (that is, in water corresponding to various degrees of visibility). Then, the camera C1 of the measuring device, submerged at a predetermined depth (for example, 1 m), captures a visibility reference image including the underwater white board irradiated with light from the light source L1. In addition, the visibility of the water when each visibility reference image is captured is measured experimentally. In this way, visibility basic information associating each visibility reference image with the measured visibility is obtained.
  • The acquisition unit 151 acquires the visibility basic information, for example, from a terminal device used by a user of the measuring device. Next, based on the visibility basic information, the acquisition unit 151 acquires visibility reference information indicating the relationship between the visibility reference brightness value, which is the brightness value in the area occupied by the white board in the visibility reference image, and the visibility of the water when the visibility reference image was captured.
  • On the day on which the visibility is to be measured (hereinafter referred to as the measurement date), the above measuring device is submerged at a predetermined depth (for example, 1 m) underwater. Then, the camera C1 of the submerged measuring device captures a target image including the underwater white board irradiated with light from the light source L1.
  • the acquisition unit 151 acquires a target image.
  • the acquisition unit 151 acquires a target image from a terminal device used by a user of the measuring device.
  • The estimation unit 152 estimates the visibility on the measurement date based on a comparison between the target brightness value, which is the brightness value in the target object area occupied by the white board in the target image acquired by the acquisition unit 151, and the visibility reference information.
  • For example, the estimation unit 152 estimates the visibility on the measurement date based on the visibility reference information corresponding to the predetermined depth (for example, 1 m). Thereby, the information processing device 100 can estimate the visibility based on quantified information, without relying on visual measurement.
  • Specifically, as the target brightness value in the target image, the estimation unit 152 uses the brightness value of at least one color among the R, G, and B brightness values in the target object area occupied by the white board, and estimates the visibility on the measurement date based on a comparison between that brightness value and the visibility reference information.
  • It is known that, among R (red), G (green), and B (blue), the attenuation rate of R (red) light in water is the highest, so R (red) light is attenuated first. It is also known that in water with high visibility, the attenuation rate of B (blue) light is the smallest, so B (blue) light remains until the end.
  • For example, the estimation unit 152 estimates the visibility on the measurement date based on a comparison between the target brightness value, which is the R (red) brightness value in the target object area occupied by the white board in the target image, and the visibility reference information. For example, based on the visibility reference information, the estimation unit 152 identifies the visibility corresponding to the R (red) visibility reference brightness value that is the same as that target brightness value, and estimates the identified visibility as the visibility on the measurement date.
  • Alternatively, the estimation unit 152 may estimate the visibility on the measurement date based on the brightness values of a plurality of colors in the target object area occupied by the white board in the target image. For example, the estimation unit 152 identifies the visibility corresponding to the B (blue) visibility reference brightness value that is the same as the B (blue) brightness value in the target object area occupied by the white board.
  • However, the water may contain chlorophyll from algae and the like, and such chlorophyll easily absorbs blue light.
  • In that case, the visibility estimated based only on the B (blue) brightness value may take a value that is actually impossible (for example, a visibility exceeding the size of the fish cage, such as 200 m).
  • Therefore, in such a case, the estimation unit 152 estimates the visibility on the measurement date based on the brightness value of another color in the target object area occupied by the white board.
  • For example, the estimation unit 152 identifies the visibility corresponding to the R (red) visibility reference brightness value that is the same as the target brightness value, which is the R (red) brightness value in the target object area occupied by the white board, and estimates the identified visibility as the visibility on the measurement date.
  • Alternatively, the estimation unit 152 may estimate the visibility on the measurement date based on the correlation between the brightness values of a plurality of colors in the target object area occupied by the white board in the target image, as the target brightness value. For example, based on the visibility reference information, the estimation unit 152 calculates the ratio of the brightness values of a plurality of colors at each visibility. For example, the estimation unit 152 calculates the ratio R:G:B of the R brightness value (hereinafter sometimes abbreviated as R), the G brightness value (hereinafter G), and the B brightness value (hereinafter B) at each visibility.
  • Then, the estimation unit 152 estimates the visibility on the measurement date based on a comparison between the R:G:B ratio in the target object area and the R:G:B ratio at each visibility calculated from the visibility reference information; for example, if the observed ratio is closest to the ratio at a visibility of 10 m, the visibility on the measurement date is estimated to be 10 m.
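A minimal sketch of this ratio-based estimate, assuming a small table of normalized R:G:B ratios per visibility level; the table values and the nearest-ratio criterion are illustrative assumptions.

```python
import numpy as np

# visibility [m] -> normalized (R, G, B) brightness ratio of the white board
visibility_ratios = {
    3.0:  np.array([0.55, 0.30, 0.15]),
    10.0: np.array([0.20, 0.45, 0.35]),
    20.0: np.array([0.05, 0.35, 0.60]),
}

def estimate_visibility(rgb: np.ndarray) -> float:
    """Pick the visibility whose reference R:G:B ratio is closest
    (Euclidean distance) to the observed ratio."""
    ratio = rgb / rgb.sum()
    return min(visibility_ratios,
               key=lambda v: np.linalg.norm(visibility_ratios[v] - ratio))

print(estimate_visibility(np.array([40.0, 95.0, 70.0])))  # -> 10.0
```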
  • In a further modification, when the visibility basic information is acquired, the transparency of the water on that day is measured in addition to the visibility, so that information associating the underwater visibility with the transparency when each visibility reference image is captured is obtained experimentally. In addition, the transparency on the measurement date is measured experimentally.
  • The acquisition unit 151 acquires the information associating underwater visibility with transparency, and the transparency on the measurement date, for example, from a terminal device used by a user of the measuring device. The estimation unit 152 then estimates the visibility on the measurement date from the transparency on the measurement date. More specifically, based on the information associating underwater visibility with transparency, the estimation unit 152 identifies the visibility associated with the same transparency as the transparency on the measurement date, and estimates the identified visibility as the visibility on the measurement date.
  • the attenuation rate of light in water differs depending on the amount of suspended matter such as chlorophyll in the water.
  • For example, the attenuation rate of each RGB color differs between water with little suspended matter and high visibility (for example, a visibility of 20 m) and water with much suspended matter and low visibility (for example, a visibility of 3 m). Therefore, even if fish of the same species are located at the same distance from the imaging device, the way a fish appears in water with little suspended matter and high visibility differs from the way it appears in water with much suspended matter and low visibility. For this reason, an attenuation rate map of each RGB color in water at each visibility level (hereinafter also referred to as reference basic information) is prepared in advance.
  • Specifically, to obtain the reference basic information, a reference image is captured at each distance from the imaging device to a reference object with scales attached (the reference object O1 shown in FIG. 2 above), at each depth (for example, 1 m, 2 m, 3 m, etc.) in water at each visibility level.
  • Each reference image for acquiring the reference basic information is captured using a reference object to which materials that differ for each fish species, such as scales, pseudo skin (aurora bald skin), color charts, or checkerboards, are pasted.
  • In addition, the visibility when each reference image is captured is measured experimentally.
  • The acquisition unit 151 acquires, as the reference basic information, information that associates the visibility when each reference image was captured, the depth at which it was captured, the distance from the imaging device to the reference object, and the R, G, and B brightness values in the area occupied by the reference object in the reference image.
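A sketch of the reference basic information as a lookup table keyed by (visibility, depth, distance); the table layout and all values are illustrative assumptions, not measured data.

```python
reference_basic_info = {
    # (visibility_m, depth_m, distance_m): (R, G, B) brightness values
    (20.0, 1.0, 1.0): (150.0, 170.0, 180.0),
    (20.0, 1.0, 2.0): (60.0, 120.0, 150.0),
    (3.0,  1.0, 1.0): (40.0,  70.0,  60.0),
    (3.0,  1.0, 2.0): (8.0,   25.0,  20.0),
}

def lookup_curve(visibility: float, depth: float):
    """Collect (distance, (R, G, B)) pairs for one visibility/depth slice."""
    return sorted((dist, rgb)
                  for (vis, dep, dist), rgb in reference_basic_info.items()
                  if vis == visibility and dep == depth)

print(lookup_curve(20.0, 1.0))
```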
  • The estimation unit 152 may estimate the visibility on the estimation date using the method described in [4-3. Estimation of visibility based on brightness value] above.
  • the species of fish kept in the fish tank are known. Further, it is assumed that the depth of the imaging device is known.
  • The acquisition unit 151 acquires the reference basic information corresponding to the visibility on the estimation date.
  • The acquisition unit 151 also acquires a target image captured in the fish tank on the estimation date.
  • The estimation unit 152 estimates the target distance, which is the distance from the imaging device to the fish, based on a comparison between the target brightness value, which is the brightness value in the target object area occupied by the fish in the target image, and the reference basic information.
  • Specifically, the estimation unit 152 estimates the target distance based on a comparison between the brightness value of at least one color among the R, G, and B brightness values in the target object area and the reference basic information. For example, if the visibility on the estimation date exceeds a first visibility, the estimation unit 152 estimates the target distance based on a comparison between the target brightness value, which is the R brightness value in the target object area of the target image, and the reference basic information.
  • For example, the estimation unit 152 identifies, based on the reference basic information, the distance from the imaging device to the reference object corresponding to the same R brightness value as the target brightness value, and estimates the identified distance as the target distance. When the visibility on the estimation date is equal to or lower than the first visibility, the estimation unit 152 estimates the target distance based on the brightness values of a plurality of colors in the target object area. For example, the estimation unit 152 estimates the target distance based on a comparison between the ratio of the brightness values of a plurality of colors at each distance derived from the reference basic information and the ratio of those brightness values in the target object area of the target image. In this way, the estimation unit 152 may estimate the distance from the imaging device to the fish according to the visibility of the water.
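A rough sketch of this visibility-dependent strategy: match the R channel alone in clear water, and fall back to the normalized R:G:B ratio in turbid water. The threshold (the patent's "first visibility" is left unspecified) and the curve values are assumptions for illustration.

```python
import numpy as np

FIRST_VISIBILITY_M = 5.0  # assumed threshold for the "first visibility"

# (distance, (R, G, B)) reference entries for one visibility/depth slice,
# e.g. produced by lookup_curve() in the previous sketch; values are made up.
curve = [(1.0, (150.0, 170.0, 180.0)),
         (2.0, (60.0, 120.0, 150.0)),
         (3.0, (20.0, 80.0, 120.0))]

def estimate_distance(rgb, visibility, curve):
    dists = np.array([d for d, _ in curve])
    refs = np.array([c for _, c in curve])
    if visibility > FIRST_VISIBILITY_M:
        # Clear water: R decays steeply, so match the R channel directly.
        return float(np.interp(rgb[0], refs[::-1, 0], dists[::-1]))
    # Turbid water: match the normalized R:G:B ratio instead.
    ratio = np.asarray(rgb) / sum(rgb)
    ref_ratios = refs / refs.sum(axis=1, keepdims=True)
    return float(dists[np.argmin(np.linalg.norm(ref_ratios - ratio, axis=1))])

print(estimate_distance((100.0, 140.0, 165.0), visibility=20.0, curve=curve))  # ~1.56 m
print(estimate_distance((30.0, 70.0, 90.0), visibility=3.0, curve=curve))      # 2.0 m
```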
  • the information processing apparatus 100 includes the acquisition section 151 and the estimation section 152.
  • The acquisition unit 151 acquires reference information indicating the relationship between a reference brightness value, which is the brightness value in the object area occupied by the reference object in a reference image that includes an underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is the distance from the imaging device to the reference object; the acquisition unit 151 also acquires a target image that includes an underwater target object irradiated with light from the light source and is captured by the underwater imaging device.
  • The estimation unit 152 estimates the target distance, which is the distance from the imaging device to the target object, based on a comparison between the target brightness value, which is the brightness value in the target object area occupied by the target object in the target image, and the reference information.
  • the information processing device 100 can accurately estimate the distance from the imaging device to the underwater object.
  • Further, since the information processing device 100 can accurately estimate the distance from the imaging device to an underwater object, it can, for example, provide high-quality correct data regarding the position and size of the target object in an image. That is, the information processing device 100 can obtain high-quality training data to which high-quality correct data is added.
  • Furthermore, since the information processing device 100 can obtain high-quality training data, that training data can be used to train an image-recognition machine learning model that estimates information about underwater objects from images, improving learning accuracy. Therefore, the information processing device 100 can accurately estimate information regarding underwater objects from images.
  • Since the information processing device 100 can accurately estimate information about underwater objects from images, it can contribute to achieving Goal 9 of the Sustainable Development Goals (SDGs), "Create a foundation for industry and technological innovation."
  • Further, since the information processing device 100 can accurately estimate information about fish in the water from images, it can contribute to achieving Goal 14 of the SDGs, "Let's protect the richness of the oceans."
  • the estimation unit 152 estimates the size of the target object according to the estimated target distance.
  • Thereby, the information processing device 100 can accurately estimate the size of an underwater object from an image. For example, assume an image that includes two target object areas of approximately the same size but with different brightness values. At this time, since a larger (smaller) brightness value in a target object area means a shorter (longer) distance from the imaging device to the target object, the information processing device 100 can estimate that the target object corresponding to the area with the smaller brightness value is larger than the target object corresponding to the area with the larger brightness value.
  • In this way, the information processing device 100 can estimate with high accuracy both the size of a small target object near the imaging device (for example, a predetermined fish) and the size of a large target object located far from the imaging device (for example, another fish different from the predetermined fish).
  • the acquisition unit 151 acquires a target image including a plurality of targets.
  • The estimation unit 152 estimates the target distance from the imaging device to each of a plurality of target objects based on a comparison between the target brightness value in each of the plurality of target object areas occupied by the plurality of target objects in the target image and the reference information, and identifies each of the plurality of target objects according to the estimated target distance from the imaging device to each of them.
  • Thereby, the information processing device 100 can accurately identify multiple underwater objects from an image. For example, assume an image in which a plurality of different target objects overlap and which includes two target object areas with different brightness values. At this time, since a larger (smaller) brightness value in a target object area means a shorter (longer) distance from the imaging device to the target object, the information processing device 100 can estimate that the target object corresponding to the area with the larger brightness value is located in front of the target object corresponding to the area with the smaller brightness value.
  • the information processing device 100 estimates that the position of the object corresponding to the object area with a small brightness value is located further back than the position of the object corresponding to the object area with a large brightness value. be able to. In this way, the information processing device 100 can determine the position of an object near the imaging device (for example, a predetermined fish) and the position of an object far away from the imaging device (for example, another fish different from the predetermined fish). The positions of each can be identified with high accuracy. Furthermore, in the case of objects whose shapes and sizes are known to some extent (for example, fish in a cage), different positions in three-dimensional space mean that the objects are different. Therefore, the information processing device 100 accurately identifies an object near the imaging device (for example, a predetermined fish) and an object located far from the imaging device (for example, another fish different from the predetermined fish). can do.
  • Further, the acquisition unit 151 acquires reference information indicating the relationship between the reference brightness value and the reference distance in a reference image including a reference object irradiated with light of a wavelength corresponding to a color suited to the target object, and acquires a target image including a target object irradiated with light of a wavelength corresponding to that color.
  • the estimation unit 152 estimates the target distance based on a comparison between the target brightness value in the target image and reference information.
  • Thereby, by emitting light that the fish prefers, the information processing device 100 can prevent the fish from escaping from the imaging range while the imaging device is capturing an image, and can therefore obtain an appropriate target image containing the illuminated fish. Furthermore, since it can acquire an appropriate target image, the information processing device 100 can accurately estimate the target distance.
  • The acquisition unit 151 also acquires a plurality of different pieces of reference information, each indicating the relationship between the reference brightness value and the reference distance in one of a plurality of different reference images, each of which includes a reference object irradiated with light of one of a plurality of different wavelengths corresponding to a plurality of different colors, and acquires a plurality of different target images, each including a target object irradiated with light of one of the plurality of different wavelengths at a different time within a predetermined period.
  • the estimation unit 152 estimates the target distance based on a comparison between the target brightness value in each of a plurality of different target images and each of a plurality of different reference information.
  • Thereby, the information processing device 100 can irradiate a target object with lights of a plurality of different wavelengths (that is, lights of different colors) within a very short period. That is, even if the target object is a fish and different fish species prefer different colors, the information processing device 100 can prevent the fish from escaping from the imaging range while the imaging device captures images, and can therefore obtain appropriate target images including the illuminated fish. Furthermore, since it can acquire appropriate target images, the information processing device 100 can accurately estimate the target distance.
  • the light source is provided at the top or bottom of the casing of the imaging device.
  • Thereby, for example, the direction in which the imaging device faces the target object and the direction in which the light source irradiates the target object can be aligned, making image capture easier.
  • the reference object is a fish scale of a specific fish species or a model of a fish scale, or an object covered with pseudo-skin, a color chart, or a checkerboard.
  • the object is a specific species of fish.
  • the information processing device 100 can acquire appropriate reference information that differs for each species of fish, and therefore can accurately estimate information regarding fish in the water from the image.
  • The acquisition unit 151 also acquires visibility reference information indicating the relationship between a visibility reference brightness value, which is the brightness value in the object area occupied by the reference object in a visibility reference image that includes an underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and the underwater visibility; it further acquires a target image that includes an underwater target object irradiated with light from the light source and is captured by the underwater imaging device.
  • The estimation unit 152 estimates the visibility of the water when the target image was captured, based on a comparison between the brightness value of at least one color among the brightness values of a plurality of colors in the target object area occupied by the target object in the target image and the visibility reference information.
  • the information processing device 100 can estimate the visibility of the water with high accuracy, so that, for example, the distance from the imaging device to the underwater object can be estimated with more accuracy.
  • the estimation unit 152 estimates the target distance based on the estimated underwater visibility.
  • the information processing device 100 can estimate the distance from the imaging device to the underwater object with higher accuracy.
  • the information processing device 100 includes an acquisition unit 151 and an estimation unit 152.
  • The acquisition unit 151 acquires visibility reference information indicating the relationship between a visibility reference brightness value, which is the brightness value in the object area occupied by the reference object in a visibility reference image that includes an underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and the underwater visibility, and acquires a target image that includes an underwater target object irradiated with light from the light source and is captured by the underwater imaging device.
  • The estimation unit 152 estimates the visibility of the water when the target image was captured, based on a comparison between the brightness value of at least one color among the brightness values of a plurality of colors in the target object area occupied by the target object in the target image and the visibility reference information.
  • the information processing device 100 can estimate the visibility of the water with high accuracy, so that, for example, the distance from the imaging device to the underwater object can be estimated with more accuracy.
  • Further, the estimation unit 152 estimates the visibility of the water when the target image was captured, based on a comparison between the brightness value of one color in the target object area occupied by the target object in the target image and the visibility reference information.
  • Thereby, the information processing device 100 can estimate the visibility appropriately according to the visibility conditions of the water.
  • Alternatively, the estimation unit 152 estimates the visibility of the water when the target image was captured, based on a comparison between the brightness values of a plurality of colors in the target object area occupied by the target object in the target image and the visibility reference information.
  • Thereby, the information processing device 100 can estimate the visibility appropriately according to the visibility conditions of the water.
  • FIG. 9 is a hardware configuration diagram showing an example of a computer that implements the functions of the information processing device 100.
  • the computer 1000 includes a CPU 1100, a RAM 1200, a ROM 1300, an HDD 1400, a communication interface (I/F) 1500, an input/output interface (I/F) 1600, and a media interface (I/F) 1700.
  • the CPU 1100 operates based on a program stored in the ROM 1300 or the HDD 1400 and controls each part.
  • the ROM 1300 stores a boot program executed by the CPU 1100 when the computer 1000 is started, programs depending on the hardware of the computer 1000, and the like.
  • the HDD 1400 stores programs executed by the CPU 1100 and data used by the programs.
  • Communication interface 1500 receives data from other devices via a predetermined communication network and sends it to CPU 1100, and transmits data generated by CPU 1100 to other devices via a predetermined communication network.
  • the CPU 1100 controls output devices such as a display and printer, and input devices such as a keyboard and mouse via the input/output interface 1600.
  • CPU 1100 obtains data from an input device via input/output interface 1600. Further, CPU 1100 outputs the generated data to an output device via input/output interface 1600.
  • the media interface 1700 reads programs or data stored in the recording medium 1800 and provides them to the CPU 1100 via the RAM 1200.
  • CPU 1100 loads this program from recording medium 1800 onto RAM 1200 via media interface 1700, and executes the loaded program.
  • the recording medium 1800 is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or a PD (Phase change rewritable disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory. etc.
  • the CPU 1100 of the computer 1000 realizes the functions of the control unit 150 by executing a program loaded onto the RAM 1200.
  • the CPU 1100 of the computer 1000 reads these programs from the recording medium 1800 and executes them, but as another example, these programs may be acquired from another device via a predetermined communication network.
  • each component of each device shown in the drawings is functionally conceptual, and does not necessarily need to be physically configured as shown in the drawings.
  • The specific form of distribution and integration of each device is not limited to what is shown in the drawings, and all or part of the devices can be functionally or physically distributed or integrated in arbitrary units depending on various loads and usage conditions.
  • Further, the information processing device 100 described above may be realized by a plurality of server computers, and depending on the function, may be realized by calling an external platform or the like using an API (Application Programming Interface), network computing, or the like; its configuration can thus be changed flexibly.
  • 100 Information processing device, 110 Communication unit, 120 Storage unit, 130 Input unit, 140 Output unit, 150 Control unit, 151 Acquisition unit, 152 Estimation unit, 153 Output control unit


Abstract

This information processing program causes a computer to execute: an acquisition procedure for acquiring reference information and a target image, the reference information indicating a relationship between a reference luminance value, which is the luminance value of the object region occupied by the reference object in a reference image that includes an underwater reference object irradiated with light from a light source and that was captured by an underwater image capture device, and a reference distance, which is the distance from the image capture device to the reference object, and the target image including an underwater target object irradiated with light from the light source and captured by the underwater image capture device; and an estimation procedure for estimating a target distance, which is the distance from the image capture device to the target object, on the basis of a comparison between the reference information and a target luminance value, which is the luminance value of the target object region occupied by the target object in the target image.

Description

Information processing program, information processing device, and information processing method
The present invention relates to an information processing program, an information processing device, and an information processing method.
Image recognition technology for identifying objects appearing in images (still images or moving images) is conventionally known. For example, to improve fish farming techniques, a technique is known that estimates the number of fish in a cage (hereinafter also referred to as the fish count) by capturing images of the fish in the cage with a camera and analyzing the captured images.
International Publication No. 2019/045091
There is a need for technology that can accurately estimate information about underwater objects from images.
The present application has been made in view of the above, and aims to provide an information processing program, an information processing device, and an information processing method that can accurately estimate information about underwater objects from images.
The information processing program according to the present application causes a computer to execute: an acquisition procedure for acquiring (i) reference information indicating the relationship between a reference luminance value, which is the luminance value of the object region occupied by a reference object in a reference image that includes the underwater reference object irradiated with light from a light source and that was captured by an underwater imaging device, and a reference distance, which is the distance from the imaging device to the reference object, and (ii) a target image that includes an underwater target object irradiated with light from the light source and that was captured by the underwater imaging device; and an estimation procedure for estimating a target distance, which is the distance from the imaging device to the target object, based on a comparison between a target luminance value, which is the luminance value of the target object region occupied by the target object in the target image, and the reference information.
According to one aspect of the embodiments, it is possible to accurately estimate information about an underwater object from an image.
FIG. 1 is a diagram for explaining the color attenuation rate in water.
FIG. 2 is a diagram for explaining a method of acquiring reference information according to the embodiment.
FIG. 3 is a diagram showing an example of reference information according to the embodiment.
FIG. 4 is a diagram for explaining the shape of a reference object used to acquire reference information according to the embodiment.
FIG. 5 is a diagram showing a configuration example of the information processing device according to the embodiment.
FIG. 6 is a diagram showing an example of a target image according to the embodiment.
FIG. 7 is a diagram showing an overview of information processing according to a modification.
FIG. 8 is a diagram showing an overview of information processing according to a modification.
FIG. 9 is a hardware configuration diagram showing an example of a computer that implements the functions of the information processing device.
Hereinafter, modes for implementing the information processing program, information processing device, and information processing method according to the present application (hereinafter referred to as "embodiments") will be described in detail with reference to the drawings. Note that these embodiments do not limit the information processing program, information processing device, or information processing method according to the present application. In the following embodiments, the same parts are given the same reference numerals, and redundant description is omitted.
(Embodiment)
[1. Introduction]
In recent years, fish farming has attracted attention as a means of addressing global food problems. To supply high-quality fish through aquaculture, it is important to accurately know the number of fish (fish count) and the size of the fish, both of which are closely related to feeding.
However, in the special underwater environment of an aquaculture farm, it can be difficult to apply IT technologies developed for use on land. Conventionally, therefore, a person would scoop up some of the fish with a net, measure the number and size of the fish, and then count the fish visually. This method places a heavy burden on the fish and lacks accuracy.
In recent years, therefore, methods that use computer vision to automatically detect the fish count and fish size from images of schools of fish in a cage have been attracting attention. A specific example is training a machine learning model for image recognition to estimate the fish count and fish size from images. However, to estimate these accurately with a machine learning model, the model must be trained on a large amount of high-quality training data. For example, the training data for such a model consists of pairs of an image of underwater fish, such as a school of fish in a cage, and correct-answer data indicating the position and size of each fish in the image. In this case, high-quality training data means that the correct-answer data indicating the positions and sizes of the fish in the image is highly accurate.
It is generally known that light propagating through water is attenuated by scattering and absorption and therefore does not travel far. For example, within visible light (wavelengths of about 380 to 780 nm), there is an absorption band particularly in the red-to-orange wavelength range, where energy is absorbed. For this reason, red light is known to be almost 100% lost at a depth of 10 m. It is also known that about 50% of the energy of blue light is absorbed at a depth of 20 m. Because the red component is lost quickly in water, water appears blue.
FIG. 1 is a diagram for explaining the rate of color attenuation in water. The graph in FIG. 1 shows the relationship between the intensity of light at wavelengths corresponding to red, green, and blue (hereinafter also referred to as red light, green light, and blue light) in water and the distance from the light source to the measuring instrument that measures the light intensity. As shown in FIG. 1, the intensities of red, green, and blue light all decrease as the distance from the measuring instrument to the light source increases. The graph also shows that red light (longer-wavelength light) attenuates especially quickly compared with green and blue light.
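As a rough illustration of this trend, attenuation in water is often approximated by an exponential (Beer-Lambert-style) decay per color channel. The following Python sketch assumes that model; the attenuation coefficients are hypothetical placeholders chosen only to mimic the qualitative behavior of FIG. 1, not measured values.

```python
import numpy as np

# Hypothetical per-metre attenuation coefficients, chosen only to reproduce
# the qualitative trend of FIG. 1 (red decays fastest, blue slowest).
K = {"R": 0.50, "G": 0.12, "B": 0.06}

def intensity(color: str, distance_m: float, i0: float = 255.0) -> float:
    """Intensity of one colour channel after travelling distance_m metres,
    using a Beer-Lambert-style exponential decay I(z) = I0 * exp(-k * z)."""
    return i0 * float(np.exp(-K[color] * distance_m))

for z in (1, 5, 10, 20):
    print(z, {c: round(intensity(c, z), 1) for c in K})
```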
As described above, light propagating through water attenuates easily, so an object in an image captured underwater is generally harder to see than the same object in an image captured in air. Consequently, compared with recognizing the position and size of an object from an image captured in air, estimating the position and size of an object from an image captured underwater is difficult. In particular, it is very difficult to accurately estimate the depth-direction position of the object (that is, the distance from the imaging device to the underwater object) from such an image. As a result, it is difficult to attach high-quality correct-answer data about the positions and sizes of objects to images of underwater objects, and therefore difficult to obtain high-quality training data.
There is therefore a need for technology that accurately recognizes the depth-direction position of fish in an image (that is, the distance from the imaging device to the underwater fish). One conceivable approach is to detect the depth-direction position of the fish with a sensor other than the imaging device and use it to complement the depth information. However, most techniques that are useful in air cannot be used underwater. For example, the light of a LiDAR (Light Detection And Ranging) sensor travels only a very short distance in water compared with air. Ultrasonic sensors are suitable for finding schools of fish in the water, but their resolution is too low for counting the fish in a school.
In contrast, the information processing device 100 according to the embodiment acquires reference information indicating the relationship between a reference luminance value, which is the luminance value of the object region in which a reference object (an object imitating the surface of a fish's body) located underwater appears in a captured image, and a reference distance, which is the distance from the imaging device to the reference object when the image was captured. The information processing device 100 also acquires an image of fish in the water (hereinafter also referred to as a target image). Then, based on a comparison between the target luminance value, which is the luminance value of the region in which a fish appears in the acquired target image (hereinafter also referred to as the target object region), and the acquired reference information, the information processing device 100 estimates the distance from the imaging device to the fish when the target image was captured (hereinafter also referred to as the target distance). Specifically, the information processing device 100 identifies, based on the reference information, the reference distance corresponding to the reference luminance value equal to the target luminance value, and estimates the identified reference distance as the target distance.
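A minimal sketch of this matching rule follows, assuming the reference information is available as a table of (reference distance, reference luminance) pairs in which luminance decreases monotonically with distance. The table values and the use of linear interpolation between entries are illustrative assumptions, not values taken from the embodiment.

```python
import numpy as np

# Hypothetical reference information: luminance decreases with distance.
ref_dist = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])          # metres
ref_lum = np.array([230.0, 180.0, 120.0, 70.0, 35.0, 10.0])  # 8-bit luminance

def estimate_target_distance(target_lum: float) -> float:
    """Return the reference distance whose reference luminance matches the
    target luminance, interpolating linearly between table entries."""
    # np.interp requires increasing x values, so reverse both arrays.
    return float(np.interp(target_lum, ref_lum[::-1], ref_dist[::-1]))

print(estimate_target_distance(100.0))  # 1.7 m for this table
```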
Thereby, the information processing device 100 can accurately estimate the distance from the imaging device to fish in the water. Because it can do so, it becomes possible, for example, to attach high-quality correct-answer data about the positions and sizes of fish in an image to images of underwater fish. That is, the information processing device 100 can obtain high-quality training data with high-quality correct-answer data attached. Furthermore, with such training data, the learning accuracy of a machine learning model for image recognition that estimates information about underwater fish from images can be improved. The information processing device 100 can therefore accurately estimate information about underwater fish from images.
Although the above description assumes that the target image is an image of fish in the water, the imaging target of the target image (hereinafter also referred to as the target object) is not limited to underwater fish. For example, the target object may be a living creature other than a fish, in which case the reference object is replaced by an object imitating the body surface of that creature instead of a fish's body surface. The target object is also not limited to living things; it may be a non-living object, in which case the reference object is replaced by an object imitating the surface of that non-living object.
[2. How to obtain reference information]
FIG. 2 is a diagram for explaining a method of acquiring reference information according to the embodiment. FIG. 2 shows a reference object O1, a camera C1 (an example of an imaging device), and a light source L1 located in water such as a fish cage. The light source L1 irradiates the underwater reference object O1 with light, and the camera C1 captures an image (reference image) of the reference object O1 so illuminated. Letting z be the distance from the camera C1 to the underwater reference object O1 when a reference image is captured, the camera C1 images the illuminated reference object O1 while varying the value of z. That is, the camera C1 captures reference images of the illuminated underwater reference object O1 while moving away from it.
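One possible way to assemble such reference information from this sweep is sketched below; it assumes each capture provides the distance z, the image, and a mask of the object region, and the `captures` input is a hypothetical stand-in for the actual capture pipeline.

```python
import numpy as np

def build_reference_info(captures):
    """captures: iterable of (z_metres, HxWx3 uint8 image, HxW bool mask
    marking the object region). Returns distances and the mean (R, G, B)
    luminance of the object region at each distance, sorted by distance."""
    zs, lums = [], []
    for z, img, mask in captures:
        zs.append(z)
        lums.append(img[mask].mean(axis=0))  # mean (R, G, B) over the region
    order = np.argsort(zs)
    return np.asarray(zs)[order], np.asarray(lums)[order]
```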
In general, it is known that the reflectance of light from a fish's surface differs depending on the type of scale. Therefore, fish scales matching the target fish species (scale models or real scales) are attached to the surface of the reference object O1. In addition, the bodies of many fish are laterally flattened, so the reflection of light from a fish's surface is considered similar to reflection from a planar object. In the example shown in FIG. 2, the reference object O1 is therefore planar (also described as plate-like), and FIG. 2 shows the planar reference object O1 viewed from the side. In FIG. 2, the direction in which the camera C1 faces the reference object O1 coincides with the normal direction of the reference object O1; that is, the surface of the reference object O1 is directly in front of the camera C1.
In FIG. 2, the light source L1 is provided at the bottom of the housing of the camera C1; it may instead be provided at the top of the housing. The light source L1 is installed so that the direction in which it emits light coincides with the direction in which the camera C1 faces the reference object O1.
It is also generally known that the color of light that fish prefer (or dislike) differs by species. Therefore, the light source L1 emits light of a wavelength corresponding to a color suited to the target fish species. Specifically, the light source L1 may be a light fitted with color filters such as red, green, and blue.
FIG. 3 is a diagram showing an example of reference information according to the embodiment. FIG. 3 shows four sets of reference information obtained when a white light (white light), a light with a red filter (red light), a light with a blue filter (blue light), and a light with a violet filter (violet light) were each used as the light source L1 shown in FIG. 2.
Each graph shown in FIG. 3 corresponds to one set of reference information acquired by the method shown in FIG. 2. The horizontal axis of each graph indicates the distance (z) from the camera C1 to the underwater reference object O1 when the reference image was captured, and the vertical axis indicates the luminance value of the reference image. More specifically, the solid line in each graph shows the R (red) luminance value, the broken line the G (green) luminance value, and the thick line the B (blue) luminance value. From these graphs, the information processing device 100 can, for example, estimate that when the R luminance value of the target object region in which a fish appears is close to zero, the target distance from the imaging device to the fish is 3 m or more. Likewise, when the target distance from the imaging device to the fish is between 1 m and 2 m, the information processing device 100 can estimate the target distance based on a comparison with, for example, the R (red) luminance values shown by the solid line in the graph for the light with the red filter (red light).
FIG. 4 is a diagram for explaining the shape of a reference object used to acquire reference information according to the embodiment. In FIG. 2, for simplicity, the reference object O1 was described as planar, but the actual reference object O1 has the shape of a truncated quadrangular pyramid without its base, as shown in FIG. 4. More specifically, it consists of five faces, face #1 to face #5, with the four faces #2 to #5 arranged so as to surround face #1. As in FIG. 2, the surface of each face carries material appropriate to the target fish species (scale models or real scales, imitation skin (aurora skin), a color chart, a checkerboard, or the like). Face #1 in FIG. 4 corresponds to the reference object O1 in FIG. 2; specifically, it is the face located directly in front of the camera C1. Faces #2 to #5 are set at an inclination with respect to face #1; that is, each has a different angle from face #1 as seen from the camera C1. In general, the reflectance of light differs depending on the angle of the surface on which it is incident. For example, when a fish's body is oriented directly sideways to the imaging device, the reflectance from the fish's surface is highest. By using a reference object O1 shaped like a truncated quadrangular pyramid without its base, as shown in FIG. 4, reference information can be acquired for each face at a different angle to the camera C1. Thereby, even when a fish's body is oriented other than directly sideways to the imaging device, the information processing device 100 can estimate the target distance from the imaging device to the fish based on the reference information obtained from faces #2 to #5.
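The following sketch illustrates one way the per-face reference information could be organized and selected, assuming each face's angle to the camera is known and the fish's orientation can be estimated separately; all angles and table values are hypothetical.

```python
import numpy as np

# One hypothetical reference table per face angle of the FIG. 4 object:
# (distances in metres, luminance values at those distances).
face_refs = {
    0: (np.array([1.0, 2.0, 3.0]), np.array([180.0, 90.0, 20.0])),   # face #1, head-on
    30: (np.array([1.0, 2.0, 3.0]), np.array([140.0, 65.0, 12.0])),  # faces #2-#5, tilted
}

def reference_for_orientation(fish_angle_deg: float):
    """Pick the reference table whose face angle is closest to the fish's
    estimated orientation relative to the camera."""
    key = min(face_refs, key=lambda a: abs(a - fish_angle_deg))
    return face_refs[key]
```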
[3. Configuration of the information processing device]
FIG. 5 is a diagram showing a configuration example of the information processing device 100 according to the embodiment. The information processing device 100 includes a communication unit 110, a storage unit 120, an input unit 130, an output unit 140, and a control unit 150.
(Communication unit 110)
The communication unit 110 is realized by, for example, a NIC (Network Interface Card). The communication unit 110 is connected to a network by wire or wirelessly, and transmits and receives information to and from, for example, a terminal device used by a manager who manages the fish.
(Storage unit 120)
The storage unit 120 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or flash memory, or a storage device such as a hard disk or an optical disk. For example, the storage unit 120 stores the information processing program according to the embodiment.
(Input unit 130)
The input unit 130 receives various operations from the user. For example, the input unit 130 may receive various operations from the user via a display surface (for example, the output unit 140) using a touch panel function. The input unit 130 may also accept various operations from buttons provided on the information processing device 100 or from a keyboard or mouse connected to the information processing device 100.
(Output unit 140)
The output unit 140 is a display screen realized by, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display, and is a display device for displaying various information. The output unit 140 displays various information under the control of the control unit 150. When a touch panel is employed in the information processing device 100, the input unit 130 and the output unit 140 are integrated. In the following description, the output unit 140 may be referred to as the screen.
(Control unit 150)
The control unit 150 is a controller realized by, for example, a CPU (Central Processing Unit) or MPU (Micro Processing Unit) executing various programs stored in a storage device inside the information processing device 100 (corresponding to an example of the information processing program), using the RAM as a work area. The control unit 150 may also be realized as a controller by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array).
The control unit 150 has an acquisition unit 151, an estimation unit 152, and an output control unit 153 as functional units, and may realize or execute the information processing operations described below. The internal configuration of the control unit 150 is not limited to the configuration shown in FIG. 5 and may be any other configuration that performs the information processing described later. Each functional unit represents a function of the control unit 150 and does not necessarily have to be physically distinct.
(Acquisition unit 151)
The acquisition unit 151 acquires reference information indicating the relationship between a reference luminance value, which is the luminance value of the object region occupied by the reference object in a reference image that includes the underwater reference object irradiated with light from the light source and that was captured by the underwater imaging device, and a reference distance, which is the distance from the imaging device to the reference object. Specifically, the acquisition unit 151 acquires reference information indicating the relationship between the reference distance and the reference luminance value in a reference image including a reference object irradiated with light of a wavelength corresponding to a color that differs depending on the fish species. For example, the acquisition unit 151 acquires the reference information shown by the graphs in FIG. 3. For example, the acquisition unit 151 acquires the reference information from a terminal device used by a user who obtained it through experiments.
The acquisition unit 151 also acquires a target image that includes underwater fish irradiated with light from the light source and that was captured by the underwater imaging device. Specifically, the acquisition unit 151 acquires a target image including fish irradiated with light of a wavelength corresponding to a color that differs depending on the fish species. For example, the acquisition unit 151 acquires a target image including a plurality of fish. FIG. 6 is a diagram showing an example of a target image according to the embodiment. For example, the acquisition unit 151 acquires a target image G1 including a plurality of fish as shown in FIG. 6. For example, the acquisition unit 151 acquires the target image from the imaging device that captured it.
(Estimation unit 152)
The estimation unit 152 estimates the target distance, which is the distance from the imaging device to a fish, based on a comparison between the reference information and the target luminance value, which is the luminance value of the target object region occupied by the fish in the target image. Specifically, the estimation unit 152 identifies, based on the reference information, the reference distance corresponding to the reference luminance value equal to the target luminance value, and estimates the identified reference distance as the target distance.
The estimation unit 152 also estimates the size of a fish according to the estimated target distance. For example, suppose an image contains two target object regions of roughly the same size, each occupied by a fish, but with different luminance values. Since a larger (smaller) luminance value in a target object region means the fish is closer to (farther from) the imaging device, the estimation unit 152 estimates that the fish corresponding to the region with the smaller luminance value is larger than the fish corresponding to the region with the larger luminance value.
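This size reasoning can be made concrete with a standard pinhole-camera relation, sketched below; the focal length in pixels and the example numbers are hypothetical, and the relation is a common approximation rather than a formula specified in the embodiment.

```python
def fish_length_m(pixel_length: float, distance_m: float,
                  focal_length_px: float = 1400.0) -> float:
    """Approximate real-world length (m) of a fish spanning pixel_length
    pixels at the estimated target distance, using the pinhole relation
    real extent = pixel extent * distance / focal length (in pixels)."""
    return pixel_length * distance_m / focal_length_px

# Two regions of similar pixel size: the dimmer (farther) fish is larger.
print(fish_length_m(300, 1.0))  # ~0.21 m at 1 m
print(fish_length_m(300, 2.0))  # ~0.43 m at 2 m
```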
Furthermore, the estimation unit 152 estimates the target distance from the imaging device to each of a plurality of fish based on comparisons between the reference information and the target luminance values of the target object regions occupied by the fish in the target image, and identifies each fish according to these estimated distances. For example, suppose a target image shows a plurality of different fish overlapping, with two target object regions having different luminance values. Since a larger (smaller) luminance value means the fish is closer to (farther from) the imaging device, the estimation unit 152 can estimate that the fish corresponding to the region with the larger luminance value is in front of the fish corresponding to the region with the smaller luminance value; in other words, the fish with the smaller luminance value is positioned farther back. For fish whose shape and size are known to some extent (for example, fish in a cage), different positions in three-dimensional space evidently correspond to different individuals. The estimation unit 152 can therefore accurately distinguish an object near the imaging device (for example, a given fish) from an object far from it (for example, a different fish).
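A minimal sketch of this ordering follows, assuming per-region target luminance values and a luminance-to-distance function such as the interpolation sketch above.

```python
def order_by_distance(regions, estimate):
    """regions: iterable of (region_id, target_luminance) pairs;
    estimate: a luminance-to-distance function.
    Returns (region_id, estimated_distance) pairs sorted nearest first,
    so overlapping regions at clearly different depths can be treated as
    different individuals."""
    return sorted(((rid, estimate(lum)) for rid, lum in regions),
                  key=lambda pair: pair[1])
```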
(Output control unit 153)
The output control unit 153 performs control so that the estimation results of the estimation unit 152 are displayed on the screen. For example, the output control unit 153 may control the screen so that a numerical value indicating the target distance estimated by the estimation unit 152 is displayed superimposed on the target object region.
[4. Modifications]
The information processing device 100 according to the embodiment described above may be implemented in various forms other than the above embodiment. Other embodiments of the information processing device 100 are therefore described below. Parts identical to those of the embodiment are given the same reference numerals, and their description is omitted.
[4-1. Machine learning model]
In the embodiment described above, the estimation unit 152 estimates the distance from the imaging device to a fish based on a comparison between the reference information and the luminance value of the target object region occupied by the fish in the target image. Here, as a modification of the embodiment, a case is described in which the estimation unit 152 uses a machine learning model to estimate the size and number of fish (that is, the fish count) included in the target image, based on the luminance values of the target object regions occupied by the fish.
FIG. 7 is a diagram showing an overview of information processing according to a modification. In FIG. 7, the estimation unit 152 obtains a machine learning model M1 trained to output, when an image including fish is input, the position information, size information, and (R, G, B) luminance values of each fish. The estimation unit 152 then uses the machine learning model M1 to estimate the size and number of fish (that is, the fish count) included in the target image.
FIG. 8 is a diagram showing an overview of information processing according to a modification. In FIG. 8, the estimation unit 152 obtains a machine learning model M2 trained to output, when an image including fish and a point cloud generated from that image are input, the position information and size information of each fish, the (R, G, B) luminance values, and the position information of each point in the point cloud. The estimation unit 152 then uses the machine learning model M2 to estimate the size and number of fish (that is, the fish count) included in the target image.
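A hedged sketch of how such a trained model might be invoked is shown below; the model interface (a callable returning per-fish position, size, and RGB luminance values) is an assumption for illustration, since the modification does not fix a concrete API.

```python
from typing import Callable, List, Tuple

# Hypothetical per-fish output: (position (x, y), size, (R, G, B) luminance).
Detection = Tuple[Tuple[float, float], float, Tuple[float, float, float]]

def count_and_sizes(model: Callable[[object], List[Detection]],
                    image: object) -> Tuple[int, List[float]]:
    """Run the model on one target image and return (fish count, sizes)."""
    detections = model(image)
    return len(detections), [size for _, size, _ in detections]
```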
[4-2. Light source that switches a full-color LED on and off at high speed]
In the embodiment described above, the light source emits light of a wavelength corresponding to a color suited to the target fish species. Here, a case is described in which the light source rapidly switches a full-color LED on and off; that is, the light source irradiates the fish with light of a plurality of different wavelengths (that is, light of different colors) at almost the same timing within a very short period. Specifically, the acquisition unit 151 acquires a plurality of different sets of reference information, each indicating the relationship between the reference distance and the reference luminance value in a reference image of the reference object irradiated with light of one of a plurality of different wavelengths corresponding to different colors. The acquisition unit 151 also acquires a plurality of different target images, each including the target object irradiated with light of one of the plurality of wavelengths at a different time within a predetermined period. The estimation unit 152 estimates the target distance based on comparisons between the target luminance values in each of the target images and the corresponding sets of reference information.
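One simple way to combine the per-wavelength results, sketched below, is to take the median of the distance estimates obtained independently from each wavelength's target image; this fusion rule is an assumption for illustration and is not prescribed by the modification.

```python
import numpy as np

def fuse_distance_estimates(per_wavelength_distances) -> float:
    """per_wavelength_distances: distances estimated independently from
    each wavelength's target image; the median gives a combined estimate
    that is robust to one heavily attenuated channel."""
    return float(np.median(per_wavelength_distances))

print(fuse_distance_estimates([1.6, 1.7, 2.4]))  # 1.7
```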
[4-3. Estimating visibility based on luminance values]
In the embodiment described above, the estimation unit 152 estimates the distance from the imaging device to a fish based on a comparison between the reference information and the luminance value of the target object region occupied by the fish in the target image. Here, as a modification of the embodiment, a case is described in which the estimation unit 152 estimates the visibility of the water based on a comparison between reference information and the luminance value of the target object region occupied by a white board in the target image.
First, visibility is explained. In this specification, visibility refers to the distance that can be seen underwater in the horizontal direction relative to the bottom. Specifically, a white planar (plate-like) object (hereinafter referred to as a white board) is submerged at a predetermined depth (for example, 1 m). For example, the white board may be a disk 30 cm in diameter. The distance between the white board and the eyes of a person at the same predetermined depth is then gradually increased in the horizontal direction. The distance between the person's eyes and the white board at the moment the white board becomes invisible is defined as the visibility of that water.
Next, transparency is explained. In this specification, transparency refers to the distance that can be seen in the vertical direction when looking into the water from above the surface. Specifically, the white board described above is submerged, and the distance between the water surface and the white board is gradually increased perpendicular to the surface. The distance between the water surface and the white board at the moment the white board becomes invisible is defined as the transparency of that water.
Specifically, in this modification, the reference object O1 shown in FIG. 2 is the white board described above. A measuring device is prepared in which the distance between the white board (the reference object O1 in FIG. 2) and the light source L1 and camera C1 is fixed at a measurement distance (for example, 1 m). The measuring device is submerged in the sea at various locations in each season (that is, in water of various visibilities). The camera C1 of the measuring device, submerged at a predetermined depth (for example, 1 m), then captures a visibility reference image including the underwater white board irradiated with light from the light source L1, and the visibility of the water at that time is measured experimentally. In this way, basic visibility information is obtained experimentally that associates, for each season and for each depth in the sea (for example, 1 m, 2 m, 3 m, and so on, in water of various visibilities), the visibility reference image with the visibility of the water when that image was captured.
The acquisition unit 151 acquires this basic visibility information, for example from a terminal device used by the user of the measuring device. Based on the basic visibility information, the acquisition unit 151 then acquires visibility reference information indicating the relationship between a visibility reference luminance value, which is the luminance value of the region occupied by the white board in the visibility reference image, and the visibility of the water when the visibility reference image was captured.
On a day for which the visibility is to be measured (hereinafter referred to as the measurement date), the measuring device is submerged at a predetermined depth (for example, 1 m) in the sea. The camera C1 of the submerged measuring device then captures a target image including the underwater white board irradiated with light from the light source L1.
The acquisition unit 151 acquires this target image, for example from a terminal device used by the user of the measuring device. The estimation unit 152 then estimates the visibility on the measurement date based on a comparison between the target luminance value, which is the luminance value of the target object region occupied by the white board in the acquired target image, and the visibility reference information corresponding to the predetermined depth (for example, 1 m). Thereby, the information processing device 100 can estimate the visibility based on quantified information rather than visual measurement.
Specifically, the estimation unit 152 estimates the visibility on the measurement date based on a comparison between the visibility reference information and at least one of the R, G, and B luminance values of the target object region occupied by the white board in the target image. In water with high visibility, the attenuation rate of R (red) light is the largest among R (red), G (green), and B (blue), so red light is attenuated first; it is also known that B (blue) light has the smallest attenuation rate and remains to the last. Therefore, when the visibility on the measurement date exceeds a first visibility, the estimation unit 152 estimates the visibility on the measurement date based on a comparison between the visibility reference information and the target luminance value that is the R (red) luminance value of the target object region occupied by the white board. For example, based on the visibility reference information, the estimation unit 152 identifies the visibility corresponding to the R (red) visibility reference luminance value equal to the R (red) target luminance value, and estimates the identified visibility as the visibility on the measurement date.
When the visibility on the measurement date is equal to or less than the first visibility, the estimation unit 152 estimates the visibility on the measurement date based on the luminance values of a plurality of colors in the target object region occupied by the white board. For example, based on the visibility reference information, the estimation unit 152 identifies the visibility corresponding to the B (blue) visibility reference luminance value equal to the B (blue) target luminance value. Low visibility is often caused by an abundance of chlorophyll, such as algae, in the water, and chlorophyll readily absorbs blue light. In water with low visibility due to abundant chlorophyll, B (blue) light is therefore attenuated first, and a visibility estimated from the B (blue) luminance value alone may take an impossible value (for example, a visibility of 200 m, exceeding the size of the cage). Thus, when the visibility identified from the luminance value of one color in the region occupied by the white board is equal to or less than the first visibility, the estimation unit 152 estimates the visibility on the measurement date based on the luminance value of another color. For example, based on the visibility reference information, the estimation unit 152 identifies the visibility corresponding to the R (red) visibility reference luminance value equal to the R (red) target luminance value, and estimates the identified visibility as the visibility on the measurement date.
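The channel-selection logic can be sketched as follows: a first estimate is taken from one color channel (blue here), and if that figure is implausible (blue is strongly absorbed in algae-rich, low-visibility water), the estimate is redone from the red channel. The reference curves, the plausibility bound, and the specific channels are all hypothetical.

```python
import numpy as np

# Hypothetical visibility reference curves (luminance per visibility).
vis_pts = np.array([3.0, 10.0, 20.0, 30.0])        # visibility (m)
ref_lum_b = np.array([40.0, 90.0, 150.0, 200.0])   # blue luminance
ref_lum_r = np.array([25.0, 60.0, 110.0, 160.0])   # red luminance

def estimate_visibility(b_lum: float, r_lum: float,
                        max_plausible_m: float = 25.0) -> float:
    """Estimate visibility from the blue channel; fall back to the red
    channel when the blue-based figure exceeds a plausibility bound."""
    vis_from_blue = float(np.interp(b_lum, ref_lum_b, vis_pts))
    if vis_from_blue <= max_plausible_m:
        return vis_from_blue
    return float(np.interp(r_lum, ref_lum_r, vis_pts))
```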
The estimation unit 152 can also estimate the visibility on the measurement date based on the correlation among the luminance values of a plurality of colors in the target object region occupied by the white board. For example, based on the visibility reference information, the estimation unit 152 calculates the ratio of the luminance values of the plurality of colors at each visibility, such as the ratio of the R luminance value (hereinafter sometimes abbreviated as R), the G luminance value (G), and the B luminance value (B). For example, based on the visibility reference information, the estimation unit 152 calculates R:G:B = 1:2:3 for a visibility of 30 m, R:G:B = 1:2:2 for a visibility of 20 m, and R:G:B = 1:1:1 for a visibility of 10 m. The estimation unit 152 then estimates the visibility on the measurement date based on a comparison between these ratios and the ratio of the luminance values of the plurality of colors in the target object region occupied by the white board in the target image. For example, if the estimation unit 152 calculates R:G:B = 1:1:1 for the region occupied by the white board in the target image, it estimates the visibility on the measurement date to be 10 m by comparison with the R:G:B ratios at each visibility based on the visibility reference information.
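A sketch of this ratio comparison follows, using the example ratios given above as the table and picking the visibility whose normalized ratio is nearest to the observed one; the nearest-neighbor rule is an illustrative assumption.

```python
import numpy as np

# Visibility (m) -> R:G:B luminance ratio, from the examples in the text.
ratio_table = {30: (1, 2, 3), 20: (1, 2, 2), 10: (1, 1, 1)}

def visibility_from_ratio(rgb) -> int:
    """Pick the tabulated visibility whose normalised R:G:B ratio is
    closest (Euclidean distance) to the observed luminance ratio."""
    v = np.asarray(rgb, dtype=float)
    v = v / v.sum()
    def dist(key):
        t = np.asarray(ratio_table[key], dtype=float)
        return float(np.linalg.norm(v - t / t.sum()))
    return min(ratio_table, key=dist)

print(visibility_from_ratio((80.0, 82.0, 79.0)))  # ~1:1:1 -> 10 m
```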
 In addition, when the visibility basic information described above is acquired experimentally, the transparency on that day is measured together with the visibility. In this way, information associating the underwater visibility with the transparency at the time the visibility reference image was captured is obtained experimentally. The transparency on the measurement date is also measured experimentally.
 The acquisition unit 151 acquires the information associating the underwater visibility with the transparency, and information on the transparency on the measurement date. For example, the acquisition unit 151 acquires both from a terminal device used by the user of the measuring device. The estimation unit 152 then estimates the visibility on the measurement date from the transparency on the measurement date. More specifically, based on the information associating the underwater visibility with the transparency, the estimation unit 152 identifies the visibility associated with the same transparency as the transparency measured on the measurement date, and takes the identified visibility as the visibility on the measurement date.
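 A sketch of this transparency-to-visibility lookup, assuming a hypothetical association table measured as described (and a nearest-key match where an exact transparency entry is absent):

```python
# Minimal sketch: look up the visibility associated with the transparency
# measured on the measurement date. Table values are hypothetical.

TRANSPARENCY_TO_VISIBILITY = {  # transparency [m] -> visibility [m]
    8.0: 10.0,
    5.0: 6.0,
    3.0: 3.5,
}

def visibility_from_transparency(transparency_m: float) -> float:
    # Use the entry whose transparency is closest to the measured value.
    key = min(TRANSPARENCY_TO_VISIBILITY,
              key=lambda t: abs(t - transparency_m))
    return TRANSPARENCY_TO_VISIBILITY[key]
```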
[4-4. Estimation of the target distance based on visibility]
 In the embodiment described above, the estimation unit 152 estimates the distance from the imaging device to the fish based on a comparison between the brightness value in the object region occupied by the fish in the target image and the reference information. Here, as a modification of the embodiment, a case will be described in which the estimation unit 152 estimates the distance from the imaging device to the fish based on the underwater visibility.
 In general, the attenuation rate of light in water differs depending on the amount of suspended matter, such as chlorophyll, in the water. That is, the attenuation rate of each of the R, G, and B colors in water with little suspended matter and high visibility (for example, a visibility of 20 m) differs from that in water with much suspended matter and low visibility (for example, a visibility of 3 m). Therefore, even when fish of the same species are located at the same distance from the imaging device, the fish appear differently in water with little suspended matter and high visibility than in water with much suspended matter and low visibility. Accordingly, an attenuation rate map of each of the R, G, and B colors in water at each visibility (hereinafter also referred to as reference basic information) is prepared in advance.
 Specifically, to build the reference basic information, reference images are captured at each distance from the imaging device to a reference object with scales attached (the reference object O1 in FIG. 2 described above), at each depth (for example, a depth of 1 m, 2 m, 3 m, and so on) in water at each visibility. The reference images used to acquire the reference basic information are captured of a reference object to which a material such as scales that differ for each fish species, pseudo skin (aurora bald skin), a color chart, or a checkerboard is attached. The visibility at the time each reference image is captured is measured experimentally. The acquisition unit 151 acquires, as the reference basic information, information associating the visibility at the time the reference image was captured, the depth at the time the reference image was captured, the distance from the imaging device to the reference object, and the R, G, and B values in the region occupied by the reference object in the reference image.
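 One way to organize the reference basic information described here is as records keyed by visibility, depth, and distance; the record layout below is a minimal sketch, and the field names are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ReferenceRecord:
    """One entry of the reference basic information (hypothetical layout)."""
    visibility_m: float  # visibility measured when the image was captured
    depth_m: float       # depth at which the image was captured
    distance_m: float    # distance from the imaging device to the object
    rgb: tuple           # (R, G, B) brightness in the reference-object region

# For example, an entry captured at visibility 20 m, depth 2 m,
# with the reference object 3 m away:
record = ReferenceRecord(20.0, 2.0, 3.0, (42, 55, 60))
```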
 Data not contained in the reference basic information are supplemented by calculation. For example, suppose that the RGB brightness values for a visibility of 1 m are (R, G, B) = (5, 5, 5), that the RGB brightness values for a visibility of 2 m are (R, G, B) = (10, 10, 10), and that no data exist for a visibility of 1.5 m. In this case, the RGB brightness values for a visibility of 1.5 m may be estimated to be (R, G, B) = (7.5, 7.5, 7.5) based on the values for visibilities of 1 m and 2 m. Although this example uses linear interpolation, non-linear interpolation is also possible.
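 The linear supplementation in this example can be written directly as shown below; where the attenuation is known to be non-linear, a non-linear fit could replace `lerp`. This is a sketch of the worked example, not prescribed code:

```python
def lerp(v0: float, v1: float, t: float) -> float:
    """Linear interpolation between two values, with t in [0, 1]."""
    return v0 + (v1 - v0) * t

def interpolate_rgb(rgb_lo, rgb_hi, vis_lo, vis_hi, visibility):
    """Fill in RGB values for a visibility between two known entries."""
    t = (visibility - vis_lo) / (vis_hi - vis_lo)
    return tuple(lerp(a, b, t) for a, b in zip(rgb_lo, rgb_hi))

# Reproduces the example in the text:
assert interpolate_rgb((5, 5, 5), (10, 10, 10), 1.0, 2.0, 1.5) == (7.5, 7.5, 7.5)
```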
 In addition, the visibility in the fish pen is measured on the day on which the distance from the imaging device to the fish is estimated (hereinafter referred to as the estimation date). Alternatively, the estimation unit 152 may estimate the visibility on the estimation date using the method of [4-3. Estimation of visibility based on brightness values] described above. The species of the fish kept in the pen is assumed to be known, as is the depth of the imaging device. The acquisition unit 151 acquires the reference basic information corresponding to the visibility on the estimation date, and also acquires a target image captured in the pen on the estimation date. The estimation unit 152 then estimates the target distance, that is, the distance from the imaging device to the fish, based on a comparison between the target brightness value, that is, the brightness value in the object region occupied by the fish in the target image, and the reference basic information. Specifically, the estimation unit 152 estimates the target distance based on a comparison between the brightness value of at least one of the R, G, and B colors in the object region and the reference basic information. For example, when the visibility on the estimation date exceeds the first visibility, the estimation unit 152 estimates the target distance based on a comparison between the target brightness value, that is, the R brightness value in the object region of the target image, and the reference basic information: it identifies the distance from the imaging device to the reference object associated with the same R brightness value as the target brightness value, and takes the identified distance as the target distance. When the visibility on the estimation date is at or below the first visibility, the estimation unit 152 estimates the target distance based on the brightness values of a plurality of colors in the object region, for example by comparing the ratio of the brightness values of the plurality of colors at each distance based on the reference basic information with the ratio of the brightness values of the plurality of colors in the object region of the target image. In this way, the estimation unit 152 may estimate the distance from the imaging device to the fish based on the underwater visibility.
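 Combining the steps of this modification, the following minimal sketch selects the reference basic information for the estimation date's visibility and depth, then looks up the distance by a red-channel match (high visibility) or an RGB-ratio match (low visibility); the table contents, threshold, and match criteria are assumptions for illustration:

```python
# Minimal sketch of distance estimation from the reference basic
# information ("attenuation rate map"). All values are hypothetical;
# missing visibilities would be filled by interpolation as shown earlier.

FIRST_VISIBILITY_M = 5.0

# (visibility [m], depth [m]) -> list of (distance [m], (R, G, B))
REFERENCE_BASIC_INFO = {
    (20.0, 2.0): [(1.0, (90, 80, 70)), (2.0, (60, 55, 50)), (3.0, (40, 38, 35))],
    (3.0, 2.0):  [(1.0, (70, 50, 20)), (2.0, (35, 25, 10)), (3.0, (18, 13, 5))],
}

def normalize(rgb):
    total = sum(rgb)
    return tuple(v / total for v in rgb)

def estimate_distance(visibility_m, depth_m, region_rgb):
    """region_rgb: (R, G, B) mean brightness in the fish's object region."""
    entries = REFERENCE_BASIC_INFO[(visibility_m, depth_m)]
    if visibility_m > FIRST_VISIBILITY_M:
        # High visibility: match on the red channel alone.
        return min(entries, key=lambda e: abs(e[1][0] - region_rgb[0]))[0]
    # Low visibility: match on the ratio of the RGB brightness values.
    target = normalize(region_rgb)
    def ratio_error(entry):
        ref = normalize(entry[1])
        return sum((t - r) ** 2 for t, r in zip(target, ref))
    return min(entries, key=ratio_error)[0]
```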
[5. Effects]
 As described above, the information processing device 100 according to the embodiment includes the acquisition unit 151 and the estimation unit 152. The acquisition unit 151 acquires reference information indicating the relationship between a reference brightness value, which is the brightness value in the object region occupied by a reference object in a reference image that includes the underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is the distance from the imaging device to the reference object, together with a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device. The estimation unit 152 estimates the target distance, which is the distance from the imaging device to the object, based on a comparison between the target brightness value, which is the brightness value in the object region occupied by the object in the target image, and the reference information.
 This allows the information processing device 100 to accurately estimate the distance from the imaging device to an underwater object. Because the distance can be estimated accurately, high-quality ground-truth data on the position and size of an object in an image can be attached to images of underwater objects; that is, the information processing device 100 can obtain high-quality training data with high-quality labels. Furthermore, such training data can be used to improve the training accuracy of a machine learning model for image recognition that estimates information about underwater objects from images. The information processing device 100 can therefore accurately estimate information about underwater objects from images, and in doing so can contribute to achieving Goal 9 of the Sustainable Development Goals (SDGs), "Industry, Innovation and Infrastructure." In addition, because it can accurately estimate information about fish in the water from images, it can contribute to achieving Goal 14 of the SDGs, "Life Below Water."
 The estimation unit 152 also estimates the size of the object according to the estimated target distance.
 This allows the information processing device 100 to accurately estimate the size of an underwater object from an image. For example, suppose an image contains two object regions of approximately the same size but with different brightness values. Since a larger (smaller) brightness value in an object region means a shorter (longer) distance from the imaging device to the object, the information processing device 100 can estimate that the object corresponding to the region with the smaller brightness value is larger than the object corresponding to the region with the larger brightness value. In this way, the information processing device 100 can accurately estimate both the size of a small object near the imaging device (for example, a given fish) and the size of a large object far from it (for example, another fish different from the given fish).
 The acquisition unit 151 also acquires a target image including a plurality of objects. The estimation unit 152 estimates the target distance from the imaging device to each of the plurality of objects based on a comparison between the target brightness value in each of the plurality of object regions occupied by the respective objects in the target image and the reference information, and identifies each of the plurality of objects according to the estimated target distances.
 This allows the information processing device 100 to accurately distinguish a plurality of underwater objects in an image. For example, suppose an image contains two overlapping object regions, occupied by different objects, with different brightness values. Since a larger (smaller) brightness value in an object region means a shorter (longer) distance from the imaging device, the information processing device 100 can estimate that the object corresponding to the region with the larger brightness value is in front of the object corresponding to the region with the smaller brightness value; in other words, the object with the smaller brightness value lies behind it. The information processing device 100 can thus accurately identify the positions of an object near the imaging device (for example, a given fish) and an object far from it (for example, another fish). Moreover, for objects whose shape and size are known to some extent (for example, fish in a pen), different positions in three-dimensional space imply different objects. The information processing device 100 can therefore accurately distinguish an object near the imaging device from an object far from it.
 The acquisition unit 151 also acquires reference information indicating the relationship between the reference brightness value and the reference distance for a reference image including a reference object irradiated with light of a wavelength corresponding to a color suited to the object, and a target image including the object irradiated with light of that wavelength. The estimation unit 152 estimates the target distance based on a comparison between the target brightness value in the target image and the reference information.
 This allows the information processing device 100 to obtain a suitable target image including the illuminated fish, because, for example, irradiating light that the fish prefers keeps the fish from escaping the imaging range while the imaging device captures the image. Since a suitable target image can be obtained, the target distance can be estimated accurately.
 The acquisition unit 151 also acquires a plurality of different pieces of reference information, each indicating the relationship between the reference brightness value and the reference distance for one of a plurality of different reference images, each including the reference object irradiated with light of one of a plurality of different wavelengths corresponding to a plurality of different colors, together with a plurality of different target images, each including the object irradiated with light of one of the plurality of different wavelengths at a different time within a predetermined period. The estimation unit 152 estimates the target distance based on comparisons between the target brightness value in each of the plurality of different target images and each of the plurality of different pieces of reference information.
 This allows the information processing device 100 to irradiate the object with light of a plurality of different wavelengths (that is, light of different colors) within a very short time. Even when the object is a fish and the preferred color differs by species, the information processing device 100 can keep the fish from escaping the imaging range while the imaging device captures the image, and can thus obtain a suitable target image including the illuminated fish. Since a suitable target image can be obtained, the target distance can be estimated accurately.
 The light source is provided at the top or bottom of the housing of the imaging device.
 This allows the information processing device 100 to align, for example, the direction in which the imaging device faces the object with the direction in which the light source illuminates it, making it easier to capture images.
 The reference object is the scale of a fish of a specific species or a model of such a scale, or an object covered with pseudo skin, a color chart, or a checkerboard. The object is a fish of the specific species.
 This allows the information processing device 100 to acquire appropriate reference information that differs for each fish species, and therefore to accurately estimate information about fish in the water from an image.
 The acquisition unit 151 also acquires visibility reference information indicating the relationship between a visibility reference brightness value, which is the brightness value in the object region occupied by the reference object in a visibility reference image that includes the underwater reference object irradiated with light from the light source and is captured by the underwater imaging device, and the underwater visibility, together with a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device. The estimation unit 152 estimates the underwater visibility at the time the target image was captured based on a comparison between the brightness value of at least one of a plurality of colors in the object region occupied by the object in the target image and the visibility reference information.
 This allows the information processing device 100 to accurately estimate the underwater visibility, and therefore, for example, to estimate the distance from the imaging device to an underwater object more accurately.
 The estimation unit 152 also estimates the target distance based on the estimated underwater visibility.
 This allows the information processing device 100 to estimate the distance from the imaging device to an underwater object more accurately.
 The information processing device 100 according to the modification also includes the acquisition unit 151 and the estimation unit 152. The acquisition unit 151 acquires visibility reference information indicating the relationship between a visibility reference brightness value, which is the brightness value in the object region occupied by a reference object in a visibility reference image that includes the underwater reference object irradiated with light from a light source and is captured by an underwater imaging device, and the underwater visibility, together with a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device. The estimation unit 152 estimates the underwater visibility at the time the target image was captured based on a comparison between the brightness value of at least one of a plurality of colors in the object region occupied by the object in the target image and the visibility reference information.
 This allows the information processing device 100 to accurately estimate the underwater visibility, and therefore, for example, to estimate the distance from the imaging device to an underwater object more accurately.
 When the underwater visibility at the time the target image was captured exceeds the first visibility, the estimation unit 152 estimates the underwater visibility at the time the target image was captured based on a comparison between the brightness value of one color in the object region occupied by the object in the target image and the visibility reference information.
 This allows the information processing device 100 to estimate the underwater visibility appropriately according to the visibility conditions.
 When the underwater visibility at the time the target image was captured is at or below the first visibility, the estimation unit 152 estimates the underwater visibility at the time the target image was captured based on a comparison between the brightness values of a plurality of colors in the object region occupied by the object in the target image and the visibility reference information.
 This allows the information processing device 100 to estimate the underwater visibility appropriately according to the visibility conditions.
[6. Hardware configuration]
 The information processing device 100 according to the embodiments described above is realized by, for example, a computer 1000 configured as shown in FIG. 9. FIG. 9 is a hardware configuration diagram showing an example of a computer that implements the functions of the information processing device 100. The computer 1000 includes a CPU 1100, a RAM 1200, a ROM 1300, an HDD 1400, a communication interface (I/F) 1500, an input/output interface (I/F) 1600, and a media interface (I/F) 1700.
 The CPU 1100 operates based on programs stored in the ROM 1300 or the HDD 1400 and controls each unit. The ROM 1300 stores a boot program executed by the CPU 1100 when the computer 1000 starts, programs that depend on the hardware of the computer 1000, and the like.
 The HDD 1400 stores programs executed by the CPU 1100 and the data used by those programs. The communication interface 1500 receives data from other devices via a predetermined communication network and sends it to the CPU 1100, and transmits data generated by the CPU 1100 to other devices via the network.
 The CPU 1100 controls output devices such as a display and a printer, and input devices such as a keyboard and a mouse, via the input/output interface 1600. The CPU 1100 acquires data from the input devices via the input/output interface 1600, and outputs generated data to the output devices via the same interface.
 The media interface 1700 reads a program or data stored in a recording medium 1800 and provides it to the CPU 1100 via the RAM 1200. The CPU 1100 loads the program from the recording medium 1800 onto the RAM 1200 via the media interface 1700 and executes it. The recording medium 1800 is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or PD (Phase change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
 For example, when the computer 1000 functions as the information processing device 100 according to the embodiment, the CPU 1100 of the computer 1000 realizes the functions of the control unit 150 by executing a program loaded onto the RAM 1200. The CPU 1100 of the computer 1000 reads this program from the recording medium 1800 and executes it, but as another example, the program may be acquired from another device via a predetermined communication network.
 Some embodiments of the present application have been described above in detail with reference to the drawings, but these are merely examples, and the present invention can be carried out in other forms with various modifications and improvements based on the knowledge of those skilled in the art, including the aspects described in the disclosure of the invention.
[7. Others]
 Of the processes described in the above embodiments and modifications, all or part of the processes described as being performed automatically can also be performed manually, and all or part of the processes described as being performed manually can also be performed automatically by known methods. In addition, the processing procedures, specific names, and information including various data and parameters shown in the documents and drawings above can be changed arbitrarily unless otherwise specified. For example, the various information shown in each figure is not limited to the illustrated information.
 Each component of each illustrated device is functionally conceptual and does not necessarily need to be physically configured as illustrated. That is, the specific form of distribution and integration of each device is not limited to the illustrated form, and all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like.
 The information processing device 100 described above may be realized by a plurality of server computers, and depending on the function, may be realized by calling an external platform or the like using an API (Application Programming Interface), network computing, or the like; its configuration can thus be changed flexibly.
 The embodiments and modifications described above can be combined as appropriate to the extent that the processing contents do not contradict each other.
 100 Information processing device
 110 Communication unit
 120 Storage unit
 130 Input unit
 140 Output unit
 150 Control unit
 151 Acquisition unit
 152 Estimation unit
 153 Output control unit

Claims (16)

  1.  An information processing program that causes a computer to execute:
      an acquisition procedure of acquiring reference information indicating a relationship between a reference brightness value, which is a brightness value in an object region occupied by an underwater reference object in a reference image that includes the reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is a distance from the imaging device to the reference object, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device; and
      an estimation procedure of estimating a target distance, which is a distance from the imaging device to the object, based on a comparison between a target brightness value, which is a brightness value in an object region occupied by the object in the target image, and the reference information.
  2.  The information processing program according to claim 1, wherein
      the estimation procedure estimates a size of the object according to the estimated target distance.
  3.  The information processing program according to claim 1, wherein
      the acquisition procedure acquires the target image including a plurality of the objects, and
      the estimation procedure estimates the target distance from the imaging device to each of the plurality of objects based on a comparison between the target brightness value in each of a plurality of the object regions occupied by the respective objects in the target image and the reference information, and identifies each of the plurality of objects according to the estimated target distances from the imaging device to the respective objects.
  4.  The information processing program according to claim 1, wherein
      the acquisition procedure acquires the reference information indicating the relationship between the reference brightness value in the reference image including the reference object irradiated with the light of a wavelength corresponding to a color suited to the object and the reference distance, and the target image including the object irradiated with the light of the wavelength corresponding to the color suited to the object, and
      the estimation procedure estimates the target distance based on a comparison between the target brightness value in the target image and the reference information.
  5.  The information processing program according to claim 1, wherein
      the acquisition procedure acquires a plurality of different pieces of the reference information, each indicating the relationship between the reference brightness value in one of a plurality of different reference images, each including the reference object irradiated with the light of one of a plurality of different wavelengths corresponding to a plurality of different colors, and the reference distance, and a plurality of different target images, each including the object irradiated with the light of one of the plurality of different wavelengths at a different time within a predetermined period, and
      the estimation procedure estimates the target distance based on comparisons between the target brightness value in each of the plurality of different target images and each of the plurality of different pieces of the reference information.
  6.  The information processing program according to claim 1, wherein
      the light source is provided at a top or bottom of a housing of the imaging device.
  7.  The information processing program according to claim 1, wherein
      the reference object is a scale of a fish of a specific species or a model of the scale, or an object covered with pseudo skin, a color chart, or a checkerboard, and
      the object is a fish of the specific species.
  8.  The information processing program according to claim 1, wherein
      the acquisition procedure acquires visibility reference information indicating a relationship between a visibility reference brightness value, which is a brightness value in an object region occupied by the reference object in a visibility reference image that includes the underwater reference object irradiated with light from the light source and is captured by the underwater imaging device, and underwater visibility, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device, and
      the estimation procedure estimates the underwater visibility at a time the target image was captured based on a comparison between a brightness value of at least one color among brightness values of a plurality of colors in the object region occupied by the object in the target image and the visibility reference information.
  9.  The information processing program according to claim 8, wherein
      the estimation procedure estimates the target distance based on the estimated underwater visibility.
  10.  An information processing program that causes a computer to execute:
      an acquisition procedure of acquiring visibility reference information indicating a relationship between a visibility reference brightness value, which is a brightness value in an object region occupied by an underwater reference object in a visibility reference image that includes the reference object irradiated with light from a light source and is captured by an underwater imaging device, and underwater visibility, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device; and
      an estimation procedure of estimating the underwater visibility at a time the target image was captured based on a comparison between a brightness value of at least one color among brightness values of a plurality of colors in an object region occupied by the object in the target image and the visibility reference information.
  11.  The information processing program according to claim 10, wherein,
      when the underwater visibility at the time the target image was captured exceeds a first visibility, the estimation procedure estimates the underwater visibility at the time the target image was captured based on a comparison between a brightness value of one color in the object region occupied by the object in the target image and the visibility reference information.
  12.  The information processing program according to claim 10, wherein,
      when the underwater visibility at the time the target image was captured is equal to or lower than a first visibility, the estimation procedure estimates the underwater visibility at the time the target image was captured based on a comparison between brightness values of a plurality of colors in the object region occupied by the object in the target image and the visibility reference information.
  13.  An information processing device comprising:
      an acquisition unit that acquires reference information indicating a relationship between a reference brightness value, which is a brightness value in an object region occupied by an underwater reference object in a reference image that includes the reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is a distance from the imaging device to the reference object, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device; and
      an estimation unit that estimates a target distance, which is a distance from the imaging device to the object, based on a comparison between a target brightness value, which is a brightness value in an object region occupied by the object in the target image, and the reference information.
  14.  An information processing device comprising:
      an acquisition unit that acquires visibility reference information indicating a relationship between a visibility reference brightness value, which is a brightness value in an object region occupied by an underwater reference object in a visibility reference image that includes the reference object irradiated with light from a light source and is captured by an underwater imaging device, and underwater visibility, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device; and
      an estimation unit that estimates the underwater visibility at a time the target image was captured based on a comparison between a brightness value of at least one color among brightness values of a plurality of colors in an object region occupied by the object in the target image and the visibility reference information.
  15.  An information processing method executed by a computer, the method comprising:
      an acquisition step of acquiring reference information indicating a relationship between a reference brightness value, which is a brightness value in an object region occupied by an underwater reference object in a reference image that includes the reference object irradiated with light from a light source and is captured by an underwater imaging device, and a reference distance, which is a distance from the imaging device to the reference object, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device; and
      an estimation step of estimating a target distance, which is a distance from the imaging device to the object, based on a comparison between a target brightness value, which is a brightness value in an object region occupied by the object in the target image, and the reference information.
  16.  An information processing method executed by a computer, the method comprising:
      an acquisition step of acquiring visibility reference information indicating a relationship between a visibility reference brightness value, which is a brightness value in an object region occupied by an underwater reference object in a visibility reference image that includes the reference object irradiated with light from a light source and is captured by an underwater imaging device, and underwater visibility, and a target image that includes an underwater object irradiated with light from the light source and is captured by the underwater imaging device; and
      an estimation step of estimating the underwater visibility at a time the target image was captured based on a comparison between a brightness value of at least one color among brightness values of a plurality of colors in an object region occupied by the object in the target image and the visibility reference information.