WO2019026619A1 - Image processing apparatus, image processing method, and program - Google Patents
Image processing apparatus, image processing method, and program
- Publication number
- WO2019026619A1 (PCT/JP2018/026826)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- images
- reference image
- reduced
- wavelength
- Prior art date
Classifications
- G06T3/00 Geometric image transformations in the plane of the image
- G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
- G06T7/11 Region-based segmentation
- G06T7/174 Segmentation; Edge detection involving the use of two or more images
- G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33 Image registration using feature-based methods
- H04N23/60 Control of cameras or camera modules
- A01G7/00 Botany in general
- G06T2207/10032 Satellite or aerial image; Remote sensing
- G06T2207/10036 Multispectral image; Hyperspectral image
- G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
- G06T2207/20021 Dividing image into blocks, subimages or windows
- G06T2207/30181 Earth observation
- G06T2207/30188 Vegetation; Agriculture
Definitions
- the present technology relates to an image processing apparatus, an image processing method, and a program, and, for example, to an image processing apparatus, an image processing method, and a program that can easily perform stitch processing.
- the present technology has been made in view of such a situation, and makes it easy to extract feature quantities and facilitates stitching and mapping.
- An image processing apparatus of the present technology includes an image generation unit that generates a first reference image for a first imaging area based on a plurality of first images of the first imaging area and generates a second reference image for a second imaging area, which at least partially overlaps the first imaging area, based on a plurality of second images of the second imaging area, and a processing unit that generates positioning information indicating a correspondence between the first imaging area and the second imaging area based on the first reference image and the second reference image.
- An image processing method of the present technology includes generating a first reference image for a first imaging area based on a plurality of first images of the first imaging area, generating a second reference image for a second imaging area, which at least partially overlaps the first imaging area, based on a plurality of second images of the second imaging area, and generating positioning information indicating a correspondence between the first imaging area and the second imaging area based on the first reference image and the second reference image.
- A program of the present technology causes a computer to execute processing including the steps of generating a first reference image for a first imaging area based on a plurality of first images of the first imaging area, generating a second reference image for a second imaging area, which at least partially overlaps the first imaging area, based on a plurality of second images of the second imaging area, and generating positioning information indicating a correspondence between the first imaging area and the second imaging area based on the first reference image and the second reference image.
- In the image processing apparatus, the image processing method, and the program of the present technology, a first reference image for a first imaging area is generated based on a plurality of first images of the first imaging area, a second reference image for a second imaging area that at least partially overlaps the first imaging area is generated based on a plurality of second images of the second imaging area, and positioning information indicating the correspondence between the first imaging area and the second imaging area is generated based on the first reference image and the second reference image.
- the image processing apparatus may be an independent apparatus or an internal block constituting one apparatus.
- the program can be provided by transmitting via a transmission medium or recording on a recording medium.
- feature quantities can be easily extracted, and stitching and mapping can be easily performed.
- FIG. 1 is a diagram showing the configuration of an embodiment of an image processing system to which the present technology is applied. The subsequent figures show the configuration of another embodiment of the image processing system, configuration examples of the image processing system, a unit pixel, the characteristics of filters, and example configurations of a multi camera.
- FIG. 1 is a diagram showing a configuration of an embodiment of an image processing system including an image processing apparatus to which the present technology is applied.
- the image processing system 10 is a system that captures an object, performs stitch processing, and creates a desired image. For example, in the image processing system 10, a plant (vegetation) is targeted as a subject, and a map (image) representing the growth state of the plant is created.
- the image processing system 10 includes an imaging device 11, an illuminance sensor 12, a hub 13, and an arithmetic device 14.
- the imaging device 11, the illuminance sensor 12, and the arithmetic device 14 are mutually connected via the hub 13.
- the imaging device 11 images an object to be measured.
- the illuminance sensor 12 is a device that measures the illuminance of a light source, for example, sunlight, and supplies the measured illuminance information (illuminance value) to the imaging device 11.
- the imaging device 11 is mounted on the lower side (ground side) of a remotely operated or autonomous unmanned aerial vehicle called a drone or the like, and the illuminance sensor 12 is mounted on the upper side (sky side) of the unmanned aerial vehicle.
- although the imaging device 11 and the illuminance sensor 12 are described here as separate bodies, they may be provided in the same housing.
- the imaging device 11 and the illuminance sensor 12 may also be configured to be included in the unmanned aerial vehicle.
- the imaging device 11 captures an image of an object, and outputs data obtained by the imaging to the arithmetic device 14 via the hub 13.
- the illuminance sensor 12 is a sensor for measuring the illuminance, and outputs the illuminance value as the measurement result to the arithmetic device 14 via the hub 13.
- the arithmetic device 14 is a device having an arithmetic function by a circuit such as a central processing unit (CPU) or a field programmable gate array (FPGA).
- the arithmetic device 14 is configured as a personal computer, a dedicated terminal device, or the like.
- the arithmetic device 14, such as a personal computer, performs image processing in a local environment connected via the hub 13.
- alternatively, image processing may be performed in a cloud environment via a network.
- FIG. 2 shows a configuration example of an image processing system 30 compatible with a cloud environment as another configuration example of the image processing system.
- the imaging device 11 and the illuminance sensor 12 output the image data and the illuminance value to the client device 31 via the hub 13, similarly to the imaging device 11 and the illuminance sensor 12 of FIG. 1.
- the client device 31 includes a personal computer or the like, and outputs the image data and the illuminance value input from the imaging device 11 and the illuminance sensor 12 to the router 32 through the hub 13. That is, the client device 31 corresponds to the arithmetic device 14 of FIG. 1, but is provided as a device that does not perform image processing or performs only a part of it.
- the router 32 is, for example, a mobile router, and can be connected to a network 34 such as the Internet via the base station 33.
- the router 32 transmits the image data and the illuminance value input from the client device 31 to the server 35 via the network 34.
- the server 35 receives the image data and the illuminance value transmitted from the client device 31 via the network 34.
- the server 35 has a function equivalent to, or at least a part of, the function of the arithmetic device 14 shown in FIG. 1.
- a storage 36 is connected to the server 35; the image data and the illuminance value supplied to the server 35 are stored in the storage 36 as needed, and data required when performing the stitch processing and the like is read out as appropriate.
- the image processing system 10 (or the image processing system 30) has a configuration (including functions) as shown in FIG.
- the image processing system shown in FIG. 3 is described as an image processing system 50.
- the image processing system 50 includes a lens 101, an exposure unit 102, an MS sensor 103, a designated wavelength calculation processing unit 104, a reference image generation processing unit 105, an inspection wavelength image extraction unit 106, a reference image stitch processing unit 107, and an inspection image stitch processing unit 108.
- the upper diagram of FIG. 4 shows the image processing system 50 shown in FIG. 3, and the lower diagram shows a configuration example, and here shows configuration examples A to G.
- the image processing system 50 is divided into four.
- the lens 101, the exposure unit 102, and the MS sensor 103 are assumed to be an imaging unit 61 that performs imaging of an object.
- the designated wavelength calculation processing unit 104, the reference image generation processing unit 105, and the inspection wavelength image extraction unit 106 are the image generation unit 62 that generates an image such as a reference image or an inspection wavelength image.
- the reference image stitch processing unit 107 is a first stitch processing unit 63 that performs stitch processing.
- the inspection image stitch processing unit 108 is a second stitch processing unit 64 that performs stitch processing.
- configuration example A is a case where the image processing system 50 is completed within the imaging device 11: the imaging unit 61, the image generation unit 62, the first stitch processing unit 63, and the second stitch processing unit 64 are all included in the imaging device 11. Although not shown in FIG. 1 and FIG. 2, in this configuration the imaging device 11 alone can perform everything up to the stitch processing.
- in another configuration example, the imaging unit 61 and the image generation unit 62 are provided in the imaging device 11, and the first stitch processing unit 63 and the second stitch processing unit 64 are provided in the arithmetic device 14 (FIG. 1) or the server 35 (FIG. 2).
- in another configuration example, the imaging unit 61 is provided in the imaging device 11, and the image generation unit 62, the first stitch processing unit 63, and the second stitch processing unit 64 are provided in the arithmetic device 14 (FIG. 1) or the server 35 (FIG. 2).
- in another configuration example, the imaging unit 61 is provided in the imaging device 11, the image generation unit 62 is provided in the client device 31 (FIG. 2), and the first stitch processing unit 63 and the second stitch processing unit 64 are provided in the server 35 (FIG. 2).
- in another configuration example, the imaging unit 61 is provided in the imaging device 11, and the image generation unit 62, the first stitch processing unit 63, and the second stitch processing unit 64 are provided in the server 35 (FIG. 2).
- in another configuration example, the imaging unit 61 is provided in the imaging device 11, the image generation unit 62 and the first stitch processing unit 63 are provided in the client device 31 (FIG. 2), and the second stitch processing unit 64 is provided in the server 35 (FIG. 2).
- in another configuration example, the imaging unit 61 is provided in the imaging device 11, the image generation unit 62 is provided in the client device 31 (FIG. 2), the first stitch processing unit 63 is provided in the server 35 (FIG. 2), and the second stitch processing unit 64 is provided in the client device 31 (FIG. 2).
- the image processing system 50 may have any one of the configuration examples A to G, may be configured as a single device, or may be configured from a plurality of devices. According to the present technology, as described below, regardless of the configuration, processing can be reduced, and processing time and processing load can be reduced.
- the lens 101, the exposure unit 102, and the MS sensor 103 are included in the imaging device 11 in any of the configurations A to G described with reference to FIG.
- in the imaging device 11, light (reflected light) from an object to be measured is incident on the MS sensor 103 via the lens 101 and the exposure unit 102.
- the "MS" in MS sensor 103 stands for multispectral.
- the imaging device 11 is configured to be able to obtain signals of a plurality of different wavelengths from one unit pixel.
- the exposure unit 102 performs exposure control by adjusting the aperture amount of the optical system such as the lens 101 and the iris (aperture), the exposure time of the MS sensor 103, the shutter speed, and the like, so that sensing is performed in a state where the signal charge in the MS sensor 103 is within the dynamic range without being saturated.
- the MS sensor 103 is composed of an MS filter 103-1 and a sensor 103-2 as shown in FIG.
- the MS filter 103-1 can be an optical filter corresponding to an index to be measured, and is a filter that transmits a plurality of different wavelengths.
- the MS filter 103-1 transmits light incident through the lens 101 to the sensor 103-2 of the MS sensor 103.
- the sensor 103-2 is an image sensor composed of sensing elements in which a plurality of pixels are two-dimensionally arrayed in a repeating pattern on the sensor surface.
- the MS sensor 103 detects the light that has passed through the MS filter 103-1 with the sensing element (sensor 103-2), and outputs the measurement signal (measurement data) corresponding to the amount of that light to the designated wavelength calculation processing unit 104.
- for example, the photosynthetic photon flux density (PPFD) may be sensed.
- photosynthesis of plants depends on the number of photons, which are the particles of light, and the PPFD value indicates the number of photons incident per unit area per unit time at wavelengths of 400 nm to 700 nm, the absorption band of chlorophyll.
- when such sensing is performed, RGB signals are required, and the MS filter 103-1 is assumed to combine filters that transmit wavelengths such as R (red), G (green), B (blue), and IR (infrared light).
- the MS filter 103-1 is a filter that transmits light of wavelengths A to H, and the pixel of the MS sensor 103 is a sensor that receives the transmitted light.
- here, one unit pixel is configured by eight pixels as shown in FIG. 5, and the description will be continued assuming that each pixel receives light of a different wavelength. In the following description, "pixel" by itself denotes one pixel within a unit pixel, and "unit pixel" denotes the group of eight pixels.
- One unit pixel is a sensor that receives light of wavelengths A to H.
- for example, for a sensor that receives light of 400 nm to 750 nm, wavelength A can be set to 400 nm, wavelength B to 450 nm, wavelength C to 500 nm, wavelength D to 550 nm, wavelength E to 600 nm, wavelength F to 650 nm, wavelength G to 700 nm, and wavelength H to 750 nm.
- the number of pixels included in one unit pixel here is an example and is not a limitation; for example, a unit pixel may be configured by 4 pixels in a 2 × 2 arrangement or by 16 pixels in a 4 × 4 arrangement.
- the range of wavelengths received by one unit pixel and the wavelength received by each pixel are not limited to the above example, and wavelengths capable of appropriately sensing the measurement object can be set.
- the wavelength is not limited to visible light, and may be infrared light, ultraviolet light, or the like.
- the MS filter 103-1 transmitting light of a plurality of wavelengths is an optical filter transmitting narrow band light of a predetermined narrow wavelength band (narrow band).
- a camera using such a narrow band filter or the like may be referred to as a multispectral camera or the like, and the multispectral camera can also be used in the present technology.
- the description of the multispectral camera is added.
- narrow band filters can be provided independently of each other.
- the sensor output can be predicted without solving the inverse matrix.
- as an optical filter for the case where the wavelengths of light passing through the MS filter 103-1 overlap, a kind of metal thin-film filter using a thin film made of a metal such as aluminum can be applied.
- as an optical filter for the case where the overlap between the wavelengths of light passing through the MS filter 103-1 is small, a filter in which a thin film is formed on the sensor surface and which utilizes the principle of Fabry-Perot resonance can be applied.
- the multi filter 151 is a filter that transmits different light not for each pixel but for each pixel group.
- the areas A to H each have a size of a × b pixels, and the corresponding areas of the multi camera 152 are configured to have the same size as the areas A to H (a × b pixels are arranged).
- in the multi lens array 161, convex lenses are two-dimensionally arranged, and a light beam incident on the multi lens array 161 forms light source images two-dimensionally on the multi filter 162 (each lens constituting the multi lens array 161 forms a light source image).
- the multi filter 162 is a filter divided into a plurality of areas, in this case into areas each corresponding to one lens of the multi lens array 161. Alternatively, the multi filter 162 may be a filter that transmits a predetermined wavelength for each pixel, as in the MS filter 103-1 of FIG. 5.
- the multi filter 151 (162) may be a filter having the characteristics as shown in FIG. 6 or a filter having the characteristics as shown in FIG. 7. Furthermore, an ultra narrow band multi filter 151 (162) having the characteristics as shown in FIG. 10 may be used.
- a filter having characteristics as shown in FIG. 10 is called a dichroic filter or the like, and can be constituted by a multilayer film of dielectrics having different refractive indexes.
- the present technology can be applied to (used with) any of these multi-cameras.
- in the following, the case where a filter having the characteristics as shown in FIG. 6 is used is taken as an example, and the description will be continued for the case where a multispectral camera comprising the MS filter 103-1 and the sensor 103-2 as shown in FIG. 5 is used.
- the signal (image signal) from the MS sensor 103 is supplied to the designated wavelength calculation processing unit 104.
- the designated wavelength calculation processing unit 104 executes processing for generating an image of a desired light wavelength using the supplied image signal.
- the MS sensor 103 is a sensor that receives light of a plurality of different wavelengths, and an image obtained from each sensor is supplied to the designated wavelength calculation processing unit 104.
- that is, an image composed of the outputs of the pixels that receive light of wavelength A (referred to as image A), an image composed of the outputs of the pixels that receive light of wavelength B (image B), an image composed of the outputs of the pixels that receive light of wavelength C (image C), an image composed of the outputs of the pixels that receive light of wavelength D (image D), an image composed of the outputs of the pixels that receive light of wavelength E (image E), an image composed of the outputs of the pixels that receive light of wavelength F (image F), an image composed of the outputs of the pixels that receive light of wavelength G (image G), and an image composed of the outputs of the pixels that receive light of wavelength H (image H) are supplied to the designated wavelength calculation processing unit 104.
- a plurality of images can be obtained at one time by receiving a plurality of different wavelengths.
- the image obtained at one time is appropriately described as a multispectral image.
- the multispectral image is a plurality of spectral images extracted according to the characteristics of the multispectral filter (MS filter 103-1 described above).
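- as an illustrative aside that is not part of the patent text, a minimal sketch of how the individual spectral images could be separated from the raw mosaic output of the MS sensor 103 is shown below; the 2 × 4 unit-pixel layout and the labels A to H are assumptions made only for this example.

```python
import numpy as np

# Minimal sketch (assumption: each unit pixel is a 2 x 4 block whose positions
# correspond to wavelengths A to H). Splits the raw mosaic into eight planes.
UNIT_ROWS, UNIT_COLS = 2, 4
LABELS = "ABCDEFGH"

def split_multispectral(raw: np.ndarray) -> dict[str, np.ndarray]:
    """raw: (height, width) mosaic; returns one sub-sampled plane per wavelength."""
    planes = {}
    for idx, label in enumerate(LABELS):
        r, c = divmod(idx, UNIT_COLS)                  # position inside the unit pixel
        planes[label] = raw[r::UNIT_ROWS, c::UNIT_COLS]
    return planes

raw_frame = np.random.default_rng(0).random((480, 640))   # stand-in for one capture
spectral_images = split_multispectral(raw_frame)           # spectral_images["A"] is image A
```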
- the designated wavelength calculation processing unit 104 either uses the multispectral image supplied from the MS sensor 103 as it is, extracts (generates) an image of an arbitrary wavelength by performing an inverse matrix calculation using the multispectral image, or uses both the multispectral image and an image of an arbitrary wavelength generated by calculation.
- the image used by the designated wavelength calculation processing unit 104 is supplied to the reference image generation processing unit 105 or the inspection wavelength image extraction unit 106.
- the reference image generation processing unit 105 generates a reference image.
- the reference image is an image used when generating the information (stitch reference information 303 described later) that serves as the basis for the stitch processing. The reference image is the image in which the features of the subject appear most strongly. The generation of the reference image and the subsequent stitch processing will be described later.
- the reference image generated by the reference image generation processing unit 105 is supplied to the reference image stitch processing unit 107.
- the inspection wavelength image extraction unit 106 generates an inspection image. For example, when it is desired to generate an image for examining the growth condition of a plant, an image of a wavelength (inspection wavelength) suitable for sensing vegetation is generated (extracted). The inspection image extracted by the inspection wavelength image extraction unit 106 is supplied to the inspection image stitch processing unit 108.
- the reference image stitch processing unit 107 performs stitch processing on the reference image.
- An image generated by the reference image being stitched (information of such an image) is used when the inspection image is stitched.
- an image (information) generated by the reference image being stitched is described as a stitching reference map or stitch reference information.
- the generated stitch reference information is supplied to the inspection image stitch processing unit 108.
- the inspection image stitch processing unit 108 is also supplied with the inspection image from the inspection wavelength image extraction unit 106.
- the inspection image stitch processing unit 108 stitches the inspection image based on the stitch reference information, and generates an inspection image after stitching.
- a flight plan is made. For example, by moving the movement measurement device 201 on a predetermined field 211, when a plant or the like in the field 211 is imaged as a measurement target, a flight plan as shown at time T1 in FIG. 11 is established.
- the movement measurement device 201 is, for example, an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle), which flies by rotating propeller-like rotary wings 202 and images an object to be measured, such as the plants of the field 211, from the sky.
- An illuminance sensor 12 (FIG. 1) is attached to the surface of the movement measuring device 201 capable of receiving sunlight at the time of aerial photographing, and is configured to be able to sense an illuminance value. Further, at the time of aerial photographing, the imaging device 11 is attached to the surface of the movement measuring device 201 facing the field 211 side.
- the movement measurement device 201 performs aerial imaging by radio control based on the flight plan, or stores the flight plan in advance as coordinate data and flies autonomously using position information such as GPS (Global Positioning System) to perform the imaging.
- actual aerial imaging of the field 211 is then performed based on the flight plan.
- this aerial imaging is performed such that there is overlap between the captured images.
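- purely as an illustration (the patent does not prescribe any particular planning method), a flight plan of the kind described above, in which adjacent captures overlap by a fixed ratio, could be sketched as follows; the field size, image footprint, and overlap ratio are hypothetical values.

```python
# Minimal flight-plan sketch: capture positions over a rectangular field so that
# adjacent images overlap. All dimensions below are hypothetical example values.

def plan_waypoints(field_w, field_h, footprint_w, footprint_h, overlap=0.3):
    """Return (x, y) image-center positions, flown line by line (boustrophedon)."""
    step_x = footprint_w * (1.0 - overlap)      # spacing between captures on a line
    step_y = footprint_h * (1.0 - overlap)      # spacing between flight lines
    waypoints, y, row = [], footprint_h / 2, 0
    while y - footprint_h / 2 < field_h:
        xs, x = [], footprint_w / 2
        while x - footprint_w / 2 < field_w:
            xs.append(x)
            x += step_x
        if row % 2 == 1:                        # reverse every other line
            xs.reverse()
        waypoints += [(xi, y) for xi in xs]
        y, row = y + step_y, row + 1
    return waypoints

# Example: 100 m x 60 m field, 20 m x 15 m footprint per image, 30 % overlap.
print(len(plan_waypoints(100.0, 60.0, 20.0, 15.0)))
```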
- an aerial imaging result is obtained.
- an overlapping portion exists between the captured images. This overlapping portion is trimmed or stitched, for example by keeping one side and deleting the other, or by combining the two.
- an image is generated in which a plurality of images are arranged such that there is no overlapping portion.
- the imaging device 11 to which the present technology is applied includes the MS sensor 103, receives light of wavelengths A to H, and generates images A to H.
- a case where 500 images are captured by the MS sensor 103 as described above will be described with reference to FIG. 13.
- from the pixels of the MS sensor 103 that receive light of wavelength A (hereinafter referred to as pixels A; the same applies to the other pixels), 500 images, namely the images 301A-1 to 301A-500, are obtained as the aerial imaging result.
- 500 images are obtained from each of the pixels A to H.
- as the aerial imaging result at time T3, for example, an image of the field 211 formed of the images 301A-1 to 301A-500, an image of the field 211 formed of the images 301B-1 to 301B-500, and so on are generated. If the image to be finally obtained is an image based on the wavelength A, trimming and stitching are performed using the images 301A-1 to 301A-500.
- if the images to be finally obtained are images based on each of the wavelengths A to H, processing is performed for each wavelength using each of the images 301A to 301H: trimming and stitching are performed using the images 301A-1 to 301A-500, trimming and stitching are performed using the images 301B-1 to 301B-500, and so on.
- if the image processing system 10 as shown in FIG. 1 is constructed with a configuration other than that of the image processing system 50 as shown in FIG. 3, the amount of data transferred from the imaging device 11 to the arithmetic device 14 becomes large and the time for this transfer may be long. In addition, the amount of computation by the arithmetic device 14 may increase, and its processing time may lengthen. In other words, with a configuration other than that of the image processing system 50 as shown in FIG. 3, there is a high possibility that the data transfer and the processing will take a long time.
- since the images captured at the same time are images captured by the MS sensor 103 at the same time, they capture the same part. That is, for example, the images 301A-1 to 301H-1 captured at time t1 are images obtained by capturing the same part at different wavelengths. As described above, the reference image 302 is generated from a plurality of images 301 obtained by imaging the same part.
- similarly, the reference image 302-2 is generated from the images 301A-2 to 301H-2 captured at time t2, and the reference image 302-3 is generated from the images 301A-3 to 301H-3 captured at time t3.
- the same process is repeated to generate reference images 302-500. That is, reference images 302-1 to 302-500 are generated.
- the reference image 302 is the image in which the characteristics of the subject appear most strongly.
- for example, if a building is captured, a characteristic image is one in which the portion of the building is extracted as an edge. Stitching can therefore be performed by superimposing images so that their characteristic portions coincide.
- a characteristic portion (for example, an area in which an edge is detected; hereinafter referred to as a characteristic area as appropriate) is overlapped, for example, at a part where a building is present.
- FIG. 15 illustrates the same state as the aerial imaging result shown at time T3 in FIG.
- a reference image 302-100 is disposed on the right of the reference image 302-1 with an overlap.
- the reference images 302-150 are arranged on the right of the reference images 302-100 with an overlap.
- the reference images 302-200 are arranged on the right of the reference images 302-150 with an overlap.
- the reference image 302-300 is disposed below the reference image 302-1 with an overlap.
- the reference images 302-400 are arranged on the right of the reference images 302-300 with an overlap.
- the reference images 302-450 are arranged on the right of the reference images 302-400 with an overlap.
- a part of the reference image 302-1 is disposed at the upper right of the stitching reference map 303, and a part of the reference image 302-100 is adjacent to the right of the reference image 302-1. Be placed.
- a part of the reference image 302-150 is disposed to the right of the reference image 302-100, and a part of the reference image 302-200 is disposed to the right of the reference image 302-150.
- a part of the reference image 302-300 is disposed below the reference image 302-100, and a part of the reference image 302-400 is disposed next to the right of the reference image 302-300.
- a part of the reference image 302-450 is arranged to the right of the reference image 302-400.
- stitching reference map 303 is generated.
- the stitching reference map 303 is information on positioning of the reference images 302.
- the reference image 302 is an image generated from the plurality of images 301, for example, as described with reference to FIG.
- the reference image 302-1 is generated from the images 301A-1 to 301H-1
- the reference image 302-2 is generated from the images 301A-2 to 301H-2.
- the images 301A-1 to 301H-1 are, for example, images obtained by imaging a predetermined imaging area A
- the images 301A-2 to 301H-2 are images obtained by imaging a predetermined imaging area B, for example.
- the imaging area A and the imaging area B are at least partially overlapping areas.
- that is, the reference image 302-1 for the imaging area A is generated based on the plurality of images 301A-1 to 301H-1 of the imaging area A, and the reference image 302-2 for the imaging area B, which at least partially overlaps the imaging area A, is generated based on the plurality of images 301A-2 to 301H-2 of the imaging area B.
- then, a stitching reference map 303 (stitch reference information 303), which is positioning information indicating the correspondence between the imaging area A and the imaging area B, is generated.
- the positioning information may be information indicating the relative positional deviation when the reference image 302-1 and the reference image 302-2 are superimposed.
- when the stitching reference map 303 (stitch reference information 303) has been generated, an inspection image of a desired wavelength is mapped based on the stitching reference map 303.
- the inspection image of wavelength A is mapped based on the stitching reference map 303
- as the images of wavelength A, as described with reference to FIG. 13, the images 301A-1 to 301A-500 have been acquired.
- the reference image 302-1 is a reference image 302 generated from the images 301A-1 to 301H-1 captured at time t1 at which the image 301A-1 is captured.
- the image 301 used when generating the reference image 302 is mapped.
- the image 301 has the same shape as the reference image 302 in the stitching reference map 303, and the corresponding region is cut out and arranged.
- the stitching reference map 303 (stitch reference information 303) has written therein information indicating where to place the image 301 captured at what time and in what shape (size) to cut out.
- a part of the image 301A-100 is arranged in a portion of the stitching reference map 303 in which the reference image 302-100 is arranged.
- a part of the image 301A-150 is arranged in the portion of the stitching reference map 303 where the reference image 302-150 is arranged, and a part of the image 301A-200 is arranged in the portion where the reference image 302-200 is arranged.
- likewise, a part of the image 301A-300 is arranged in the portion of the stitching reference map 303 where the reference image 302-300 is arranged, a part of the image 301A-400 is arranged in the portion where the reference image 302-400 is arranged, and a part of the image 301A-450 is arranged in the portion where the reference image 302-450 is arranged.
- in this manner, the stitching reference map 303 (stitch reference information 303) is created, and the inspection image of the desired wavelength is mapped based on it, whereby the final inspection image of the desired wavelength can be generated.
- here, the case where the final inspection image of wavelength A is generated has been taken as an example, but when the final inspection image of wavelength B is generated, the same stitching reference map 303 (stitch reference information 303) is used in the same way. Furthermore, the final inspection images of the wavelengths C to H can also be generated as described above using the same stitching reference map 303 (stitch reference information 303).
- the stitching reference map 303 (stitch reference information 303) is generated, the final inspection image of the desired wavelength can be generated using the stitching reference map 303 (stitch reference information 303).
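- as an illustration that is not taken from the patent, the stitch reference information 303 could be represented and reused for the images of every wavelength roughly as follows; the record layout (capture time index, cut-out rectangle, placement position) is an assumption, since the patent only states that the map records which image is used, what shape is cut out, and where it is placed.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class StitchEntry:
    time_index: int                   # which capture time t the image comes from
    src: tuple[int, int, int, int]    # (top, left, height, width) region to cut out
    dst: tuple[int, int]              # (top, left) placement in the final image

def apply_stitch_reference(entries, images_by_time, out_shape):
    """Map one wavelength's images onto the final inspection image."""
    final = np.zeros(out_shape, dtype=np.float64)
    for e in entries:
        top, left, h, w = e.src
        patch = images_by_time[e.time_index][top:top + h, left:left + w]
        dt, dl = e.dst
        final[dt:dt + h, dl:dl + w] = patch
    return final

# The same entries are applied to the image stacks of wavelengths A to H (or of a
# computed wavelength X), so the stitch geometry only has to be determined once.
```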
- it is also possible to use, as the inspection image, an image other than one acquired directly as a multispectral image, and to generate the final inspection image from that image.
- as described above, images corresponding to the wavelengths A to H (the images 301A to 301H) are acquired using the MS sensor 103, which has pixels that respectively receive the wavelengths A to H.
- it is also possible to generate an image X of a wavelength (referred to as wavelength X) that is not directly obtained from the MS sensor 103, and to use that image X as the inspection image to generate the final inspection image.
- the sunlight 402 from the sun 401 irradiates the plant 403 and the road 404, and also irradiates the illuminance sensor 12.
- the illuminance sensor 12 measures the illuminance of the irradiated sunlight 402 and acquires the illuminance value.
- the acquired illuminance value represents, for example, the spectral characteristics of the sunlight 402 as shown in FIG. 19.
- the graph shown in FIG. 19 shows the spectral characteristics of the sunlight 402, the horizontal axis shows the wavelength of light, and the vertical axis shows the intensity of light.
- a part of the sunlight 402 irradiated to the plant 403 is reflected by the plant 403, and the reflected light 405 is received by the imaging device 11. Further, part of the sunlight 402 irradiated to the road 404 is reflected by the road 404, and the reflected light 406 is received by the imaging device 11.
- FIG. 20 shows a graph showing the relationship between wavelength and reflectance when vegetation and concrete are to be measured, respectively.
- the horizontal axis indicates the wavelength
- the vertical axis indicates the reflectance.
- the graph shown by the solid line represents the result obtained when the object to be measured is vegetation, and the graph shown by the dotted line represents the result obtained when the object to be measured is concrete (here, a structure such as the road 404).
- the reflectance when light of the same wavelength is irradiated differs between the plant 403 and the road 404 (concrete).
- the reflectance of the concrete shown by the dotted line in FIG. 20 is higher than the reflectance of the vegetation shown by the solid line.
- the reflectance of the concrete shown by the dotted line in FIG. 20 is substantially constant regardless of the wavelength of light, whereas the reflectance of the vegetation shown by the solid line changes steeply at a specific wavelength. It can be read from FIG. 20 that, when a plant is to be measured, the reflectance is high for light with a wavelength of 700 nm or more.
- thus, the reflected light 405 from the plant 403 and the reflected light 406 from the road 404 differ, and when the reflectance is measured, such results are obtained.
- the wavelength of light with the highest reflectance differs depending on the growth condition.
- an image obtained when imaging at the wavelength X suitable for the object to be measured is generated.
- an image of the desired wavelength X is generated by performing inverse matrix operation using the multispectral image obtained from the MS sensor 103.
- the image of the desired wavelength X can be expressed using the reflection spectral characteristic of the object to be measured, and the following relational expression (1) holds for each wavelength λ:
- (Spectral characteristic of light source L(λ)) × (Spectral characteristic of subject P(λ)) × (Spectral characteristic of imaging system S(λ)) = (Image O(λ))   (1)
- the spectral characteristics of the light source are the spectral characteristics obtained from the illuminance sensor 12, for example, the spectral characteristics of the sunlight 402 as shown in FIG.
- the spectral characteristic of the light source may use any value. In other words, not the spectral characteristics obtained from the illuminance sensor 12 but the spectral characteristics set in advance may be used.
- the spectral characteristics of the subject are the spectral characteristics obtained from the reflected light from the subject. For example, the spectral characteristics of the reflected light when sunlight is irradiated to the plant or concrete as shown in FIG. 20 and reflected. It is.
- the spectral characteristics of the imaging system are the spectral characteristics of the MS sensor 103, for example, the spectral characteristics as shown in FIG.
- the MS sensor 103 is, for example, a sensor that receives signals of eight different wavelengths in one unit pixel.
- since the MS sensor 103 is composed of the combination of the MS filter 103-1 and the sensor 103-2, the spectral characteristics of the MS sensor 103 are characteristics affected by both the transmittance of the MS filter 103-1 and the sensitivity of the sensor 103-2; specifically, they are the values obtained by multiplying the transmittance of the MS filter 103-1 by the sensitivity of the sensor 103-2. The sensitivity of the sensor 103-2 is normalized with respect to a reference sensitivity.
- the spectral characteristics of the MS sensor 103 are spectral characteristics for each of the wavelengths A to H as shown in FIG.
- from expression (1), expression (2) for obtaining the spectral characteristic of the subject can be derived.
- that is, the reflection spectral characteristic P(λ) of the subject can be obtained by expression (2), an inverse matrix operation using the spectral characteristic of the light source, the spectral characteristic of the imaging system, and the captured image O(λ).
- an image of a desired wavelength can be generated, and the image can be used as an inspection image and mapped based on the stitching reference map 303 to generate a final inspection image.
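- as a numerical illustration only, expressions (1) and (2) can be sketched as follows for one pixel; the eight band centers, the Gaussian model of the filter curves, and the example reflectance spectrum are assumptions made here for demonstration and are not values given in the patent.

```python
import numpy as np

n_bands = 8                                    # wavelengths A to H
lam = np.linspace(400.0, 750.0, n_bands)       # nm, sample grid (assumption)

# Spectral characteristic of the light source L(lambda), e.g. from the
# illuminance sensor 12 (here a flat spectrum is used as a placeholder).
L = np.ones(n_bands)

# Spectral characteristic of the imaging system S(lambda): filter transmittance
# times normalized sensor sensitivity, modelled here as Gaussian bands.
S = np.exp(-0.5 * ((lam[None, :] - lam[:, None]) / 40.0) ** 2)

M = S * L[None, :]                             # combined light-source x system matrix

P_true = np.linspace(0.1, 0.6, n_bands)        # hypothetical reflectance spectrum P(lambda)
O = M @ P_true                                 # expression (1): channel outputs O for one pixel

P_est = np.linalg.solve(M, O)                  # expression (2): inverse matrix operation
print(np.allclose(P_est, P_true))              # True: the reflectance spectrum is recovered

# With P(lambda) estimated per pixel, an image of an arbitrary wavelength X can
# be obtained by interpolating P at X for every pixel.
value_at_X = np.interp(690.0, lam, P_est)
```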
- in step S11, the reference image generation processing unit 105 executes the feature amount calculation processing.
- the reference image generation processing unit 105 is supplied with a multispectral image from the designated wavelength calculation processing unit 104.
- here, processing is performed using the multispectral image (image 301) acquired by the MS sensor 103 as it is.
- that is, the designated wavelength calculation processing unit 104 supplies the multispectral image from the MS sensor 103, for example the image 301 described above, to the reference image generation processing unit 105.
- the multispectral image may be supplied directly to the reference image generation processing unit 105 from the MS sensor 103 instead of the designated wavelength calculation processing unit 104.
- the feature amount calculation processing (first feature amount calculation processing) in step S11 will be described with reference to the flowchart in FIG. 23.
- in step S31, the reference image generation processing unit 105 extracts the feature amounts of the multispectral images.
- the feature amount is an index that represents the size of a feature of an image, as described later.
- the feature amount is an index indicating the degree to which a feature area (as described above, a feature of the image) is present in the image.
- the feature amount is used to select a reference image, and the feature region is used at the time of stitching processing described later.
- a spatial change amount of a pixel value can be used as a feature amount (an index indicating the size of a feature of an image).
- the spatial variation of the pixel value can be the difference between the pixel values of adjacent pixels.
- an edge area can be extracted by extracting an area where the difference value of the pixel value between the pixels is equal to or more than the threshold value.
- the spatial variation of the pixel value may be used as an index indicating the size of the feature of the image, and the variation may be calculated as the feature in step S31.
- as methods of extracting an edge region from an image, there are a method using a differential operation, a method using a high-pass filter, and the like.
- the edge area can be defined as an area in which the change in color or the like is sharp, and in other words, in the edge area, it can be said that the slope of the graph is steep.
- when a reference image is generated in the subsequent processing (for example, step S51 in FIG. 24), the feature amounts of the images are compared.
- the feature amount used in step S51 can be, for example, the sum of the differences between the pixel values of adjacent pixels calculated within one image.
- alternatively, the size (ratio) of the area extracted as an edge within one image is calculated as the feature amount.
- a feature point may be extracted and the feature point may be used.
- a feature point can be obtained, for example, by dividing the image into a plurality of regions, calculating the difference between the average pixel value of a region and each pixel value in the region using a calculation method such as the mean squared error, and taking as feature points the pixels whose difference (DIF) shows that they differ significantly from the pixel values of the surrounding pixels.
- when feature points are used as the index indicating the magnitude of the features of an image, the total number or the ratio of the feature points extracted from one image can be used as the feature amount.
- a statistical value (for example, a variance value) may be used as an index indicating the size of the feature of the image.
- for example, the image can be divided into a plurality of areas, a representative statistical value such as the variance within each area can be calculated, and that value (statistical value) can be used.
- when the variance within an area is calculated, the variance indicates the complexity of the pixel distribution in the area, and it becomes larger in an area containing an image such as an edge where the pixel values change rapidly. Therefore, the variance within an area can be obtained as an index indicating the magnitude of the features of the image, and the variance can be used as the feature amount.
- a statistical value is used as an index indicating the size of a feature of an image, for example, the sum of statistical values calculated from within one image, an average value, or the like can be used.
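- a minimal sketch of this variance-based feature amount, with an assumed block size, might look as follows (illustration only, not a specification from the patent).

```python
import numpy as np

def variance_feature(image: np.ndarray, block: int = 16) -> float:
    """Sum of per-block variances; larger where edges or texture dominate."""
    img = image.astype(np.float64)
    h, w = img.shape
    h, w = h - h % block, w - w % block                      # drop edge remainders
    blocks = img[:h, :w].reshape(h // block, block, w // block, block)
    return float(blocks.var(axis=(1, 3)).sum())              # variance within each block
```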
- in the following, the case where an edge is extracted from within the image and the ratio that the extracted edge area occupies in one image is used as the feature amount will be described as an example. Edge extraction will be described by taking the extraction of high-frequency components as an example.
- in step S31, extraction of high-frequency components (that is, edge extraction) is performed for each multispectral image in order to calculate the feature amount.
- that is, high-frequency component extraction is performed for each of the images 301A-1 to 301A-500, the images 301B-1 to 301B-500, the images 301C-1 to 301C-500, the images 301D-1 to 301D-500, the images 301E-1 to 301E-500, the images 301F-1 to 301F-500, the images 301G-1 to 301G-500, and the images 301H-1 to 301H-500, which are the multispectral images from the MS sensor 103.
- as the feature amount, the ratio of the extracted high-frequency components within one image is used, and this ratio is calculated in step S31. In addition, since the feature regions are compared when the stitch processing is performed on the reference images as described later, the information on the feature regions extracted as edge regions is stored appropriately while the feature amount is calculated.
- in step S31, high-frequency components are extracted from each of the images 301 to perform the feature amount calculation processing, and then the process proceeds to step S12 (FIG. 22).
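- the edge-ratio feature amount of step S31 can be sketched as follows; the Laplacian high-pass kernel and the threshold are assumptions chosen only for illustration.

```python
import numpy as np
from scipy import ndimage

def feature_amount(image: np.ndarray, threshold: float = 0.1):
    """Return (ratio of edge pixels in the image, boolean map of the feature region)."""
    img = image.astype(np.float64)
    img = img / (img.max() + 1e-12)                  # normalize to [0, 1]
    high_pass = ndimage.laplace(img)                 # extract high-frequency components
    feature_region = np.abs(high_pass) >= threshold  # pixels regarded as edge (feature) region
    return float(feature_region.mean()), feature_region

# One capture yields eight images of the same part (wavelengths A to H);
# a feature amount is computed for each of them.
rng = np.random.default_rng(0)
images_at_t1 = [rng.random((64, 64)) for _ in range(8)]
amounts = [feature_amount(img)[0] for img in images_at_t1]
```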
- in step S12, the reference image generation processing is performed.
- the reference image generation processing (first reference image generation processing) executed in step S12 will be described with reference to the flowchart in FIG. 24.
- in step S51, the reference image generation processing unit 105 compares the feature amounts of the respective images 301 calculated in the processing of step S11.
- then, in step S52, the reference image is generated using the comparison result.
- the comparison of the feature amounts in step S51 is performed between the images captured at the same time (the images obtained by capturing the same part), as described above. The image 301 having the largest feature amount (in which the features of the captured subject appear most strongly) is preferentially selected and used as the reference image.
- for example, the feature amounts (the ratios of the regions extracted as edges) of the images 301A-1 to 301H-1 are compared, and the image 301 having the largest feature amount is set as the reference image 302-1.
- if, for example, the image 301A-1 has the largest feature amount, the image 301A-1 is set as the reference image 302-1.
- in this manner, processing of comparing the feature amounts and setting the image 301 having the largest feature amount as the reference image 302 is performed.
- for the images captured at the other times, the image 301 having the largest feature amount is likewise set as the reference image 302.
- when the reference images have been generated, the processing proceeds to step S13 (FIG. 22).
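- building on the feature_amount() sketch above, the selection of steps S51 and S52 can be illustrated as follows: for each capture time, the spectral image with the largest feature amount becomes the reference image 302 (illustration only).

```python
def generate_reference_images(captures):
    """captures: list over time; each element is the list of 8 spectral images A to H."""
    reference_images = []
    for spectral_images in captures:                 # images of the same part, one per wavelength
        scored = [(feature_amount(img)[0], img) for img in spectral_images]
        _, best_image = max(scored, key=lambda s: s[0])
        reference_images.append(best_image)          # image with the largest feature amount
    return reference_images
```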
- in step S13, the stitch processing of the reference images is performed.
- the reference images 302-1 to 302-500 generated by the reference image generation processing unit 105 are supplied to the reference image stitch processing unit 107.
- the reference image stitching processing unit 107 performs stitching processing using the supplied reference images 302-1 to 302-500, and generates a stitching reference map 303 (stitch reference information 303).
- the stitch processing using the reference images 302-1 to 302-500 is performed as described above. That is, by detecting portions where the characteristic regions of the reference images 302-1 to 302-500 match (are similar) and performing superposition or trimming, an image (the stitching reference map 303) having no overlapping portion is generated.
- at this time, processing corresponding to the type of feature region being used is performed.
- for example, when edge regions are used, the stitch processing is performed such that the extracted edge regions are aligned with each other.
- when feature points are used, processing of matching the feature points is executed as the stitch processing.
- when variance values are used, a stitch processing of matching the variance values (aligning the images so that the arrangement of the areas having the same or similar variance values coincides) is performed.
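- as one concrete (but assumed) way to realize this matching of characteristic regions between two reference images, a generic feature-based alignment such as ORB keypoints with a RANSAC homography could be used; the patent itself does not prescribe a specific detector or matcher.

```python
import cv2
import numpy as np

def estimate_positioning(ref_a: np.ndarray, ref_b: np.ndarray) -> np.ndarray:
    """Return a 3x3 homography mapping ref_b onto ref_a (one form of positioning information).

    ref_a and ref_b are 8-bit grayscale reference images of overlapping areas.
    """
    orb = cv2.ORB_create(nfeatures=1000)             # keypoints in characteristic regions
    kp_a, des_a = orb.detectAndCompute(ref_a, None)
    kp_b, des_b = orb.detectAndCompute(ref_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_b, des_a), key=lambda m: m.distance)

    src = np.float32([kp_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # robust to mismatches
    return H
```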
- in step S14, an inspection image is generated.
- the inspection image is, for example, an image (an image generated by inverse matrix calculation from the imaged image) captured at a wavelength suitable for sensing the vegetation when the vegetation is inspected (sensed).
- An inspection image is generated by the designated wavelength calculation processing unit 104 and the inspection wavelength image extraction unit 106.
- for example, the designated wavelength calculation processing unit 104 supplies the multispectral images (one for each of a plurality of wavelengths) to the inspection wavelength image extraction unit 106, and the inspection wavelength image extraction unit 106 generates the inspection image by extracting, from the plurality of supplied images, the image corresponding to the wavelength designated as the inspection wavelength.
- alternatively, the designated wavelength calculation processing unit 104 uses the multispectral image to generate images of predetermined wavelengths based on expression (2) (see FIGS. 18 to 21), and the inspection wavelength image extraction unit 106 generates the inspection image by extracting, from the plurality of images generated and supplied by the designated wavelength calculation processing unit 104, the image corresponding to the wavelength designated as the inspection wavelength.
- The designated wavelength calculation processing unit 104 may also be configured to generate only the image of the wavelength to be extracted by the inspection wavelength image extraction unit 106. In that case, the image supplied to the inspection wavelength image extraction unit 106 is treated as the image corresponding to the wavelength designated as the inspection wavelength (no further extraction or generation is performed, and the supplied image is used as it is).
- Furthermore, the image of the designated wavelength may be generated by the inspection wavelength image extraction unit 106 instead of the designated wavelength calculation processing unit 104. In that case, the multispectral images are supplied from the designated wavelength calculation processing unit 104 to the inspection wavelength image extraction unit 106, and the inspection wavelength image extraction unit 106 generates the inspection image of the designated wavelength by performing the operation based on Expression (2) using the supplied multispectral images.
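- The exact form of Expression (2) is not reproduced here. The following is a minimal sketch of one way such an inverse matrix operation could be realized, assuming the spectral response of each MS sensor channel is known and using a per-pixel pseudo-inverse; the function image_at_wavelength and its parameters are hypothetical.

```python
import numpy as np

def image_at_wavelength(ms_stack, sensor_resp, target_resp):
    """Estimate an image of a designated wavelength from a multispectral stack.

    ms_stack    : (H, W, K) array of the K images obtained from the MS sensor.
    sensor_resp : (K, L) spectral response of each channel sampled at L wavelengths.
    target_resp : (L,) response of the designated wavelength band.
    """
    h, w, k = ms_stack.shape
    pixels = ms_stack.reshape(-1, k).astype(np.float64)      # (H*W, K)
    # Recover a coarse per-pixel spectrum with a pseudo-inverse (standing in
    # for the inverse matrix calculation of Expression (2)).
    spectra = pixels @ np.linalg.pinv(sensor_resp).T          # (H*W, L)
    # Re-project the recovered spectrum onto the designated wavelength band.
    return (spectra @ target_resp).reshape(h, w)
```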
- step S14 When an inspection image is generated in step S14, the process proceeds to step S15.
- the inspection image extracted by the inspection wavelength image extraction unit 106 is supplied to the inspection image stitch processing unit 108.
- the inspection image stitch processing unit 108 is also supplied with a stitching reference map 303 (stitch reference information 303) from the reference image stitch processing unit 107.
- the inspection image stitch processing unit 108 stitch-processes the inspection image using the stitch reference information 303 as described with reference to FIG. 17 to generate a final inspection image.
- the stitch reference information 303 is information such as which position of which image is cut out and how it is pasted together.
- The stitch reference information 303 is information including at least information indicating the positional relationship between an image obtained by capturing a predetermined area and an image obtained by capturing an area overlapping that area, and is information that can be handled as metadata attached to the images used in the stitching process.
- The inspection image stitch processing unit 108 refers to the stitch reference information 303 to determine which of the plurality of inspection images to use, which position (which region) of that inspection image to cut out, and how to combine the cut-out region with the regions similarly cut out from the other inspection images. By repeating such processing, the stitch processing of the inspection images is performed and the final inspection image is obtained.
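- As a concrete illustration, the assembly step could look like the following sketch, assuming the stitch reference information 303 has been reduced to a list of records, each naming an image index, a source region, and a destination offset; this record layout is an assumption made for the example.

```python
import numpy as np

def assemble_final_image(inspection_images, stitch_info, out_shape):
    """Paste the regions named by the stitch reference information onto a
    blank canvas to build the final inspection image (illustrative layout)."""
    canvas = np.zeros(out_shape, dtype=inspection_images[0].dtype)
    for rec in stitch_info:
        src = inspection_images[rec["image_index"]]
        y0, y1, x0, x1 = rec["src_region"]        # region to cut out
        dy, dx = rec["dst_offset"]                # where it is placed in the result
        canvas[dy:dy + (y1 - y0), dx:dx + (x1 - x0)] = src[y0:y1, x0:x1]
    return canvas

# The same stitch_info can be reused for inspection images of every wavelength.
```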
- the final inspection image is used by being mapped onto a map that has already been generated.
- In this way, the stitching reference map 303 (stitch reference information 303) is generated once, and the stitch processing of the inspection images of a plurality of wavelengths is performed based on it, so the time required for the stitch processing can be shortened and the processing load can be reduced.
- In the second generation process as well, the image processing system 50 performs processing based on the flowchart shown in FIG. 22.
- However, the feature amount calculation process in step S11 is performed based on the flowchart of the second feature amount calculation process illustrated in FIG. 25; the other processes are performed in the same manner as described above. A description is therefore added for the processing that differs, and the description of the processing performed in the same manner is omitted.
- step S101 a reduced image is generated.
- Taking the image 301 as the base, a reduced image 301ab is generated by multiplying the height of the image 301 by 1/A and the width by 1/B. Likewise, a reduced image 301mn is generated by multiplying the height of the image 301 by 1/M and the width by 1/N.
- For example, a reduced image 301ab in which the height and width of the image 301 are each halved is generated, and a reduced image 301mn in which the height and width of the image 301 are each multiplied by 1/5 is generated.
- step S101 When a reduced image is generated in step S101, the process proceeds to step S102.
- step S102 high frequency components are extracted (edges are extracted) (a feature amount is calculated). This process can be performed in the same manner as the process of step S31 of the flowchart shown in FIG.
- The images from which high frequency components are extracted are the original image and its reduced images; for example, high frequency components are extracted from each of the image 301, the image 301ab, and the image 301mn, and a feature amount (such as the ratio of edge areas) is calculated from each extraction result.
- When calculating the feature amounts, processing such as multiplying the feature amounts of the reduced images by a predetermined coefficient may be performed as appropriate so that they can be compared under the same conditions as the original image.
- For example, when the sum of feature points or the sum of statistical values is used as the feature amount, the number of target pixels (the number of areas) differs between the original image and the reduced images, so comparing the feature amount of the original image with that of a reduced image as they are may give an erroneous comparison result.
- In such a case, the feature amount calculation may include a process of multiplying the calculated feature amount by a predetermined coefficient, such as the reduction ratio, to convert it into a feature amount that can be treated as if it had been calculated under the same conditions (the same number of pixels and areas) as the original image.
- The description will be continued assuming that the process of calculating the feature amount by extracting high frequency components in step S102 includes such processing for treating the feature amounts calculated from the original image and from the reduced images as feature amounts calculated under the same conditions.
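- A minimal sketch of steps S101 and S102 is shown below, assuming a Canny edge count as the high frequency feature amount and the area ratio of the reduction as the correction coefficient; both choices are assumptions made for illustration.

```python
import cv2
import numpy as np

def edge_feature(img, threshold=50):
    """Feature amount: number of pixels extracted as edges (high frequency)."""
    edges = cv2.Canny(img, threshold, threshold * 2)
    return int(np.count_nonzero(edges))

def features_with_reduction(img, scales=((2, 2), (5, 5))):
    """Compute feature amounts for the original image and its reduced images,
    scaling the reduced-image values back to the original-image condition."""
    results = {"original": float(edge_feature(img))}
    for a, b in scales:
        reduced = cv2.resize(img, (img.shape[1] // b, img.shape[0] // a))
        coeff = float(a * b)  # correction coefficient for the smaller pixel count
        results[f"1/{a} x 1/{b}"] = edge_feature(reduced) * coeff
    return results
```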
- step S11 when the feature amount calculation process (step S11 in FIG. 22) is completed, the process proceeds to step S12.
- step S12 reference image generation processing is performed.
- step S12 The reference image generation process performed in step S12 is performed based on the flowchart of the first reference image generation process shown in FIG. 24; since it has already been described, its description is omitted here.
- In step S51, the feature amounts are compared, but the images whose feature amounts are compared differ from those in the first generation process (the first reference image generation process) described above, so a description of this point is added with reference to FIG. 27.
- When the reference image 302-1 is generated from the images 301-1 captured at time t1, as shown in FIG. 27, a feature amount is first calculated from each of the image 301A-1 of wavelength A, its reduced image 301Aab-1, and its reduced image 301Amn-1.
- Similarly, a feature amount is calculated from each of the image 301B-1 of wavelength B, its reduced image 301Bab-1, and its reduced image 301Bmn-1, and from each of the image 301C-1 of wavelength C, its reduced image 301Cab-1, and its reduced image 301Cmn-1.
- A feature amount is likewise calculated from each of the image 301D-1 of wavelength D and its reduced images 301Dab-1 and 301Dmn-1, from each of the image 301E-1 of wavelength E and its reduced images 301Eab-1 and 301Emn-1, from each of the image 301F-1 of wavelength F and its reduced images 301Fab-1 and 301Fmn-1, from each of the image 301G-1 of wavelength G and its reduced images 301Gab-1 and 301Gmn-1, and from each of the image 301H-1 of wavelength H and its reduced images 301Hab-1 and 301Hmn-1.
- From these feature amounts, the image having the largest feature amount is detected. That is, in the second generation process, the reduced images are also included in the processing and the comparison.
- When a reduced image is detected as having the largest feature amount, the original image of that reduced image is set as the reference image 302-1.
- For example, when the reduced image 301Amn-1 is detected as having the largest feature amount, the image 301A-1, which is the original image of the reduced image 301Amn-1, is set as the reference image 302-1. In other words, the image 301A-1 is set as the reference image 302-1.
- By also processing the reduced images in this way, an image with more features can be extracted and used as the reference image.
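- The selection itself can be summarized by the sketch below, assuming that for every wavelength the original image and its reduced images are available as pairs of (candidate image, original image); the helper name select_reference_image is hypothetical.

```python
def select_reference_image(candidates_by_wavelength, feature_fn):
    """candidates_by_wavelength: dict mapping a wavelength label to a list of
    (candidate, original) pairs, where the candidate is the original image or
    one of its reduced images.  Returns the wavelength and the ORIGINAL image
    of whichever candidate has the largest feature amount."""
    best_score, best_wavelength, best_original = None, None, None
    for wavelength, pairs in candidates_by_wavelength.items():
        for candidate, original in pairs:
            score = feature_fn(candidate)
            if best_score is None or score > best_score:
                best_score, best_wavelength, best_original = score, wavelength, original
    return best_wavelength, best_original
```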
- When the reference images have been generated, the processing proceeds to step S13 (FIG. 22). Since the processing after step S13 has already been described, its description is omitted.
- In the third generation process as well, the image processing system 50 performs processing based on the flowchart shown in FIG. 22, with the feature amount calculation process in step S11 performed as follows.
- step S201 a reduced image is generated.
- This process can be performed in the same manner as the process of step S101 of the flowchart shown in FIG. 25 in the second generation process.
- Here, the case where a reduced image is generated, that is, the case where the third generation process is combined with the second generation process, is described as an example, but the third generation process may instead be combined with the first generation process.
- When it is combined with the first generation process, the multispectral images (original images) supplied from the MS sensor 103 may be used as they are, without generating reduced images in step S201.
- step S202 a process of dividing into blocks is performed.
- the process of dividing into blocks will be described with reference to FIG.
- The left view is the same as the view shown in FIG. 26 and illustrates how the reduced images are generated.
- The reduced image 301ab is generated by multiplying the height of the original image 301 by 1/A and the width by 1/B, and the reduced image 301mn is generated by multiplying the height of the original image 301 by 1/M and the width by 1/N.
- the process of dividing into blocks is performed on both the original image and the reduced image.
- the original image 301 is divided into C ⁇ D blocks by dividing the height into C and dividing the width into D.
- the reduced image 301ab is divided into (C / A) ⁇ (D / B) blocks by dividing the height into C / A and dividing the width into D / B.
- the reduced image 301 mn is divided into (C / M) ⁇ (D / N) blocks by dividing the height into C / M and dividing the width into D / N.
- By dividing the images in this way, every block has the same number of vertical and horizontal pixels, so the extraction and comparison of feature amounts in the subsequent processing can be performed easily.
- For example, suppose that one block of the original image 301 is 100 × 100 pixels after division. One block of the reduced image 301ab is then also 100 × 100 pixels; that is, the block size of the original image 301 and that of the reduced image 301ab are the same. As described above, when the number of pixels in every block is the same, the extraction and comparison of feature amounts in the subsequent processing can be performed easily.
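- A minimal sketch of the block division is given below; the grid values used in the comment (C = D = 10 for a 1000 × 1000-pixel original with A = B = 2) are illustrative numbers chosen so that every block is 100 × 100 pixels, matching the example above.

```python
def split_into_blocks(img, rows, cols):
    """Divide an image into rows x cols blocks of equal pixel size."""
    h, w = img.shape[:2]
    bh, bw = h // rows, w // cols
    return [img[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            for r in range(rows) for c in range(cols)]

# An original image (1000 x 1000) divided into C x D = 10 x 10 blocks and its
# 1/2 x 1/2 reduced image (500 x 500) divided into (C/A) x (D/B) = 5 x 5 blocks
# both yield 100 x 100-pixel blocks, so per-block feature amounts compare directly.
```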
- step S202 such division processing is performed for each multispectral image.
- division processing is performed on an image of every eight wavelengths of wavelengths A to H.
- When the third generation process is combined with the first generation process, the division processing is performed only on the original images 301 (the images 301A to 301H) of the multispectral image.
- step S203 high frequency components are extracted for each block.
- This process can basically be performed in the same manner as step S31 (first generation process) of the flowchart shown in FIG. 23 or step S102 (second generation process) of the flowchart shown in FIG. 25, the difference being that the processing is performed for each block.
- When the reference image 302-1 is generated from the images 301-1 captured at time t1, as shown in FIG. 30, a feature amount is first calculated from each block of the image 301A-1 of wavelength A, the reduced image 301Aab-1, and the reduced image 301Amn-1.
- That is, a feature amount is calculated for each of the C × D blocks of the image 301A-1 of wavelength A, for each of the (C/A) × (D/B) blocks of the reduced image 301Aab-1, and for each of the (C/M) × (D/N) blocks of the reduced image 301Amn-1. Let X denote the total number of feature amounts obtained for one wavelength in this way, that is, X = C × D + (C/A) × (D/B) + (C/M) × (D/N).
- Similarly, X feature amounts are calculated from the blocks of the image 301B-1 of wavelength B, the reduced image 301Bab-1, and the reduced image 301Bmn-1; from the blocks of the image 301C-1 of wavelength C, the reduced image 301Cab-1, and the reduced image 301Cmn-1; from the blocks of the image 301D-1 of wavelength D, the reduced image 301Dab-1, and the reduced image 301Dmn-1; from the blocks of the image 301E-1 of wavelength E, the reduced image 301Eab-1, and the reduced image 301Emn-1; from the blocks of the image 301F-1 of wavelength F, the reduced image 301Fab-1, and the reduced image 301Fmn-1; from the blocks of the image 301G-1 of wavelength G, the reduced image 301Gab-1, and the reduced image 301Gmn-1; and from the blocks of the image 301H-1 of wavelength H, the reduced image 301Hab-1, and the reduced image 301Hmn-1.
- Since X feature amounts are calculated from the images of each of the eight wavelengths A to H, a total of (8 × X) feature amounts are calculated.
- the processing relating to the calculation of the feature amount is performed on all of the multispectral images supplied from the MS sensor 103.
- the feature amount is calculated for each block.
- the size of each block (the number of pixels included in one block) is the same as described above. Therefore, the feature quantities calculated from each block can be compared as they are.
- step S12 reference image generation processing is performed.
- the reference image generation process (third reference image generation process performed in the third generation process) performed in step S12 will be described with reference to the flowchart in FIG.
- step S231 The feature amounts of all blocks are compared. Then, in step S232, the blocks are sorted in descending order of feature amount. For example, as described with reference to FIG. 30, X feature amounts are calculated from the images of each of the eight wavelengths A to H, for a total of (8 × X) feature amounts. These (8 × X) feature amounts are compared and sorted in descending order.
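- A sketch of the comparison and sort is shown below, assuming the blocks are held in a dictionary keyed by (wavelength, scale); the data layout and the function name rank_blocks are assumptions made for illustration.

```python
def rank_blocks(blocks_by_source, feature_fn):
    """Return every block as (feature, wavelength, scale, block_index),
    sorted in descending order of feature amount (steps S231 and S232)."""
    ranked = []
    for (wavelength, scale), blocks in blocks_by_source.items():
        for index, block in enumerate(blocks):
            ranked.append((feature_fn(block), wavelength, scale, index))
    ranked.sort(key=lambda item: item[0], reverse=True)
    return ranked
```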
- step S233 In step S233, the feature image is reconstructed and a reference image is generated.
- The reconstruction of the feature image, that is, the generation of the reference image, will be described with reference to FIGS. 32 and 33.
- FIG. 32 is a diagram for describing reconstruction of a feature image in a case where the block with the highest feature amount is the reduced image 301Hmn-1 when the blocks are sorted in descending order of the feature amount.
- The reduced image 301Hmn-1 is the smallest of the reduced images generated from the original image 301H-1. If the feature amount of the reduced image 301Hmn-1 is the largest among the (8 × X) feature amounts, the reduced image 301Hmn-1 is set as the reference image 302-1.
- However, since the reduced image 301Hmn-1 has been reduced, when it is set as the reference image 302-1 it is first restored to its original size, in this case the size of the original image 301H-1, and the restored image is set as the reference image 302-1.
- In other words, the height of the reduced image 301Hmn-1 is multiplied by M and the width by N to restore the original size, and the resulting image is set as the reference image 302-1.
- Alternatively, the original image of the reduced image 301Hmn-1, that is, the original image 301H-1 in this case, may be set as the reference image 302-1.
- FIG. 32 is an example in which the original image 301 to be the reference image 302 can be generated in one block.
- a case where the original image 301 to be the reference image 302 is reconstructed with a plurality of blocks will be described with reference to FIG.
- the feature quantities extracted from the image of wavelength A and the image of wavelength B are high, and the regions with high feature quantities are cut out from the image of wavelength A and the image of wavelength B, respectively.
- An image (reference image) having a high feature amount is generated by positioning and combining these regions in the corresponding region.
- Each of the 12 blocks located near the center of the image 301A-1 of wavelength A is cut out of the image 301A-1 as a block having a high feature amount. The description will be continued assuming that these twelve blocks form the region 501.
- Since the image 301A-1 is not a reduced image, the region 501 is cut out as it is, and is set as the region of the reference image 302-1 at the position corresponding to where the region 501 was located in the image 301A-1.
- Each of the two blocks located at the lower left of the reduced image 301Aab-1, which is obtained by reducing the image 301A-1 of wavelength A, is cut out as a block having a high feature amount. The description will be continued assuming that these two blocks form the region 502.
- The region 502 is not cut out from the reduced image 301Aab-1 as it is; instead, the region of the original image 301A-1 at the position corresponding to where the region 502 is located in the reduced image 301Aab-1 is cut out.
- In this case, the region 502 is located at the lower left of the reduced image 301Aab-1, so a region of the corresponding size (in this case, eight blocks) is cut out from the lower left of the original image 301A-1.
- Alternatively, the region 502 may be enlarged A times vertically and B times horizontally, and the converted region 502 may be cut out (used).
- The region 502 restored to the original size is set as the region of the reference image 302-1 at the position corresponding to where the region 502 was located in the image 301A-1; in this case, the region 502 is set in the lower left area of the reference image 302-1.
- Likewise, each of the two blocks located slightly to the right of the center of the reduced image 301Aab-1 obtained by reducing the image 301A-1 of wavelength A is cut out as a block having a high feature amount. The description will be continued assuming that these two blocks form the region 503.
- The region 503 is cut out from the reduced image 301Aab-1 to become the region at the corresponding position of the reference image 302-1. As with the region 502 described above, the region 503 is first converted to the original size and then made part of the reference image 302-1.
- Each of the four blocks located at the upper right of the image 301B-1 of wavelength B is cut out of the image 301B-1 as a block having a high feature amount. The description will be continued assuming that these four blocks form the region 504.
- The region 504 is set as the region of the reference image 302-1 corresponding to the position where the region 504 was located in the image 301B-1 (in this case, the upper right).
- Each of the two blocks located at the upper left of the reduced image 301Bab-1, which is obtained by reducing the image 301B-1 of wavelength B, is cut out as a block having a high feature amount. The description will be continued assuming that these two blocks form the region 505.
- The region 505 is cut out from the reduced image 301Bab-1 to become the region at the corresponding position of the reference image 302-1. Since the reduced image 301Bab-1 has been reduced, the region 505 is converted to the original size in the same manner as the region 502 described above and is made part of the reference image 302-1.
- Similarly, each of the two blocks located at the lower right of the reduced image 301Bab-1 obtained by reducing the image 301B-1 of wavelength B is cut out as a block having a high feature amount. The description will be continued assuming that these two blocks form the region 506.
- The region 506 is cut out from the reduced image 301Bab-1 to become the region at the corresponding position of the reference image 302-1. Since the reduced image 301Bab-1 has been reduced, the region 506 is converted to the original size in the same manner as the region 502 described above and is made part of the reference image 302-1.
- In this way, blocks having high feature amounts are preferentially selected, and a reference image 302-1 composed of blocks having high feature amounts is generated.
- By processing each block in this way, an image with more features can be extracted and used as the reference image.
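- The reconstruction can be sketched as follows, assuming every block index can be mapped back to a position in the block grid of the original image; the bookkeeping (scale stored as (A, B), first-come-first-served filling of the grid) is an assumption made for the example.

```python
import numpy as np

def reconstruct_reference(ranked_blocks, originals, block_grid, block_px):
    """Fill the reference image with the highest-feature blocks available.

    ranked_blocks : output of rank_blocks(), best blocks first.
    originals     : dict mapping a wavelength label to its original image.
    block_grid    : (C, D), the block grid of the original images.
    block_px      : (height, width) in pixels of one block.
    """
    C, D = block_grid
    bh, bw = block_px
    reference = np.zeros((C * bh, D * bw), dtype=np.float32)
    filled = np.zeros((C, D), dtype=bool)
    for _, wavelength, scale, index in ranked_blocks:
        a, b = scale                      # (1, 1) for an original image
        r, c = divmod(index, D // b)      # position in the (C/a) x (D/b) grid
        r0, c0 = r * a, c * b             # corresponding position in the original grid
        if filled[r0:r0 + a, c0:c0 + b].any():
            continue                      # this part of the reference image is taken
        # A block of a reduced image covers a x b blocks of the ORIGINAL image,
        # so the region is always cut out of the original at original size.
        region = originals[wavelength][r0 * bh:(r0 + a) * bh, c0 * bw:(c0 + b) * bw]
        reference[r0 * bh:(r0 + a) * bh, c0 * bw:(c0 + b) * bw] = region
        filled[r0:r0 + a, c0:c0 + b] = True
        if filled.all():
            break
    return reference
```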
- When the reference image has been generated, the process proceeds to step S13 (FIG. 22). Since the processing from step S13 onward has already been described, its description is omitted.
- Next, as a fourth generation process, a process will be described in which images of other wavelengths are also generated using the multispectral images acquired by the MS sensor 103, and a plurality of images including the generated images is used.
- FIG. 34 is a flowchart for describing a fourth generation process of the final inspection image.
- the wavelength to be extracted is designated.
- the designated wavelength calculation processing unit 104 designates the wavelength of the image to be generated.
- The designated wavelength may be a wavelength set in advance, or a wavelength set with reference to the illuminance value or the mode. When wavelengths are set in advance, for example, wavelengths every 50 nm between 400 nm and 900 nm (400 nm, 450 nm, 500 nm, ..., 900 nm) can be designated sequentially.
- Alternatively, when a wavelength of light suitable for vegetation sensing is required, for example, the range from 650 nm to 850 nm may be designated in fine steps.
- step S301 When the designated wavelength calculation processing unit 104 designates (sets) a predetermined wavelength in step S301, an image of the designated wavelength is generated in step S302.
- When the designated wavelength is a wavelength of the multispectral images obtained from the MS sensor 103, the image 301 corresponding to the designated wavelength is extracted and supplied to the reference image generation processing unit 105.
- When the designated wavelength is not included in the multispectral images, an image corresponding to the designated wavelength is generated by an operation in which the data of the multispectral images is substituted into Expression (2), and the generated image is supplied to the reference image generation processing unit 105.
- step S303 the feature amount calculation process is executed by the reference image generation processing unit 105. Similar to the first to third generation processes, the feature quantity calculation process performed in step S303 can be performed in the same manner as the feature quantity calculation process performed in step S11 of the flowchart shown in FIG.
- the feature quantity computing process in the first generation process may be applied, and the feature quantity may be calculated based on the flowchart of the first feature quantity computing process shown in FIG.
- the feature amount may be calculated based on the flowchart of the second feature amount calculation process shown in FIG. 25 by applying the feature amount calculation process in the second generation process.
- the feature quantity computing process in the third generation process may be applied, and the feature quantity may be calculated based on the flowchart of the third feature quantity computing process shown in FIG.
- The images for which the feature amounts are calculated are the images supplied from the designated wavelength calculation processing unit 104 to the reference image generation processing unit 105, and include the multispectral images and the images of predetermined wavelengths generated from the multispectral images.
- step S304 it is determined whether the designation of the wavelength is completed. For example, when the designated wavelength is set in advance, the designated wavelength calculation processing unit 104 performs the determination process in step S304 by determining whether all the set wavelengths have been designated.
- step S304 If it is determined in step S304 that the designation of wavelengths has not ended, the process returns to step S301 and the subsequent processes are repeated. That is, the processing of steps S301 to S304 is repeated until all the wavelengths to be designated have been designated, all the corresponding images have been extracted or generated, and the feature amounts have been extracted from all of those images.
- step S304 when it is determined in step S304 that the wavelength instruction has ended, the process proceeds to step S305.
- step S305 the reference image generation processing unit 105 executes reference image generation processing.
- the reference image generation process performed in step S305 can be performed in the same manner as the reference image generation process performed in step S12 of the flowchart illustrated in FIG. 22 as in the first to third generation processes.
- the reference image generation processing in the first generation processing may be applied, and the reference image may be generated based on the flowchart of the reference image generation processing shown in FIG.
- Alternatively, the reference image generation processing in the second generation processing may be applied, that is, the reference image generation processing corresponding to the case where the feature amounts are calculated based on the flowchart of the second feature amount calculation processing shown in FIG. 25 may be performed.
- the reference image generation processing in the third generation processing may be applied, and the reference image may be generated based on the flowchart of the third reference image generation processing shown in FIG.
- In any case, the reference image is an image generated from the image group supplied from the designated wavelength calculation processing unit 104 to the reference image generation processing unit 105, that is, from the multispectral images and the images of predetermined wavelengths generated from the multispectral images.
- steps S306 to S308 can be performed in the same manner as the processes of steps S13 to S15 of the flowchart (first to third generation processes) illustrated in FIG. That is, since the process after the reference image is generated can be performed in the same manner as the first to third generation processes, the description thereof is omitted here.
- In the fourth generation process, an image whose feature amount is to be calculated is generated by designating a wavelength.
- The reflectance of the light reflected from a subject differs depending on the wavelength of the irradiated light.
- Depending on the wavelength, therefore, an image may capture the subject clearly or may capture it only faintly (in a blurred manner).
- It is difficult to extract the subject as a feature region, and hence to calculate feature amounts, from an image in which the subject is captured only faintly.
- Conversely, if an image of a wavelength suited to the subject is generated, an image in which the subject is easily detected, that is, an image from which the feature region is easily extracted, can be obtained.
- In the fourth generation process, by designating a plurality of wavelengths and generating a plurality of images, it is possible to generate an image of a wavelength suited to the subject to be detected.
- According to the fourth generation process, it is therefore possible to extract the feature region more appropriately and to calculate the feature amounts more easily. A reference image can then be generated from the feature amounts calculated by extracting such appropriate feature regions. In addition, since a reference image having a large feature amount can be generated, the stitch processing of the reference images, for example the detection of regions where the feature regions match and the arrangement of the reference images based on that detection, can be performed more accurately.
- The fifth generation process is a process combining any one of the first to fourth generation processes with ortho processing (orthorectification).
- Because an aerial image is a projection, the terrain may be photographed in a distorted shape owing to differences in elevation, and the positional relationships in the image may not coincide with the actual topography.
- Such distortion is corrected by ortho processing (also referred to as ortho correction).
- The image after the ortho processing is an image in which the unevenness of the terrain and the tilt of the photograph have been corrected, and can be handled, for example, in the same manner as a topographic map.
- Aerial images are captured so that adjacent images overlap one another, and the overlap rate may be high, for example 80% to 90%.
- The process of recognizing a plurality of such parallax images as a 3D structure and converting them into an image viewed from directly above by using this overlap is the ortho processing.
- Position information obtained from a GPS (Global Positioning System), attitude information from an IMU (Inertial Measurement Unit), and the like are used to perform the ortho processing.
- FIG. 35 is a flowchart for explaining the fifth generation process of the final inspection image.
- The process based on the flowchart shown in FIG. 35 corresponds to the case where the ortho processing is added to the fourth generation process, and this case is described here as an example.
- However, the ortho processing may be included in any of the first to third generation processes, and which generation process it is combined with can be set as appropriate according to the situation in which the image processing system 50 is used.
- step S401 position information and posture information are extracted.
- A GPS receiver and an IMU are mounted on the imaging device 11 or on the mobile measuring device 201 to which the imaging device 11 is attached, and position information and attitude information are recorded each time imaging is performed by the imaging device 11.
- The captured image (multispectral image) and the position information and attitude information acquired when the image was captured are associated with each other and managed so that either can be read out from the other.
- For example, the multispectral images, the position information, and the attitude information are stored in association with each other in a storage (not shown).
- The storage can be provided in the imaging device 11 or in the mobile measuring device 201.
- step S401 In step S401, for example, the position information and attitude information associated with the multispectral images to be processed are read out from the storage.
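- A minimal sketch of such an association is shown below; the record fields and the helper find_record are illustrative assumptions, not the actual storage format.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CaptureRecord:
    """One capture: the multispectral image together with the position and
    attitude information recorded at the moment of imaging."""
    image_id: str
    ms_image: np.ndarray   # stack of wavelength images
    latitude: float
    longitude: float
    altitude: float
    roll: float
    pitch: float
    yaw: float

def find_record(records, image_id):
    """Read out the position/attitude information associated with an image."""
    return next(r for r in records if r.image_id == image_id)
```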
- step S402 the wavelength to be extracted is designated.
- The processes of steps S402 to S406 are performed in the same manner as steps S301 to S305 (fourth generation process) of the flowchart shown in FIG. 34, so their description is omitted here. In other words, the processes of steps S402 to S406 are performed in the same manner as the processes of steps S11 and S12 of the flowchart shown in FIG. 22.
- step S406 when the reference image generation processing ends and a reference image is generated, the processing proceeds to step S407.
- In step S407, the reference image is subjected to ortho processing.
- The ortho processing can be performed by a generally used method, for example by applying ortho conversion to an aerial image.
- The image subjected to the ortho processing (ortho image) is an image in which the positional displacement of objects on the image has been eliminated and the captured aerial image has been converted so that, as with a map, it is displayed at the correct size and position, without tilt, as seen from directly above.
- step S407 In step S407, the reference image is thus subjected to ortho processing to generate an ortho image of the reference image.
- The position information and attitude information read out in step S401 are used in this ortho processing.
- step S408 In step S408, stitch processing is performed using the ortho images of the reference images. This process is performed in the same manner as the process of step S306 in FIG. 34 (step S13 in FIG. 22), the difference being that the stitch processing is performed on the orthorectified reference images.
- step S408 when the stitched reference image, that is, the stitching reference map 303 (stitch reference information 303) is generated, the process proceeds to step S409.
- step S409 The process of generating the inspection image in step S409 is performed in the same manner as the process of step S307 in FIG. 34 (step S14 in FIG. 22), and thus the description thereof is omitted here.
- step S410 In step S410, ortho processing is performed on the generated inspection images.
- The ortho processing of the inspection images can be performed in the same manner as the ortho processing of the reference images described above.
- In the ortho processing of the inspection images, the orthorectification information obtained when the reference images were orthorectified can be used.
- step S411 stitch processing is performed on the inspection image.
- The stitch processing of the inspection images in step S411 is performed in the same manner as the process of step S308 in FIG. 34 (step S15 in FIG. 22), except that the inspection images are ortho images, so its description is omitted here.
- The final inspection image generated in this way can also be an orthorectified inspection image.
- In addition, the orthorectification information used when the reference images were orthorectified (step S407) can be reused, so the processing related to orthorectification can be performed easily.
- The generated final inspection image can be mapped onto a map. Since the final inspection image is an ortho image, the shapes of the imaged subjects are correct and their positions are correctly arranged, so it can easily be used superimposed on other maps.
- Next, as a sixth generation process, the operation of an image processing system including processing that refers to the illuminance value from the illuminance sensor 12 will be described.
- The image processing system 51 shown in FIG. 36 has the same basic configuration as the image processing system 50 shown in FIG. 3, but additionally includes an illuminance meter 601, a reflectance image generation unit 602, a reflectance image stitch processing unit 603, and a specific index image calculation unit 604.
- Although the illuminance meter 601 is included in the image processing system 51 in FIG. 36, it corresponds to the illuminance sensor 12 (FIG. 1); it measures the illuminance value and supplies it to the inspection wavelength image extraction unit 106.
- The illuminance meter 601 acquires, for example, the spectral characteristics of the light source as shown in FIG. 19, like the illuminance sensor 12 described above.
- the inspection wavelength image extraction unit 106 generates an inspection image as in the above-described embodiment.
- the inspection image from the inspection wavelength image extraction unit 106 is supplied to the reflectance image generation unit 602.
- the reflectance image generation unit 602 generates a reflectance image by converting the supplied inspection image into an image based on the reflectance, and supplies the reflectance image to the reflectance image stitch processing unit 603.
- the reflectance image stitch processing unit 603 is a processing unit corresponding to the inspection image stitch processing unit 108 of FIG. 3, and the image to be subjected to the stitch processing is the reflectance image from the reflectance image generation unit 602, Stitching is performed on the reflectance image.
- the image generated by the reflectance image stitch processing unit 603 is described as a final reflectance image here.
- The specific index image calculation unit 604 calculates an image relating to a specific index, for example the vegetation index (NDVI: Normalized Difference Vegetation Index), using the final reflectance image.
- When the image processing system 51 is a system for inspecting vegetation, the NDVI (vegetation index) can be used as the specific index, and in that case an image of a red wavelength and an image of an infrared wavelength are used as the inspection images.
- NDVI is an index showing the distribution and activity of vegetation. The description is continued here taking NDVI as an example, but the present technology can also be applied when other indices are used.
- NDVI value = (Dp(IR) - Dp(R)) / (Dp(IR) + Dp(R)) ... (3)
- In Expression (3), Dp(IR) represents the reflectance in the infrared region, and Dp(R) represents the reflectance of red (R) in the visible region.
- The reflectance can be obtained by (inspection image / light source information).
- That is, the reflectance image generation unit 602 obtains the reflectance by dividing the inspection image from the inspection wavelength image extraction unit 106 by the light source information from the illuminance meter 601, and generates a reflectance image.
- the illuminance value from the illuminance meter 601 may be directly supplied to the reflectance image generation unit 602.
- the reflectance image generated by the reflectance image generation unit 602 is supplied to the reflectance image stitch processing unit 603.
- the reflectance image stitch processing unit 603 performs a stitch process on the reflectance image, generates a final reflectance image, and supplies the final reflectance image to the specific index image calculation unit 604.
- The specific index image calculation unit 604 generates an image based on the NDVI value by performing the operation based on Expression (3) described above, using the stitched final reflectance image of the infrared region and the red reflectance image of the visible region.
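- The reflectance and NDVI calculations follow directly from the relations above; the sketch below assumes the light source information and the reflectance images are given as arrays aligned with the inspection images.

```python
import numpy as np

def reflectance_image(inspection_image, light_source):
    """Reflectance = inspection image / light source information."""
    return inspection_image / np.maximum(light_source, 1e-6)  # avoid divide-by-zero

def ndvi_image(refl_ir, refl_red):
    """NDVI = (Dp(IR) - Dp(R)) / (Dp(IR) + Dp(R)), Expression (3)."""
    denom = refl_ir + refl_red
    return np.where(denom != 0, (refl_ir - refl_red) / denom, 0.0)
```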
- step S501 light source information is extracted.
- As the light source information, the illuminance value measured when the multispectral images to be processed were captured is extracted.
- step S502 the wavelength to be extracted is designated.
- the processes of steps S502 to S508 are performed in the same manner as the processes of steps S301 to S307 of the flowchart illustrated in FIG. 34, and thus the description thereof is omitted here.
- step S509 a reflectance image is calculated.
- the reflectance image generation unit 602 calculates reflectance by calculating (inspection image / light source information), and generates a reflectance image using the reflectance.
- step S510 the reflectance image stitch processing unit 603 performs stitch processing using the reflectance image generated by the reflectance image generation unit 602.
- the stitch process in step S510 can be performed, for example, in the same manner as the stitch process in step S15 of FIG. 22, and thus the description thereof is omitted here.
- step S511 the specific index image calculation unit 604 generates a specific index image, for example, a vegetation index image.
- When the specific index image calculation unit 604 generates the vegetation index image, it acquires the reflectance image of the infrared region and the red reflectance image of the visible region in order to perform the calculation based on Expression (3) described above. The specific index image calculation unit 604 then generates the vegetation index image by substituting the acquired infrared-region reflectance image and visible-region red reflectance image into Expression (3).
- the present technology can also be applied to the case of generating an image of a specific index, such as a vegetation index image.
- In the sixth generation process as well, the stitching reference map 303 (stitch reference information 303) for stitching is generated, and the specific index image is generated based on it. Therefore, even when an image of a different specific index is generated, the specific index image can be generated easily based on the stitching reference map 303 (stitch reference information 303).
- As described above, according to the present technology, the reference images are generated and stitch-processed to generate the stitching reference map 303 (stitch reference information 303).
- the MS sensor 103 can acquire images (multispectral images) of a plurality of wavelengths.
- The multispectral images can also be used to generate images of yet other wavelengths.
- Therefore, the image generation unit 62 can generate an image of an arbitrary wavelength, while the amount of data supplied from the imaging unit 61 to the image generation unit 62 can be kept small, and the processing in the image generation unit 62, the first stitch processing unit 63, and the second stitch processing unit 64 can be reduced.
- Furthermore, the stitching reference map 303 (stitch reference information 303) is generated once, and images of the desired wavelengths are stitch-processed based on it. Therefore, even when images of a plurality of wavelengths (a plurality of specific index images) are generated, there is no need to perform the stitch processing from the beginning for each of them; the processing can be performed based on the stitching reference map 303 (stitch reference information 303), so the processing can be significantly reduced.
- the amount of data handled in the image processing system 50 (51) can be reduced, and the processing load can be reduced, so the processing time in the image processing system 50 (51) can also be reduced.
- the above-described series of processes may be performed by hardware or software.
- a program that configures the software is installed on a computer.
- Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 38 is a block diagram showing an example of a hardware configuration of a computer that executes the series of processes described above according to a program.
- a central processing unit (CPU) 1001, a read only memory (ROM) 1002, and a random access memory (RAM) 1003 are mutually connected by a bus 1004.
- An input / output interface 1005 is further connected to the bus 1004.
- An input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010 are connected to the input / output interface 1005.
- the input unit 1006 includes a keyboard, a mouse, a microphone and the like.
- the output unit 1007 includes a display, a speaker, and the like.
- the storage unit 1008 includes a hard disk, a non-volatile memory, and the like.
- the communication unit 1009 includes a network interface or the like.
- the drive 1010 drives removable media 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer configured as described above, the CPU 1001 loads the program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes it, whereby the series of processes described above is performed.
- the program executed by the computer (CPU 1001) can be provided by being recorded on, for example, a removable medium 1011 as a package medium or the like. Also, the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 1008 via the input / output interface 1005 by mounting the removable media 1011 in the drive 1010.
- the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008.
- the program can be installed in advance in the ROM 1002 or the storage unit 1008.
- The program executed by the computer may be a program in which the processing is performed in chronological order according to the order described in this specification, or a program in which the processing is performed in parallel or at necessary timing, such as when a call is made.
- system represents the entire apparatus configured by a plurality of apparatuses.
- the present technology can also have the following configurations.
- (1) An image processing apparatus including: an image generation unit that generates a first reference image relating to a first imaging region based on a plurality of first images relating to the first imaging region, and generates a second reference image relating to a second imaging region, at least a part of which overlaps the first imaging region, based on a plurality of second images relating to the second imaging region; and a processing unit that generates positioning information indicating a correspondence between the first imaging region and the second imaging region based on the first reference image and the second reference image.
- (2) The image processing apparatus according to (1), in which the processing unit generates a reference composite image including the positioning information by stitch-processing the first reference image and the second reference image.
- (3) The image processing apparatus according to (1) or (2), in which the processing unit acquires a composite image by stitch-processing one of the plurality of first images and one of the plurality of second images based on the positioning information.
- (4) The image processing apparatus according to (1) or (2), in which the processing unit acquires the composite image by setting, based on the positioning information, a region corresponding to the first imaging region to an image based on the plurality of first images and setting a region corresponding to the second imaging region to an image based on the plurality of second images.
- (5) The image processing apparatus according to any one of (1) to (4), in which the plurality of first images are images based on mutually different wavelengths, or the plurality of second images are images based on mutually different wavelengths.
- (6) The image processing apparatus according to any one of (1) to (5), in which the image generation unit preferentially acquires an image having a large feature amount from the plurality of first images as the first reference image, and preferentially acquires an image having a large feature amount from the plurality of second images as the second reference image.
- (7) The image processing apparatus according to any one of (1) to (5), in which the image generation unit acquires the image with the largest feature amount among the plurality of first images as the first reference image, and acquires the image with the largest feature amount among the plurality of second images as the second reference image.
- (8) The image processing apparatus according to any one of (1) to (7), in which the image generation unit generates a first reduced image by reducing the first image and generates the first reference image with the first reduced image also as a target, and generates a second reduced image by reducing the second image and generates the second reference image with the second reduced image also as a target.
- (9) The image processing apparatus according to (8), in which, when the first reduced image is set as the first image, the original first image of the first reduced image is set as the first reference image, and when the second reduced image is set as the second image, the original second image of the second reduced image is set as the second reference image.
- (10) The image processing apparatus according to any one of (1) to (7), in which the image generation unit divides the first image into blocks, calculates a feature amount for each block, and generates the first reference image from the blocks having large feature amounts, and divides the second image into blocks, calculates a feature amount for each block, and generates the second reference image from the blocks having large feature amounts.
- (11) The image processing apparatus according to (10), in which the image generation unit generates a first reduced image by reducing the first image, divides the first reduced image also into blocks, and generates the first reference image with the blocks of the first reduced image also as targets, and generates a second reduced image by reducing the second image, divides the second reduced image also into blocks, and generates the second reference image with the blocks of the second reduced image also as targets.
- (12) The image processing apparatus according to (11), in which, when a block of the first reduced image is a block having the large feature amount, the corresponding block of the original first image of the first reduced image is set as a part of the first reference image, and when a block of the second reduced image is a block having the large feature amount, the corresponding block of the original second image of the second reduced image is set as a part of the second reference image.
- (13) The image processing apparatus according to any one of (1) to (12), in which a third image of a predetermined wavelength is generated using the plurality of first images and the image generation unit generates the first reference image from the third image, and a fourth image of a predetermined wavelength is generated using the plurality of second images and the image generation unit generates the second reference image from the fourth image.
- (14) The image processing apparatus according to any one of (1) to (13), in which the first reference image and the second reference image are each subjected to ortho processing, and the processing unit generates the positioning information based on the orthorectified first reference image and second reference image.
- (15) An image processing method including the steps of: generating a first reference image relating to a first imaging region based on a plurality of first images relating to the first imaging region, and generating a second reference image relating to a second imaging region, at least a part of which overlaps the first imaging region, based on a plurality of second images relating to the second imaging region; and generating positioning information indicating a correspondence between the first imaging region and the second imaging region based on the first reference image and the second reference image.
- (16) A program for causing a computer to execute processing including the steps of: generating a first reference image relating to a first imaging region based on a plurality of first images relating to the first imaging region, and generating a second reference image relating to a second imaging region, at least a part of which overlaps the first imaging region, based on a plurality of second images relating to the second imaging region; and generating positioning information indicating a correspondence between the first imaging region and the second imaging region based on the first reference image and the second reference image.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
FIG. 1 is a diagram showing the configuration of an embodiment of an image processing system including an image processing apparatus to which the present technology is applied.
The operation of the image processing system shown in FIGS. 1 to 3 will be described with reference to FIGS. 11 and 12.
Here, the generation of an image X of a wavelength (referred to as wavelength X) other than the multispectral images generated from the signals of the MS sensor 103 will be described. An image of a desired wavelength X can be generated by performing an inverse matrix operation using the multispectral images obtained from the MS sensor 103. To explain this, the outline of the operation of the imaging device 11 will first be described again with reference to FIG. 18.
(Spectral characteristic L(λ) of the light source) × (Spectral characteristic P(λ) of the subject) × (Spectral characteristic S(λ) of the imaging system) = (Image O(λ)) ... (1)
The operation of the image processing system 50 (FIG. 3) will be described with reference to the flowchart in FIG. 22. The first to sixth generation processes are described in turn below; each generation process assumes that the multispectral images have been acquired by the imaging unit 61 (the lens 101, the exposure unit 102, and the MS sensor 103), that is, that the images 301 of wavelengths A to H have been acquired, and the description of the processing at the time of aerial imaging is omitted.
Another operation of the image processing system 50 (FIG. 3) will be described. As processing up to the generation of the final inspection image, the first generation process described above showed an example in which, when the feature amount calculation process is performed in step S11, the feature amounts are calculated using the multispectral images acquired by the MS sensor 103 as they are.
Still another operation of the image processing system 50 (FIG. 3) will be described. As processing up to the generation of the final inspection image, the first and second generation processes described above showed examples in which, when the feature amount calculation process is performed in step S11, the feature amounts are calculated using the multispectral images acquired by the MS sensor 103 as they are or after generating reduced images.
Still another operation of the image processing system 50 (FIG. 3) will be described. As processing up to the generation of the final inspection image, the first to third generation processes described above showed examples in which, when the feature amount calculation process is performed in step S11, the feature amounts are calculated by using the multispectral images acquired by the MS sensor 103 as they are, by generating reduced images, or by dividing the images into blocks.
Still another operation of the image processing system 50 (FIG. 3) will be described.
Still another configuration and operation of the image processing system 50 (FIG. 3) will be described.
NDVI value = (Dp(IR) - Dp(R)) / (Dp(IR) + Dp(R)) ... (3)
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
Claims (16)
- 1. An image processing apparatus comprising: an image generation unit that generates a first reference image relating to a first imaging region based on a plurality of first images relating to the first imaging region, and generates a second reference image relating to a second imaging region, at least a part of which overlaps the first imaging region, based on a plurality of second images relating to the second imaging region; and a processing unit that generates positioning information indicating a correspondence between the first imaging region and the second imaging region based on the first reference image and the second reference image.
- 2. The image processing apparatus according to claim 1, wherein the processing unit generates a reference composite image including the positioning information by stitch-processing the first reference image and the second reference image.
- 3. The image processing apparatus according to claim 1, wherein the processing unit acquires a composite image by stitch-processing one of the plurality of first images and one of the plurality of second images based on the positioning information.
- 4. The image processing apparatus according to claim 1, wherein the processing unit acquires the composite image by setting, based on the positioning information, a region corresponding to the first imaging region to an image based on the plurality of first images and setting a region corresponding to the second imaging region to an image based on the plurality of second images.
- 5. The image processing apparatus according to claim 1, wherein the plurality of first images are images based on mutually different wavelengths, or the plurality of second images are images based on mutually different wavelengths.
- 6. The image processing apparatus according to claim 1, wherein the image generation unit preferentially acquires an image having a large feature amount from the plurality of first images as the first reference image, and preferentially acquires an image having a large feature amount from the plurality of second images as the second reference image.
- 7. The image processing apparatus according to claim 1, wherein the image generation unit acquires the image with the largest feature amount among the plurality of first images as the first reference image, and acquires the image with the largest feature amount among the plurality of second images as the second reference image.
- 8. The image processing apparatus according to claim 1, wherein the image generation unit generates a first reduced image by reducing the first image and generates the first reference image with the first reduced image also as a target, and generates a second reduced image by reducing the second image and generates the second reference image with the second reduced image also as a target.
- 9. The image processing apparatus according to claim 8, wherein, when the first reduced image is set as the first image, the original first image of the first reduced image is set as the first reference image, and when the second reduced image is set as the second image, the original second image of the second reduced image is set as the second reference image.
- 10. The image processing apparatus according to claim 1, wherein the image generation unit divides the first image into blocks, calculates a feature amount for each block, and generates the first reference image from the blocks having large feature amounts, and divides the second image into blocks, calculates a feature amount for each block, and generates the second reference image from the blocks having large feature amounts.
- 11. The image processing apparatus according to claim 10, wherein the image generation unit generates a first reduced image by reducing the first image, divides the first reduced image also into blocks, and generates the first reference image with the blocks of the first reduced image also as targets, and generates a second reduced image by reducing the second image, divides the second reduced image also into blocks, and generates the second reference image with the blocks of the second reduced image also as targets.
- 12. The image processing apparatus according to claim 11, wherein, when a block of the first reduced image is a block having the large feature amount, the corresponding block of the original first image of the first reduced image is set as a part of the first reference image, and when a block of the second reduced image is a block having the large feature amount, the corresponding block of the original second image of the second reduced image is set as a part of the second reference image.
- 13. The image processing apparatus according to claim 1, wherein a third image of a predetermined wavelength is generated using the plurality of first images, the image generation unit generates the first reference image from the third image, a fourth image of a predetermined wavelength is generated using the plurality of second images, and the image generation unit generates the second reference image from the fourth image.
- 14. The image processing apparatus according to claim 1, wherein the first reference image and the second reference image are each subjected to ortho processing, and the processing unit generates the positioning information based on the orthorectified first reference image and second reference image.
- 15. An image processing method comprising the steps of: generating a first reference image relating to a first imaging region based on a plurality of first images relating to the first imaging region, and generating a second reference image relating to a second imaging region, at least a part of which overlaps the first imaging region, based on a plurality of second images relating to the second imaging region; and generating positioning information indicating a correspondence between the first imaging region and the second imaging region based on the first reference image and the second reference image.
- 16. A program for causing a computer to execute processing comprising the steps of: generating a first reference image relating to a first imaging region based on a plurality of first images relating to the first imaging region, and generating a second reference image relating to a second imaging region, at least a part of which overlaps the first imaging region, based on a plurality of second images relating to the second imaging region; and generating positioning information indicating a correspondence between the first imaging region and the second imaging region based on the first reference image and the second reference image.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019534026A JP7160037B2 (ja) | 2017-08-01 | 2018-07-18 | Image processing apparatus, image processing method, and program |
BR112020001562-8A BR112020001562A2 (pt) | 2017-08-01 | 2018-07-18 | Image processing apparatus and method, and program |
AU2018309905A AU2018309905A1 (en) | 2017-08-01 | 2018-07-18 | Image processing device, image processing method, and program |
EP18841076.5A EP3664029B1 (en) | 2017-08-01 | 2018-07-18 | Image processing device, image processing method, and program |
US16/633,661 US11328388B2 (en) | 2017-08-01 | 2018-07-18 | Image processing apparatus, image processing method, and program |
CN201880049241.1A CN110998657B (zh) | 2017-08-01 | 2018-07-18 | Image processing device, image processing method, and program |
US17/694,429 US11769225B2 (en) | 2017-08-01 | 2022-03-14 | Image processing apparatus, image processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017148858 | 2017-08-01 | ||
JP2017-148858 | 2017-08-01 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/633,661 A-371-Of-International US11328388B2 (en) | 2017-08-01 | 2018-07-18 | Image processing apparatus, image processing method, and program |
US17/694,429 Continuation US11769225B2 (en) | 2017-08-01 | 2022-03-14 | Image processing apparatus, image processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019026619A1 true WO2019026619A1 (ja) | 2019-02-07 |
Family
ID=65232810
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/026826 WO2019026619A1 (ja) | 2017-08-01 | 2018-07-18 | Image processing apparatus, image processing method, and program |
Country Status (7)
Country | Link |
---|---|
US (2) | US11328388B2 (ja) |
EP (1) | EP3664029B1 (ja) |
JP (1) | JP7160037B2 (ja) |
CN (1) | CN110998657B (ja) |
AU (1) | AU2018309905A1 (ja) |
BR (1) | BR112020001562A2 (ja) |
WO (1) | WO2019026619A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020179276A1 (ja) * | 2019-03-01 | 2020-09-10 | Sony Corporation | Image processing device, image processing method, and program |
WO2022102295A1 (ja) * | 2020-11-10 | 2022-05-19 | Sony Group Corporation | Imaging device |
WO2022190826A1 (ja) * | 2021-03-09 | 2022-09-15 | Sony Semiconductor Solutions Corporation | Imaging device and electronic apparatus |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7171505B2 (ja) | 2019-04-24 | 2022-11-15 | Canon Inc. | Toner |
IT202200006677A1 (it) * | 2022-04-05 | 2023-10-05 | Turf Europe Srl | Device for acquiring, processing and transmitting digital images in at least 5 coaxial multispectral bands with a wide field of view, for agro-monitoring of agricultural and green surfaces |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012163482A (ja) | 2011-02-08 | 2012-08-30 | System Instruments Kk | Light quantum meter |
JP2015532714A (ja) * | 2012-08-21 | 2015-11-12 | Visual Intelligence, LP | Infrastructure mapping system and methods |
JP2016208306A (ja) * | 2015-04-23 | 2016-12-08 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, imaging system including the same, and image processing method |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5589446B2 (ja) * | 2009-09-18 | 2014-09-17 | Sony Corporation | Image processing device, imaging device, image processing method, and program |
US8600194B2 (en) * | 2011-05-17 | 2013-12-03 | Apple Inc. | Positional sensor-assisted image registration for panoramic photography |
US8811764B1 (en) * | 2012-10-25 | 2014-08-19 | Google Inc. | System and method for scene dependent multi-band blending |
US9813601B2 (en) * | 2014-05-06 | 2017-11-07 | Urugus S.A. | Imaging device for scenes in apparent motion |
GB201412061D0 (en) * | 2014-07-07 | 2014-08-20 | Vito Nv | Method and system for geometric referencing of multi-spectral data |
CN106464811B (zh) * | 2015-03-10 | 2021-03-26 | SZ DJI Technology Co., Ltd. | System and method for adaptive panoramic image generation |
KR20160115466A (ko) * | 2015-03-27 | 2016-10-06 | Electronics and Telecommunications Research Institute | Apparatus for stitching panoramic video and stitching method therefor |
JP6453694B2 (ja) * | 2015-03-31 | 2019-01-16 | Morpho, Inc. | Image composition device, image composition method, image composition program, and recording medium |
JP6585006B2 (ja) * | 2016-06-07 | 2019-10-02 | Toshiba Corporation | Imaging device and vehicle |
2018
- 2018-07-18 CN CN201880049241.1A patent/CN110998657B/zh active Active
- 2018-07-18 EP EP18841076.5A patent/EP3664029B1/en active Active
- 2018-07-18 JP JP2019534026A patent/JP7160037B2/ja active Active
- 2018-07-18 WO PCT/JP2018/026826 patent/WO2019026619A1/ja unknown
- 2018-07-18 BR BR112020001562-8A patent/BR112020001562A2/pt not_active IP Right Cessation
- 2018-07-18 AU AU2018309905A patent/AU2018309905A1/en not_active Abandoned
- 2018-07-18 US US16/633,661 patent/US11328388B2/en active Active
2022
- 2022-03-14 US US17/694,429 patent/US11769225B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP7160037B2 (ja) | 2022-10-25 |
CN110998657B (zh) | 2023-12-12 |
US11769225B2 (en) | 2023-09-26 |
CN110998657A (zh) | 2020-04-10 |
AU2018309905A1 (en) | 2020-01-23 |
BR112020001562A2 (pt) | 2020-07-21 |
US20220277419A1 (en) | 2022-09-01 |
US20200211158A1 (en) | 2020-07-02 |
EP3664029B1 (en) | 2023-12-13 |
EP3664029A1 (en) | 2020-06-10 |
US11328388B2 (en) | 2022-05-10 |
EP3664029A4 (en) | 2020-08-12 |
JPWO2019026619A1 (ja) | 2020-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
O’Connor et al. | Cameras and settings for aerial surveys in the geosciences: Optimising image data | |
JP7160037B2 (ja) | Image processing apparatus, image processing method, and program | |
Torres-Sánchez et al. | Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards | |
Roth et al. | PhenoFly Planning Tool: flight planning for high-resolution optical remote sensing with unmanned areal systems | |
JP6192853B2 (ja) | Optical flow imaging system and method using ultrasound depth detection | |
Kedzierski et al. | Radiometric quality assessment of images acquired by UAV’s in various lighting and weather conditions | |
US20150022656A1 (en) | System for collecting & processing aerial imagery with enhanced 3d & nir imaging capability | |
Rijsdijk et al. | Unmanned aerial systems in the process of juridical verification of cadastral border | |
CN108414454A (zh) | System and method for synchronous measurement of three-dimensional structure and spectral information of plants | |
Kedzierski et al. | Methodology of improvement of radiometric quality of images acquired from low altitudes | |
JP2019503484A (ja) | Method and system for geometric referencing of multispectral data | |
Wierzbicki et al. | Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle | |
Ahamed et al. | Tower remote-sensing system for monitoring energy crops; image acquisition and geometric corrections | |
Ramirez‐Paredes et al. | Low‐altitude Terrestrial Spectroscopy from a Pushbroom Sensor | |
Szabó et al. | Zooming on aerial survey | |
Ribas Costa et al. | Uncrewed aircraft system spherical photography for the vertical characterization of canopy structural traits | |
O'Connor | Impact of image quality on SfM Photogrammetry: colour, compression and noise | |
CN112154484A (zh) | Orthoimage generation method, system, and storage medium | |
Domozi et al. | Surveying private pools in suburban areas with neural network based on drone photos | |
Altena et al. | Assessing UAV platform types and optical sensor specifications | |
Jonassen et al. | Bundle Adjustment of Aerial Linear Pushbroom Hyperspectral Images with Sub-Pixel Accuracy | |
Miraki et al. | Using canopy height model derived from UAV imagery as an auxiliary for spectral data to estimate the canopy cover of mixed broadleaf forests | |
Haavardsholm et al. | Multimodal Multispectral Imaging System for Small UAVs | |
Doumit | The effect of Neutral Density Filters on drones orthomosaics classifications for land-use mapping | |
Gonçalves | Using structure-from-motion workflows for 3D mapping and remote sensing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18841076; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2019534026; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2018309905; Country of ref document: AU; Date of ref document: 20180718; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112020001562; Country of ref document: BR |
| ENP | Entry into the national phase | Ref document number: 2018841076; Country of ref document: EP; Effective date: 20200302 |
| ENP | Entry into the national phase | Ref document number: 112020001562; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20200124 |