WO2016171007A1 - Inspection apparatus, inspection method, and program - Google Patents
Inspection apparatus, inspection method, and program
- Publication number
- WO2016171007A1 (PCT/JP2016/061521)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- inspection
- angle
- sensing
- imaging
- sensing data
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/31—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
- G01N21/35—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light
- G01N21/3563—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light for analysing solids; Preparation of samples therefor
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/0098—Plants or trees
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N2021/8466—Investigation of vegetal material, e.g. leaves, plants, fruits
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01W—METEOROLOGY
- G01W1/00—Meteorology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10036—Multispectral image; Hyperspectral image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30188—Vegetation; Agriculture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present disclosure relates to an inspection apparatus, an inspection method, and a program, and more particularly, to an inspection apparatus, an inspection method, and a program that can obtain a more accurate inspection result.
- Conventionally, an inspection apparatus that inspects vegetation, such as the state and activity of plants growing in a given place, is known (see, for example, Patent Document 1).
- the present disclosure has been made in view of such a situation, and makes it possible to obtain more accurate inspection results.
- The inspection apparatus according to one aspect of the present disclosure includes a calculation processing unit that calculates an inspection value for inspecting an inspection object using sets of sensing data whose components dependent on an angle related to the inspection are opposite to each other, so that the angle-dependent component contained in the sensing data obtained by sensing the inspection object is reduced.
- The inspection method according to one aspect of the present disclosure includes calculating an inspection value for inspecting an inspection object using sets of sensing data whose components dependent on an angle related to the inspection are opposite to each other, so that the angle-dependent component contained in the sensing data obtained by sensing the inspection object is reduced.
- The program according to one aspect of the present disclosure causes a computer to function as a calculation processing unit that calculates an inspection value for inspecting an inspection object using sets of sensing data whose angle-dependent components are opposite to each other, so that the component dependent on an angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- In one aspect of the present disclosure, an inspection value for inspecting an inspection object is calculated using sets of sensing data whose angle-dependent components are opposite to each other, so that the component dependent on an angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- FIG. 18 is a block diagram illustrating a configuration example of an embodiment of a computer to which the present technology is applied.
- the vegetation inspection system 11 includes two imaging devices 12-1 and 12-2 and a vegetation inspection device 13.
- the vegetation inspection system 11 inspects the vegetation of plants using, for example, various plants such as turf, rice, and sugarcane grown in the field 14 as inspection objects.
- When the vegetation inspection system 11 inspects the vegetation of the plants, light from the sun either irradiates the field 14 directly or is blocked by clouds.
- the imaging devices 12-1 and 12-2 are fixed to the field 14 according to predetermined arrangement conditions, and communicate with the vegetation inspection device 13 via a communication network constructed by wire or wireless. Then, the imaging devices 12-1 and 12-2 respectively capture the field 14 in accordance with the control by the vegetation inspection device 13, and transmit an image of the field 14 obtained as a result to the vegetation inspection device 13.
- The imaging devices 12-1 and 12-2 are fixed at the same distance from the center of the field 14 on a straight line passing through the center of the field 14, at the same height above the field 14, and at the same elevation angle toward the center of the field 14.
- That is, the imaging devices 12-1 and 12-2 face in directions opposite to each other, and the angles formed between their respective optical axes and the vertical line at the center of the flat field 14 are fixed to be equal.
- the vegetation inspection device 13 controls the timing at which the imaging devices 12-1 and 12-2 image the field 14. Then, the vegetation inspection apparatus 13 obtains a vegetation index for inspecting the vegetation of the plant grown in the field 14 based on the image of the field 14 captured by the imaging apparatuses 12-1 and 12-2. The detailed configuration of the vegetation inspection apparatus 13 will be described later with reference to FIG.
- With the vegetation inspection system 11 configured as described above, a vegetation index can be obtained in which the influence dependent on the direction in which the plants in the field 14 grow, the direction in which the imaging devices 12-1 and 12-2 image the field 14, and the direction of the light irradiating the field 14 from the sun is eliminated (reduced).
- For turf, the direction in which the grass blades lie (hereinafter referred to as the lawn grain) changes greatly with mowing, so it is difficult to always inspect the vegetation under the same conditions. Therefore, in the following, turf cultivated in a field 14 such as a soccer stadium will be described as the inspection target of the vegetation inspection system 11.
- In the vegetation inspection system 11, various plants, including the above-described rice and sugarcane, can be targeted for inspection.
- The hatching of the field 14 in FIG. 1 represents the lawn grain of the turf grown in the field 14. In the vegetation inspection system 11, the turf is grown so that the lawn grain differs for each predetermined area, and the vegetation of the turf can be inspected with the effect of the lawn-grain angle eliminated.
- Hereinafter, when there is no need to distinguish them, the imaging devices 12-1 and 12-2 are simply referred to as the imaging device 12.
- Next, the definitions of the angle of the lawn grain in the field 14, the angle of the imaging direction of the imaging device 12, and the angle of the direction of the light irradiating the field 14 will be described.
- The lawn-grain angle P in the field 14 is defined by the azimuth angle of the lawn grain with respect to a predetermined reference azimuth and the elevation angle of the lawn grain with respect to the vertically upward direction. That is, when the azimuth angle of the lawn grain is a and its elevation angle is b, it is written as the lawn-grain angle P(a, b). Note that when the turf points vertically upward, the lawn grain is at the angle P(0, 0).
- The angle C of the imaging direction of the imaging device 12 is defined by the azimuth angle of the imaging direction with respect to a predetermined reference azimuth and the elevation angle of the imaging direction with respect to the vertically downward direction. That is, when the azimuth angle of the imaging direction is c and its elevation angle is d, it is written as the imaging-direction angle C(c, d). Note that when the imaging direction points vertically downward, the imaging direction is at the angle C(0, 0).
- The angle L of the direction in which light irradiates the field 14 is defined by the azimuth angle of the irradiation direction with respect to a predetermined reference azimuth and the elevation angle of the irradiation direction with respect to the vertically downward direction. That is, when the azimuth angle of the irradiation direction is e and its elevation angle is f, it is written as the irradiation-direction angle L(e, f). When the irradiation direction points vertically downward, the irradiation direction is at the angle L(0, 0).
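As an illustration, the three angle definitions above can be modeled as simple (azimuth, elevation) pairs. The sketch below is hypothetical (the names `Angle` and `mirrored` do not appear in the patent) and only shows how a pair of measurements with opposite azimuths, the key ingredient of the inspection conditions described next, can be identified.

```python
from typing import NamedTuple

class Angle(NamedTuple):
    """(azimuth, elevation) pair such as the lawn-grain angle P(a, b),
    the imaging-direction angle C(c, d), or the irradiation-direction
    angle L(e, f)."""
    azimuth: float
    elevation: float

def mirrored(x: Angle, y: Angle) -> bool:
    """True when two angles are symmetric with respect to the vertical
    direction: opposite azimuths and equal elevations."""
    return x.azimuth == -y.azimuth and x.elevation == y.elevation

# A pair such as C(c, d) and C(-c, d) cancels the azimuth dependence.
c1, c2 = Angle(30.0, 45.0), Angle(-30.0, 45.0)
print(mirrored(c1, c2))  # -> True
```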
- The vegetation inspection system 11 performs vegetation inspection using images captured under inspection conditions in which the components (for example, the azimuth angles) dependent on the angles related to the inspection are opposite to each other.
- Under the first inspection condition, by performing imaging at the timing when the irradiation direction is the angle L(e, f) and at the timing when it is the angle L(−e, f), the influence of the angle dependence can be eliminated. That is, under the first inspection condition, the turf points vertically upward, and imaging vertically downward from directly above is performed twice, at timings when the light irradiation directions are symmetric with respect to the vertical direction (with opposite azimuths).
- Under the second inspection condition, by performing imaging from the imaging-direction angle C(c, d) and from the angle C(−c, d), the influence of the angle dependence can be eliminated. That is, under the second inspection condition, the turf points vertically upward, and imaging is performed from two directions at angles symmetric with respect to the vertical direction, at a timing when light irradiates vertically downward from directly above.
- As a result, the difference in imaging-direction angle is canceled (the imaging direction becomes equivalent to the angle C(0, 0)), and the influence dependent on the imaging-direction angle can be eliminated.
- Under the third inspection condition, with the lawn grain at the angle P(0, 0), the first imaging is performed with the irradiation-direction angle L(e, f) and the imaging-direction angle C(c, d), and the second imaging is performed with the irradiation-direction angle L(−e, f) and the imaging-direction angle C(−c, d), whereby the influence of the angle dependence can be eliminated. That is, under the third inspection condition, the turf points vertically upward, and at each of the two timings when the light irradiation directions are symmetric with respect to the vertical direction, imaging is performed from the imaging direction at the correspondingly symmetric angle with respect to the vertical direction.
- As a result, the difference in imaging-direction angle and the difference in irradiation-direction angle are canceled (the imaging direction becomes equivalent to the angle C(0, 0) and the irradiation direction to the angle L(0, 0)), and the influence dependent on the imaging-direction and irradiation-direction angles can be eliminated.
- Thereby, the difference in lawn-grain angle is canceled (the lawn grain becomes equivalent to the angle P(0, 0)), and the influence dependent on the lawn-grain angle can be eliminated.
- In this case, the sensing values of each area are used as sensing data, and the sensing values of two areas whose lawn-grain angles are opposite are added and averaged. Thus, the influence dependent on the lawn-grain angle can be reduced.
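The averaging step above can be sketched numerically. All values below are hypothetical: the sensing value of each area is modeled as an angle-free value plus a lawn-grain-dependent component that enters with opposite sign in the two areas.

```python
# Model each area's sensing value as an angle-free value plus a component
# that depends on the lawn-grain angle, entering with opposite sign in
# the two paired areas. All numbers are illustrative only.
true_value = 0.62      # hypothetical angle-free sensing value of the turf
grain_offset = 0.05    # hypothetical component dependent on P(a, b)

area_pab = true_value + grain_offset   # area with lawn grain P(a, b)
area_mab = true_value - grain_offset   # area with lawn grain P(-a, b)

averaged = (area_pab + area_mab) / 2   # the opposite offsets cancel
print(round(averaged, 6))  # -> 0.62
```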
- Under the fifth inspection condition, at a timing when the irradiation direction is the angle L(0, 0), the area whose lawn-grain angle is P(a, b) is imaged from the imaging-direction angle C(c, d), and the area whose lawn-grain angle is P(−a, b) is imaged from the imaging-direction angle C(−c, d), whereby the influence of the angle dependence can be eliminated. That is, under the fifth inspection condition, at a timing when light irradiates vertically downward, each of two areas whose lawn-grain angles are symmetric with respect to the vertically upward direction is imaged from one of two directions that are symmetric with respect to the vertically downward direction.
- As a result, the difference in lawn-grain angle and the difference in imaging-direction angle are canceled (the lawn grain becomes equivalent to the angle P(0, 0) and the imaging direction to the angle C(0, 0)), and the influence dependent on the lawn-grain and imaging-direction angles can be eliminated.
- Under the sixth inspection condition, the area whose lawn-grain angle is P(a, b) is imaged a first time from the imaging-direction angle C(c, d) at a timing when the irradiation direction is the angle L(e, f), and a second time from the imaging-direction angle C(−c, d) at a timing when the irradiation direction is the angle L(−e, f), whereby the influence of the angle dependence can be eliminated. That is, under the sixth inspection condition, each of two areas whose lawn-grain angles are symmetric with respect to the vertically upward direction is imaged from two directions that are symmetric with respect to the vertically downward direction, at two timings when the light irradiation directions are at symmetric angles with respect to the vertical direction.
- When the weather is cloudy and sunlight does not directly irradiate the field 14, it is not necessary to consider the influence dependent on the irradiation-direction angle (the irradiation direction can be regarded as the angle L(0, 0)). Under the seventh inspection condition, imaging is therefore performed when the weather is cloudy, and the influence of the angle dependence can be eliminated by performing a single imaging from the imaging-direction angle C(0, 0). That is, under the seventh inspection condition, the turf points vertically upward and imaging is performed vertically downward from directly above, so the influence of the angle dependence can be eliminated with an image obtained by a single imaging.
- Under the eighth inspection condition, imaging is performed from the two directions in which the imaging-direction angles are C(c, d) and C(−c, d), whereby the influence of the angle dependence can be eliminated. That is, under the eighth inspection condition, the turf points vertically upward, and imaging is performed from two directions at angles symmetric with respect to the vertical direction.
- As a result, the difference in imaging-direction angle is canceled (the imaging direction becomes equivalent to the angle C(0, 0)), and the influence dependent on the imaging-direction angle can be eliminated.
- Thereby, the difference in lawn-grain angle is canceled (the lawn grain becomes equivalent to the angle P(0, 0)), and the influence dependent on the lawn-grain angle can be eliminated.
- Under the tenth inspection condition, the area whose lawn-grain angle is P(a, b) is imaged from the imaging-direction angle C(c, d), and the area whose lawn-grain angle is P(−a, b) is imaged from the imaging-direction angle C(−c, d), whereby the influence of the angle dependence can be eliminated. That is, under the tenth inspection condition, each of two areas whose lawn-grain angles are symmetric with respect to the vertically upward direction is imaged from one of two directions that are symmetric with respect to the vertically downward direction.
- As a result, the difference in lawn-grain angle and the difference in imaging-direction angle are canceled (the lawn grain becomes equivalent to the angle P(0, 0) and the imaging direction to the angle C(0, 0)), and the influence dependent on the lawn-grain and imaging-direction angles can be eliminated.
- FIG. 14 is a block diagram illustrating a configuration example of the vegetation inspection apparatus 13.
- the vegetation inspection apparatus 13 includes a communication unit 21, a data server 22, a weather information acquisition unit 23, an imaging control unit 24, and an arithmetic processing unit 25.
- The imaging devices 12-1 and 12-2 are connected to the communication unit 21.
- The imaging devices 12-1 and 12-2 each include, for example, a sensor in which detection elements for detecting red, green, and blue light in the visible band and detection elements for detecting near-infrared (NIR) light outside the visible band are arranged two-dimensionally, and are used as sensing devices for sensing light from the field 14.
- That is, the imaging devices 12-1 and 12-2 each have an imaging sensor in which pixels (detection elements) that detect light in different wavelength bands are arranged two-dimensionally for each wavelength band, and the images of the respective wavelength bands captured by the imaging sensor (image data including the sensing values) are examples of sensing data obtained by sensing.
- the communication unit 21 communicates with the imaging devices 12-1 and 12-2.
- The communication unit 21 receives image data (for example, raw data composed of R, G, B, and IR pixel values) constituting the images captured by the imaging devices 12-1 and 12-2, and supplies the received data to the data server 22.
- When an imaging command instructing the imaging devices 12-1 and 12-2 to perform imaging is supplied from the imaging control unit 24, the communication unit 21 transmits the imaging command to the imaging devices 12-1 and 12-2.
- the data server 22 accumulates the image data supplied from the communication unit 21 and supplies the image data to the arithmetic processing unit 25 in response to a request from the arithmetic processing unit 25.
- The data server 22 also acquires, via the weather information acquisition unit 23, the weather information at the time the images were captured by the imaging devices 12-1 and 12-2, and stores it in association with the image data of the corresponding images.
- The weather information acquisition unit 23 supplies to the imaging control unit 24, as needed, weather information such as weather observed by an observation device (not shown) installed in the field 14 or weather information for the vicinity of the field 14 distributed via an external network such as the Internet. The weather information acquisition unit 23 also supplies the data server 22 with the weather information at the imaging times of the image data accumulated in the data server 22.
- The imaging control unit 24 transmits, via the communication unit 21, an imaging command instructing the imaging devices 12-1 and 12-2 to perform imaging, in accordance with time information (for example, data including date, hour, minute, and second) measured by a built-in timer or with the weather information supplied from the weather information acquisition unit 23. Thereby, the imaging control unit 24 controls the timing at which the imaging devices 12-1 and 12-2 image the field 14.
- For example, in accordance with the time information measured by the built-in timer, the imaging control unit 24 transmits the imaging command at the timing when the direction of the light irradiating the field 14 is the angle L(0, 0), as described above with reference to FIGS. 5, 7, and 8, and at the timings when that direction is the angle L(e, f) and the angle L(−e, f), as described above with reference to FIGS. 4, 6, and 9. Further, in accordance with the weather information supplied from the weather information acquisition unit 23, the imaging control unit 24 transmits the imaging command when the weather in the field 14 becomes cloudy, as described above with reference to FIGS. 10 to 13.
- The arithmetic processing unit 25 reads out the image data accumulated in the data server 22 and, using the set of images captured by the imaging devices 12-1 and 12-2, calculates a vegetation index from which the influence of the angle dependence in the field 14 is removed. As illustrated, the arithmetic processing unit 25 includes a vegetation index calculation unit 31, a lens distortion correction unit 32, an addition processing unit 33, a cancellation processing unit 34, and an integration processing unit 35.
- The vegetation index calculation unit 31 reads out the image data from the data server 22 and calculates the normalized vegetation index NDVI by evaluating the following equation (1):
- NDVI = (IR − R) / (IR + R) … (1)
- As shown in equation (1), the normalized vegetation index NDVI is obtained using the pixel value R representing the visible red component and the pixel value IR representing the near-infrared component.
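The per-pixel relationship of equation (1) can be sketched as a small NumPy function; this is a hypothetical helper, not part of the patent, and NumPy is assumed as the array library.

```python
import numpy as np

def ndvi(red, nir):
    """Per-pixel normalized vegetation index: (IR - R) / (IR + R)."""
    red = np.asarray(red, dtype=np.float64)
    nir = np.asarray(nir, dtype=np.float64)
    denom = nir + red
    # Guard against division by zero where both bands read 0.
    safe = np.where(denom == 0.0, 1.0, denom)
    return np.where(denom == 0.0, 0.0, (nir - red) / safe)

# Tiny example frame: R and IR pixel values from the two bands.
r = np.array([[50.0, 30.0], [60.0, 0.0]])
ir = np.array([[150.0, 90.0], [60.0, 0.0]])
print(ndvi(r, ir))
```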
- The vegetation index calculation unit 31 obtains the normalized vegetation index NDVI for each pixel constituting each image, using the pixel value R and the pixel value IR of the image data obtained by imaging the field 14 with the imaging devices 12-1 and 12-2.
- the vegetation index calculating unit 31 generates two NDVI images based on the pixel values of the two images captured by the imaging devices 12-1 and 12-2.
- the lens distortion correction unit 32 corrects the lens distortion of the two NDVI images generated by the vegetation index calculation unit 31.
- That is, in the imaging device 12, a wide-angle lens (for example, a fish-eye lens) with a wide angle of view is used to capture the entire field 14 in an image taken from an imaging direction having a predetermined elevation angle with respect to the field 14, so lens distortion occurs in the captured image. Similar lens distortion occurs in the NDVI image generated based on such an image; for example, a straight line in the field 14 appears distorted into a curve in the peripheral portion of the NDVI image. Therefore, the lens distortion correction unit 32 corrects the lens distortion of the NDVI image so that a straight line in the field 14 appears as a straight line even in the peripheral portion of the NDVI image.
- The addition processing unit 33 adds and averages the two NDVI images whose lens distortion has been corrected by the lens distortion correction unit 32, thereby generating one NDVI image in which the difference in imaging-direction angle is canceled.
- In the cancellation processing unit 34, information on the azimuth angle of the lawn grain for each area distinguished by the lawn grain of the turf grown in the field 14 (for example, information indicating which area has which lawn grain) is registered in advance. The cancellation processing unit 34 then adds the pixel values of areas of the NDVI image generated by the addition processing unit 33, i.e., of the two-dimensional sensing data, whose lawn-grain azimuth angles are opposite to each other, thereby canceling the difference in lawn-grain angle.
- For the normalized vegetation index NDVI in which the difference in lawn-grain angle has been canceled by the cancellation processing unit 34, the integration processing unit 35 averages the data of the areas added in the cancellation processing unit 34, divides the result into the areas distinguished by the lawn grain, and integrates them.
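The cancellation and integration steps above can be sketched as follows. The area labels, layout, and NDVI values are all hypothetical; the point is only that pairing areas with opposite lawn-grain azimuths and averaging removes the grain-dependent component.

```python
import numpy as np

# Hypothetical 2x2 tiling of lawn-grain areas: "+a" and "-a" mark areas
# whose lawn-grain azimuth angles are opposite to each other.
area_labels = np.array([["+a", "-a"],
                        ["+a", "-a"]])
ndvi_image = np.array([[0.70, 0.58],
                       [0.66, 0.62]])

# Cancellation step: pair each "+a" value with a "-a" value (row-wise
# here) and average, so the grain-dependent component drops out.
plus = ndvi_image[area_labels == "+a"]
minus = ndvi_image[area_labels == "-a"]
canceled = (plus + minus) / 2

# Integration step: combine the angle-free values into one index for
# the region distinguished by the lawn grain.
print(round(canceled.mean(), 6))  # -> 0.64
```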
- The vegetation inspection apparatus 13 configured as described above can obtain vegetation information (an NDVI image) from which the influences dependent on the lawn-grain angle, the imaging-direction angle, and the irradiation-direction angle have been eliminated.
- For example, the turf is grown in the field 14 so that four areas having different lawn-grain angles P(a, b) are arranged alternately. That is, an area whose lawn-grain angle is P(a1, b1), an area at the angle P(a2, b2), an area at the angle P(a3, b3), and an area at the angle P(a4, b4) are arranged in a 2 × 2 (length × width) pattern, and these four areas are tiled over the entire field 14.
- the field 14 is imaged by the imaging device 12-1 from the imaging direction of the angle C (c, d), and by the imaging device 12-2 from the imaging direction of the angle C (−c, d).
- the NDVI image P1, generated by the vegetation index calculation unit 31 from the image captured by the imaging device 12-1, contains lens distortion according to the angle C (c, d) of the imaging direction of the imaging device 12-1.
- likewise, the NDVI image P2, generated by the vegetation index calculation unit 31 from the image captured by the imaging device 12-2, contains lens distortion according to the angle C (−c, d) of the imaging direction of the imaging device 12-2.
- the lens distortion correction unit 32 generates an NDVI image P3 in which the lens distortion of the NDVI image P1 is corrected, and generates an NDVI image P4 in which the lens distortion of the NDVI image P2 is corrected.
- each rectangle illustrated in a grid pattern in the NDVI images P3 and P4 represents an area distinguished according to the lawn angle P (a, b), and the arrow drawn in each rectangle indicates that area's lawn angle P (a, b).
- the normalized vegetation index NDVI of each pixel constituting the NDVI image P3 contains a component dependent on the lawn angle P (a, b) and a component dependent on the imaging-direction angle C (c, d).
- likewise, the normalized vegetation index NDVI of each pixel constituting the NDVI image P4 contains a component dependent on the lawn angle P (a, b) and a component dependent on the imaging-direction angle C (−c, d).
- the addition processing unit 33 adds and averages the NDVI images P3 and P4, so that the component dependent on the imaging-direction angle C (c, d) and the component dependent on the imaging-direction angle C (−c, d) cancel each other, generating the NDVI image P5. That is, the NDVI image P5 is equivalent to an image captured from the imaging direction of the angle C (0, 0). Since the image is captured when the weather is cloudy, the irradiation direction is likewise equivalent to the angle L (0, 0).
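The per-pixel averaging performed by the addition processing unit 33 can be pictured as follows. This is a minimal sketch under our own assumptions (list-of-rows image representation, the function name, and the toy ±0.1 offset are ours, not the patent's):

```python
def add_average(ndvi_a, ndvi_b):
    """Per-pixel average of two NDVI images of the same size.

    If image A was captured from imaging direction C(c, d) and image B
    from C(-c, d), the components that depend on +c and -c enter with
    opposite sign and largely cancel, leaving an image equivalent to
    an image captured from C(0, 0).
    """
    return [
        [(pa + pb) / 2 for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(ndvi_a, ndvi_b)
    ]

# Toy example: a direction-dependent offset of +/-0.1 on a true value of 0.6.
p3 = [[0.7, 0.7], [0.7, 0.7]]   # viewed from C(c, d)
p4 = [[0.5, 0.5], [0.5, 0.5]]   # viewed from C(-c, d)
p5 = add_average(p3, p4)        # every pixel is back near 0.6
```

The same helper applies unchanged to the step-S16 averaging later in the flowchart.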
- the cancel processing unit 34 performs processing for canceling a component depending on the lawn angle P (a, b) included in the NDVI image P5.
- the lawn angle P (a1, b1) and the lawn angle P (a3, b3) have azimuth angles θ opposite to each other, and likewise the lawn angle P (a2, b2) and the lawn angle P (a4, b4) have opposite azimuth angles θ. Therefore, by adding the normalized vegetation index NDVI of the area where the turf has the angle P (a1, b1) to that of the area with the angle P (a3, b3), and the NDVI of the area with the angle P (a2, b2) to that of the area with the angle P (a4, b4), the cancellation processing unit 34 can cancel the component dependent on the lawn angle P (a, b).
- the cancel processing unit 34 performs a process of adding the normalized vegetation index NDVI so that two of the four areas to be added overlap.
- the integration processing unit 35 averages the data of the four added areas, divides the result into area units for each lawn angle P (a, b), and integrates them, thereby eliminating the angle dependence and outputting an NDVI image P6 composed of the normalized vegetation index NDVI of the entire field 14.
- that is, as shown on the right side of the NDVI image P6, for each area taken as the center, the integration processing unit 35 averages four values: the average of the four areas that include the center area at their lower right, the average of the four areas that include it at their lower left, the average of the four areas that include it at their upper right, and the average of the four areas that include it at their upper left. The result is the normalized vegetation index NDVI of that area, and the integration processing unit 35 performs this averaging for all areas.
- in this way, the integration processing unit 35 can output the NDVI image P6, composed of the normalized vegetation index NDVI of the entire field 14, from which the angle dependence has been eliminated.
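The combined cancellation (unit 34) and integration (unit 35) over the repeating 2 × 2 lawn-angle areas can be pictured as a sliding 2 × 2 window average. The grid representation, function name, and toy offsets below are our illustration of the idea, not the patent's implementation:

```python
def cancel_lawn_angle(ndvi, block=2):
    """Slide a block x block window over an NDVI grid and average it.

    With lawn angles P(a1,b1)..P(a4,b4) repeating in a 2 x 2 pattern,
    every 2 x 2 window contains each lawn angle exactly once, so the
    mutually opposite azimuth pairs (a1/a3 and a2/a4) cancel on average.
    """
    h, w = len(ndvi), len(ndvi[0])
    return [
        [
            sum(ndvi[y + dy][x + dx]
                for dy in range(block) for dx in range(block)) / block ** 2
            for x in range(w - block + 1)
        ]
        for y in range(h - block + 1)
    ]

# Toy field: true NDVI 0.6 everywhere, plus a lawn-angle offset of +/-0.1
# alternating between columns (standing in for opposite azimuths).
grid = [[0.6 + (0.1 if x % 2 == 0 else -0.1) for x in range(4)]
        for y in range(4)]
flat = cancel_lawn_angle(grid)  # every window averages back near 0.6
```

A real field would alternate all four angles in both directions; the sliding-window structure is the same.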
- the arithmetic processing unit 25 may also obtain a vegetation index other than the normalized vegetation index NDVI, for example RVI (Ratio Vegetation Index) or GNDVI (Green NDVI), using the pixel value R representing the visible red component, the pixel value G representing the visible green component, the pixel value B representing the visible blue component, and the pixel value IR representing the near-infrared component.
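These indices follow directly from the band pixel values; the standard formulas are NDVI = (IR − R)/(IR + R), RVI = IR/R, and GNDVI = (IR − G)/(IR + G). A minimal sketch (the function names are ours, and the patent's equation (1) is not reproduced here):

```python
def ndvi(ir: float, r: float) -> float:
    """Normalized Difference Vegetation Index: (IR - R) / (IR + R)."""
    return (ir - r) / (ir + r)

def rvi(ir: float, r: float) -> float:
    """Ratio Vegetation Index: IR / R."""
    return ir / r

def gndvi(ir: float, g: float) -> float:
    """Green NDVI: same form as NDVI, but with the green band."""
    return (ir - g) / (ir + g)

# Healthy vegetation reflects strongly in near infrared and absorbs red,
# so NDVI approaches 1; bare soil yields values near 0.
healthy = ndvi(0.8, 0.1)
```

In practice the same formulas are applied per pixel over the whole image.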
- FIG. 16 is a flowchart for explaining an example of processing for obtaining a vegetation index in the vegetation inspection system 11.
- in step S11, the weather information acquisition unit 23 acquires the weather information of the field 14 and supplies it to the imaging control unit 24.
- in step S12, the imaging control unit 24 determines whether it is time to image the field 14 with the imaging devices 12-1 and 12-2. For example, based on the weather information supplied from the weather information acquisition unit 23 in step S11, the imaging control unit 24 determines that it is time to image when the weather is cloudy, because the result is then not affected by the angle of the irradiation direction of light from the sun. As described above, the imaging control unit 24 may instead determine whether it is time to image according to time information.
- step S12 when the imaging control unit 24 determines that it is not time to capture the field 14 by the imaging devices 12-1 and 12-2, the process returns to step S11, and the same process is repeated thereafter.
- step S12 when the imaging control unit 24 determines that it is time to capture the field 14 by the imaging devices 12-1 and 12-2, the process proceeds to step S13.
- in step S13, the imaging control unit 24 transmits an imaging command instructing imaging to the imaging devices 12-1 and 12-2 via the communication unit 21, and the imaging devices 12-1 and 12-2 image the field 14 according to the imaging command.
- the imaging devices 12-1 and 12-2 respectively transmit images obtained by capturing the field 14, and the communication unit 21 acquires image data of these images and accumulates them in the data server 22.
- in step S14, the vegetation index calculation unit 31 reads out the two pieces of image data to be processed from the data server 22, evaluates the above-described equation (1), obtains the normalized vegetation index NDVI of each pixel constituting each image, and generates two NDVI images.
- in step S15, the lens distortion correction unit 32 performs lens distortion correction on each of the two NDVI images generated by the vegetation index calculation unit 31 in step S14, thereby generating two NDVI images whose lens distortion has been corrected.
- in step S16, the addition processing unit 33 adds and averages the two NDVI images whose lens distortion was corrected by the lens distortion correction unit 32 in step S15, thereby canceling the component dependent on the imaging-direction angle and generating a single NDVI image.
- in step S17, the cancellation processing unit 34 cancels the component dependent on the lawn angle by adding together, in the NDVI image generated by the addition processing unit 33 in step S16, the areas whose lawn angles are opposite to each other.
- in step S18, the integration processing unit 35 divides the NDVI image, in which the component dependent on the lawn angle was canceled by the cancellation processing unit 34 in step S17, into units of the lawn-pattern areas and integrates them. After step S18, the process returns to step S11, and the same processing is repeated thereafter.
- as described above, the vegetation inspection system 11 can obtain an NDVI image composed of the normalized vegetation index NDVI of the entire field 14, in which the component dependent on the lawn angle and the component dependent on the imaging-direction angle are canceled. Therefore, the vegetation inspection system 11 can inspect the vegetation of the turf grown in the field 14 while eliminating the influences dependent on the lawn angle, the imaging-direction angle, and the irradiation-direction angle.
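The flow of steps S11 through S18 amounts to a polling loop: wait until conditions are right, capture the image pair, then run the NDVI pipeline. Everything below (the function names, the string representation of the weather) is a hypothetical sketch of that control flow, not the patent's code:

```python
def inspection_cycle(get_weather, capture_pair, process):
    """One pass of the flowchart.

    get_weather()  -> str such as "cloudy" or "sunny"   (step S11)
    capture_pair() -> (image_1, image_2)                (step S13)
    process(a, b)  -> final NDVI result                 (steps S14-S18)
    Returns the processed result, or None if it was not yet time to image.
    """
    weather = get_weather()          # S11: acquire weather information
    if weather != "cloudy":          # S12: cloudy sky gives diffuse light,
        return None                  #      so no irradiation-direction bias
    img1, img2 = capture_pair()      # S13: image from C(c, d) and C(-c, d)
    return process(img1, img2)       # S14-S18: NDVI, correct, average, cancel

# Example wiring with stand-in callables:
result = inspection_cycle(
    get_weather=lambda: "cloudy",
    capture_pair=lambda: ("image-A", "image-B"),
    process=lambda a, b: (a, b),
)
```

In the real system the loop repeats from S11 after each pass; here a single cycle is shown for clarity.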
- in the above description, the addition averaging processing is performed using images whose angle-dependent components are exactly opposite, that is, images from the two directions at the angles C (c, d) and C (−c, d). However, the components need not be exactly opposite: as long as the deviation from exact opposition is within a predetermined reference value, such as 5% or 10%, such images can still be used to reduce the angle-dependent influence.
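This relaxed requirement, that the two directions need only be approximately opposite, could be checked with a simple tolerance test. The 5% / 10% figures come from the text; the helper itself is our sketch:

```python
def roughly_opposite(c1, c2, tolerance=0.10):
    """True if direction components c1 and c2 are opposite to within
    the given relative tolerance (e.g. 0.05 for 5%, 0.10 for 10%).

    Exactly opposite means c2 == -c1; here the residual |c1 + c2| may
    be up to tolerance * |c1|.
    """
    if c1 == 0:
        return c2 == 0
    return abs(c1 + c2) <= tolerance * abs(c1)

# A pair at +30 / -29 degrees deviates by ~3.3% and still qualifies;
# +30 / -20 deviates by ~33% and does not.
ok = roughly_opposite(30.0, -29.0, tolerance=0.05)
bad = roughly_opposite(30.0, -20.0, tolerance=0.10)
```

Image pairs passing such a check could then be fed to the same addition averaging as exactly opposed pairs.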
- the lawn angle, the imaging-direction angle, and the irradiation-direction angle are examples of angles related to the inspection of the field 14; angles other than these may also be used.
- the configuration in which two images are captured by the two imaging devices 12-1 and 12-2 has been described. Alternatively, a single imaging device 12 may be moved sequentially to capture the two images, and the normalized vegetation index NDVI may be calculated from those images.
- the imaging device 12 may be mounted on a UAV (Unmanned Aerial Vehicle) so that imaging can be performed at an arbitrary position.
- two or more imaging devices 12 may be used.
- for example, the normalized vegetation index NDVI may be calculated from four images captured from the east-west and north-south directions using four imaging devices 12.
- although the imaging device 12 and the vegetation inspection device 13 are connected via a communication network in the example described, the imaging device 12 and the vegetation inspection device 13 may instead operate offline, with images supplied via a storage medium, for example. In this case, the data server 22 need not be provided.
- the imaging device 12 and the vegetation inspection device 13 may be configured as an integrated device.
- the imaging device 12 obtains a vegetation index and transmits the vegetation index to the vegetation inspection device 13, and the vegetation inspection device 13 can perform processing for eliminating the influence of angle dependence.
- the lawn angle, the imaging-direction angle, and the irradiation-direction angle are not limited to the vertical direction, such as P (0, 0), C (0, 0), and L (0, 0); these angles need only be fixed, and may be other than vertical. That is, when comparing vegetation, it suffices to eliminate the angle-dependent influence so that the comparison is always made under the same conditions.
- in this specification, the term "system" refers to an entire apparatus composed of a plurality of apparatuses.
- the processes described with reference to the above flowcharts do not necessarily have to be performed in chronological order in the sequence described; they may also be executed in parallel or individually (for example, as parallel processing or object-based processing).
- the program may be processed by one CPU, or may be distributedly processed by a plurality of CPUs.
- the above-described series of processing can be executed by hardware or can be executed by software.
- the program constituting the software is installed from a program recording medium on which the program is recorded into a computer incorporated in dedicated hardware, or into a general-purpose personal computer capable of executing various functions when various programs are installed.
- FIG. 17 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- in the computer, a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, and a RAM (Random Access Memory) 103 are interconnected via a bus 104.
- an input / output interface 105 is further connected to the bus 104.
- connected to the input / output interface 105 are an input unit 106 including a keyboard, a mouse, and a microphone; an output unit 107 including a display and a speaker; a storage unit 108 including a hard disk and nonvolatile memory; a communication unit 109 including a network interface; and a drive 110 that records and reproduces information on a removable recording medium 111 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- in the computer configured as described above, the CPU 101 performs the above-described series of processing by, for example, loading the program stored in the storage unit 108 into the RAM 103 via the input / output interface 105 and the bus 104 and executing it.
- the program executed by the computer (CPU 101) is recorded on the removable medium 111, which is a package medium such as a magnetic disk (including a flexible disk), an optical disk (such as a CD-ROM (Compact Disc - Read Only Memory) or a DVD (Digital Versatile Disc)), a magneto-optical disk, or a semiconductor memory, or is provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the storage unit 108 via the input / output interface 105 by attaching the removable medium 111 to the drive 110. Further, the program can be received by the communication unit 109 via a wired or wireless transmission medium and installed in the storage unit 108. In addition, the program can be installed in the ROM 102 or the storage unit 108 in advance.
- the present technology may also be configured as follows.
- (1) An inspection apparatus comprising a calculation processing unit that calculates an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- (2) The inspection apparatus according to (1), wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the direction in which the plant, which is the inspection object, grows, by performing arithmetic processing using at least two areas in which the azimuth angles of the growing direction of the plant are opposite to each other.
- (3) The inspection apparatus according to (1) or (2), wherein the sensing data is image data containing sensing values of respective wavelength bands, acquired on the basis of a sensor in which detection elements that each detect light of a different wavelength band are two-dimensionally arranged.
- (4) The inspection apparatus according to (3), wherein the sensing data contains sensing values in the red, green, blue, and near-infrared wavelength bands.
- (5) The inspection apparatus according to any one of (1) to (4), wherein the inspection object is a plant and the inspection value is a normalized difference vegetation index (NDVI).
- (6) The inspection apparatus according to any one of (1) to (5), wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the sensing direction in which the inspection object is sensed, by performing arithmetic processing using two pieces of the sensing data sensed with the azimuth angles of their sensing directions opposite to each other.
- (7) The inspection apparatus according to any one of (1) to (6), wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the irradiation direction of the light irradiated onto the inspection object, by performing arithmetic processing using two pieces of the sensing data sensed with the azimuth angles of the irradiation directions of that light opposite to each other.
- (8) The inspection apparatus according to any one of (1) to (7), further comprising a sensing control unit that controls a sensing device that senses the inspection object.
- (9) The inspection apparatus according to (8), further comprising a weather information acquisition unit that acquires weather information indicating the weather in the field where the plant that is the inspection object is grown, wherein the sensing control unit causes the sensing device to sense the plant when the weather in the field is cloudy.
- (10) The inspection apparatus according to (8) or (9), wherein the sensing control unit causes the sensing device to sense the inspection object at a timing according to the sensing direction in which the inspection object is sensed and the irradiation direction of the light irradiated onto the inspection object.
- (11) The inspection apparatus according to (1), wherein the sensing data is image data containing sensing values of respective wavelength bands, acquired on the basis of a sensor in which detection elements that each detect light of a different wavelength band are two-dimensionally arranged.
- (12) The inspection apparatus according to (11), wherein the sensing data contains sensing values in the red, green, blue, and near-infrared wavelength bands.
- (13) The inspection apparatus according to (12), wherein the inspection object is a plant and the inspection value is a normalized difference vegetation index (NDVI).
- (14) An inspection method including calculating an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- (15) A program for causing a computer to function as a calculation processing unit that calculates an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- 11 vegetation inspection system, 12-1 and 12-2 imaging devices, 13 vegetation inspection device, 14 field, 21 communication unit, 22 data server, 23 weather information acquisition unit, 24 imaging control unit, 25 arithmetic processing unit, 31 vegetation index calculation unit, 32 lens distortion correction unit, 33 addition processing unit, 34 cancellation processing unit, 35 integration processing unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Multimedia (AREA)
- Biochemistry (AREA)
- Pathology (AREA)
- Immunology (AREA)
- Analytical Chemistry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Botany (AREA)
- Quality & Reliability (AREA)
- Wood Science & Technology (AREA)
- Medicinal Chemistry (AREA)
- Food Science & Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Environmental Sciences (AREA)
- Forests & Forestry (AREA)
- Medical Informatics (AREA)
- Biodiversity & Conservation Biology (AREA)
- Radiology & Medical Imaging (AREA)
- Mathematical Physics (AREA)
- Ecology (AREA)
- Artificial Intelligence (AREA)
- Signal Processing (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Image Processing (AREA)
Abstract
Description
(1)
An inspection apparatus comprising:
a calculation processing unit that calculates an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
(2)
The inspection apparatus according to (1) above, wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the direction in which the plant, which is the inspection object, grows, by performing arithmetic processing using at least two areas in which the azimuth angles of the growing direction of the plant are opposite to each other.
(3)
The inspection apparatus according to (1) or (2) above, wherein the sensing data is image data containing sensing values of respective wavelength bands, acquired on the basis of a sensor in which detection elements that each detect light of a different wavelength band are two-dimensionally arranged.
(4)
The inspection apparatus according to (3) above, wherein the sensing data contains sensing values in the red, green, blue, and near-infrared wavelength bands.
(5)
The inspection apparatus according to any one of (1) to (4) above, wherein the inspection object is a plant and the inspection value is a normalized difference vegetation index (NDVI: Normalized Difference Vegetation Index).
(6)
The inspection apparatus according to any one of (1) to (5) above, wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the sensing direction in which the inspection object is sensed, by performing arithmetic processing using two pieces of the sensing data sensed with the azimuth angles of their sensing directions opposite to each other.
(7)
The inspection apparatus according to any one of (1) to (6) above, wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the irradiation direction of the light irradiated onto the inspection object, by performing arithmetic processing using two pieces of the sensing data sensed with the azimuth angles of the irradiation directions of that light opposite to each other.
(8)
The inspection apparatus according to any one of (1) to (7) above, further comprising a sensing control unit that controls a sensing device that senses the inspection object.
(9)
The inspection apparatus according to (8) above, further comprising a weather information acquisition unit that acquires weather information indicating the weather in a field where the plant that is the inspection object is grown,
wherein the sensing control unit causes the sensing device to sense the plant when the weather in the field is cloudy.
(10)
The inspection apparatus according to (8) or (9) above, wherein the sensing control unit causes the sensing device to sense the inspection object at a timing according to the sensing direction in which the inspection object is sensed and the irradiation direction of the light irradiated onto the inspection object.
(11)
The inspection apparatus according to (1) above, wherein the sensing data is image data containing sensing values of respective wavelength bands, acquired on the basis of a sensor in which detection elements that each detect light of a different wavelength band are two-dimensionally arranged.
(12)
The inspection apparatus according to (11) above, wherein the sensing data contains sensing values in the red, green, blue, and near-infrared wavelength bands.
(13)
The inspection apparatus according to (12) above, wherein the inspection object is a plant and the inspection value is a normalized difference vegetation index (NDVI: Normalized Difference Vegetation Index).
(14)
An inspection method including calculating an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
(15)
A program for causing a computer to function as a calculation processing unit that calculates an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
Claims (15)
- An inspection apparatus comprising: a calculation processing unit that calculates an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- The inspection apparatus according to claim 1, wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the direction in which the plant, which is the inspection object, grows, by performing arithmetic processing using at least two areas in which the azimuth angles of the growing direction of the plant are opposite to each other.
- The inspection apparatus according to claim 2, wherein the sensing data is image data containing sensing values of respective wavelength bands, acquired on the basis of a sensor in which detection elements that each detect light of a different wavelength band are two-dimensionally arranged.
- The inspection apparatus according to claim 3, wherein the sensing data contains sensing values in the red, green, blue, and near-infrared wavelength bands.
- The inspection apparatus according to claim 4, wherein the inspection object is a plant and the inspection value is a normalized difference vegetation index (NDVI: Normalized Difference Vegetation Index).
- The inspection apparatus according to claim 1, wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the sensing direction in which the inspection object is sensed, by performing arithmetic processing using two pieces of the sensing data sensed with the azimuth angles of their sensing directions opposite to each other.
- The inspection apparatus according to claim 1, wherein the calculation processing unit reduces the angle-dependent component contained in the sensing data according to the irradiation direction of the light irradiated onto the inspection object, by performing arithmetic processing using two pieces of the sensing data sensed with the azimuth angles of the irradiation directions of that light opposite to each other.
- The inspection apparatus according to claim 1, further comprising a sensing control unit that controls a sensing device that senses the inspection object.
- The inspection apparatus according to claim 8, further comprising a weather information acquisition unit that acquires weather information indicating the weather in a field where the plant that is the inspection object is grown, wherein the sensing control unit causes the sensing device to sense the plant when the weather in the field is cloudy.
- The inspection apparatus according to claim 8, wherein the sensing control unit causes the sensing device to sense the inspection object at a timing according to the sensing direction in which the inspection object is sensed and the irradiation direction of the light irradiated onto the inspection object.
- The inspection apparatus according to claim 1, wherein the sensing data is image data containing sensing values of respective wavelength bands, acquired on the basis of a sensor in which detection elements that each detect light of a different wavelength band are two-dimensionally arranged.
- The inspection apparatus according to claim 11, wherein the sensing data contains sensing values in the red, green, blue, and near-infrared wavelength bands.
- The inspection apparatus according to claim 12, wherein the inspection object is a plant and the inspection value is a normalized difference vegetation index (NDVI: Normalized Difference Vegetation Index).
- An inspection method including calculating an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
- A program for causing a computer to function as a calculation processing unit that calculates an inspection value for inspecting an inspection object by using sensing data in which the components dependent on an angle related to the inspection are mutually opposite, so that the component dependent on the angle related to the inspection contained in the sensing data obtained by sensing the inspection object is reduced.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/560,726 US20180052114A1 (en) | 2015-04-24 | 2016-04-08 | Inspection apparatus, inspection method, and program |
EP16783028.0A EP3287003B1 (en) | 2015-04-24 | 2016-04-08 | Inspection device, inspection method, and program |
CN201680022410.3A CN107529726B (zh) | 2015-04-24 | 2016-04-08 | 检查装置、检查方法和记录介质 |
JP2017514067A JP6768203B2 (ja) | 2015-04-24 | 2016-04-08 | 検査装置および検査方法、並びにプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015089013 | 2015-04-24 | ||
JP2015-089013 | 2015-04-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016171007A1 true WO2016171007A1 (ja) | 2016-10-27 |
Family
ID=57142996
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/061521 WO2016171007A1 (ja) | 2015-04-24 | 2016-04-08 | 検査装置および検査方法、並びにプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20180052114A1 (ja) |
EP (1) | EP3287003B1 (ja) |
JP (1) | JP6768203B2 (ja) |
CN (1) | CN107529726B (ja) |
WO (1) | WO2016171007A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110869744A (zh) * | 2017-07-18 | 2020-03-06 | 索尼公司 | 信息处理设备、信息处理方法、程序和信息处理系统 |
US20220366668A1 (en) * | 2019-10-30 | 2022-11-17 | Sony Group Corporation | Image processing apparatus, image processing method, and image processing program |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109190497A (zh) * | 2018-08-09 | 2019-01-11 | 成都天地量子科技有限公司 | 一种基于时序多光谱卫星影像的耕地识别方法 |
DE102021210960A1 (de) * | 2021-09-30 | 2023-03-30 | Robert Bosch Gesellschaft mit beschränkter Haftung | Vegetationsüberwachungsvorrichtung, Vegetationsüberwachungssystem mit der Vegetationsüberwachungsvorrichtung und Verfahren zur Überwachung einer Vegetationsgesundheit in einem Garten |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003339238A (ja) * | 2002-05-28 | 2003-12-02 | Satake Corp | 作物の生育診断方法及びその装置 |
WO2007000999A1 (ja) * | 2005-06-27 | 2007-01-04 | Pioneer Corporation | 画像分析装置および画像分析方法 |
JP2007293558A (ja) * | 2006-04-25 | 2007-11-08 | Hitachi Ltd | 目標物認識プログラム及び目標物認識装置 |
JP2013101428A (ja) * | 2011-11-07 | 2013-05-23 | Pasuko:Kk | 建物輪郭抽出装置、建物輪郭抽出方法及び建物輪郭抽出プログラム |
JP2013242624A (ja) * | 2012-05-17 | 2013-12-05 | Reiji Oshima | 画像処理方法 |
JP2014183788A (ja) * | 2013-03-25 | 2014-10-02 | Sony Corp | 情報処理システム、および情報処理システムの情報処理方法、撮像装置および撮像方法、並びにプログラム |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6160902A (en) * | 1997-10-10 | 2000-12-12 | Case Corporation | Method for monitoring nitrogen status using a multi-spectral imaging system |
US7058197B1 (en) * | 1999-11-04 | 2006-06-06 | Board Of Trustees Of The University Of Illinois | Multi-variable model for identifying crop response zones in a field |
WO2007035427A1 (en) * | 2005-09-16 | 2007-03-29 | U.S. Environmental Protection Agency | Optical system for plant characterization |
US8492721B2 (en) * | 2009-10-15 | 2013-07-23 | Camtek Ltd. | Systems and methods for near infra-red optical inspection |
LT5858B (lt) * | 2010-10-20 | 2012-08-27 | Uab "Žemdirbių Konsultacijos" | Augalo augimo sąlygų diagnostikos būdas ir įrenginys |
CN102999918B (zh) * | 2012-04-19 | 2015-04-22 | 浙江工业大学 | 全景视频序列图像的多目标对象跟踪系统 |
CN104243967B (zh) * | 2013-06-07 | 2017-02-01 | 浙江大华技术股份有限公司 | 一种图像检测方法及装置 |
-
2016
- 2016-04-08 US US15/560,726 patent/US20180052114A1/en active Pending
- 2016-04-08 CN CN201680022410.3A patent/CN107529726B/zh active Active
- 2016-04-08 JP JP2017514067A patent/JP6768203B2/ja active Active
- 2016-04-08 WO PCT/JP2016/061521 patent/WO2016171007A1/ja active Application Filing
- 2016-04-08 EP EP16783028.0A patent/EP3287003B1/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003339238A (ja) * | 2002-05-28 | 2003-12-02 | Satake Corp | 作物の生育診断方法及びその装置 |
WO2007000999A1 (ja) * | 2005-06-27 | 2007-01-04 | Pioneer Corporation | 画像分析装置および画像分析方法 |
JP2007293558A (ja) * | 2006-04-25 | 2007-11-08 | Hitachi Ltd | 目標物認識プログラム及び目標物認識装置 |
JP2013101428A (ja) * | 2011-11-07 | 2013-05-23 | Pasuko:Kk | 建物輪郭抽出装置、建物輪郭抽出方法及び建物輪郭抽出プログラム |
JP2013242624A (ja) * | 2012-05-17 | 2013-12-05 | Reiji Oshima | 画像処理方法 |
JP2014183788A (ja) * | 2013-03-25 | 2014-10-02 | Sony Corp | 情報処理システム、および情報処理システムの情報処理方法、撮像装置および撮像方法、並びにプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3287003A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110869744A (zh) * | 2017-07-18 | 2020-03-06 | 索尼公司 | 信息处理设备、信息处理方法、程序和信息处理系统 |
US20220366668A1 (en) * | 2019-10-30 | 2022-11-17 | Sony Group Corporation | Image processing apparatus, image processing method, and image processing program |
Also Published As
Publication number | Publication date |
---|---|
EP3287003A4 (en) | 2019-01-16 |
CN107529726A (zh) | 2018-01-02 |
CN107529726B (zh) | 2020-08-04 |
EP3287003A1 (en) | 2018-02-28 |
JPWO2016171007A1 (ja) | 2018-02-15 |
US20180052114A1 (en) | 2018-02-22 |
EP3287003B1 (en) | 2020-06-24 |
JP6768203B2 (ja) | 2020-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Vega et al. | Multi-temporal imaging using an unmanned aerial vehicle for monitoring a sunflower crop | |
WO2016171007A1 (ja) | 検査装置および検査方法、並びにプログラム | |
JP6872137B2 (ja) | 信号処理装置および信号処理方法、並びにプログラム | |
US10600162B2 (en) | Method and system to compensate for bidirectional reflectance distribution function (BRDF) | |
US20150359163A1 (en) | Row guidance parameterization with hough transform | |
US10776901B2 (en) | Processing device, processing method, and non-transitory computer-readable medium program with output image based on plurality of predetermined polarization directions and plurality of predetermined wavelength bands | |
Diago et al. | On‐the‐go assessment of vineyard canopy porosity, bunch and leaf exposure by image analysis | |
JP6933211B2 (ja) | センシングシステム、センシング方法、及び、センシング装置 | |
Xiang et al. | An automated stand-alone in-field remote sensing system (SIRSS) for in-season crop monitoring | |
CN108154479A (zh) | 一种对遥感图像进行图像校正的方法 | |
WO2019017095A1 (ja) | 情報処理装置、情報処理方法、プログラム、情報処理システム | |
Yun et al. | Use of unmanned aerial vehicle for multi-temporal monitoring of soybean vegetation fraction | |
US11570371B2 (en) | Imaging apparatus, imaging method, and program | |
CN103870847A (zh) | 一种低照度环境下对地监控的运动目标检测方法 | |
US10204405B2 (en) | Apparatus and method for parameterizing a plant | |
WO2020179276A1 (ja) | 画像処理装置、および画像処理方法、並びにプログラム | |
JP5648546B2 (ja) | 通路検出装置、方法、及びプログラム | |
CN104111080A (zh) | 一种凝视卫星面阵ccd相机的mtf在轨检测方法 | |
US10284819B2 (en) | Long-range image reconstruction using measured atmospheric characterization | |
JPWO2020217283A5 (ja) | 物体検出装置、物体検出システム、物体検出方法及びプログラム | |
CN105809632B (zh) | 从预定农作物的雷达影像去除噪声的方法 | |
Kumar et al. | Detection and counting of tassels for maize crop monitoring using multispectral images | |
Kataev et al. | Farm fields UAV images clusterization | |
Koenig et al. | Radiometric correction of terrestrial LiDAR data for mapping of harvest residues density | |
JP2008258679A (ja) | ラインセンサ観測画像の色補正方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16783028 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2016783028 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017514067 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15560726 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |