WO2017188956A1 - Image-capture devices having a diffractive grating array

Image-capture devices having a diffractive grating array

Info

Publication number: WO2017188956A1
Application number: PCT/US2016/029762
Authority: WO
Grant status: Application
Other languages: French (fr)
Priority date: 2016-04-28
Filing date: 2016-04-28
Prior art keywords: image sensor, image, array, capture device, diffractive grating
Inventors: Peter Morovic, Jan Morovic, Charles Santori, Francesco Aieta, Marco Fiorentino
Original Assignee: Hewlett-Packard Development Company, L.P.

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 - Other optical systems; Other optical apparatus
    • G02B27/42 - Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B27/4205 - Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect, having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 - Other optical systems; Other optical apparatus
    • G02B27/10 - Beam splitting or combining systems
    • G02B27/1006 - Beam splitting or combining systems for splitting or combining different wavelengths
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 - Other optical systems; Other optical apparatus
    • G02B27/10 - Beam splitting or combining systems
    • G02B27/1086 - Beam splitting or combining systems operating by diffraction only
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225 - Television cameras; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/2251 - Constructional details
    • H04N5/2254 - Mounting of optical parts, e.g. lenses, shutters, filters; optical parts peculiar to the presence or use of an electronic image sensor
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/30 - Transforming light or analogous information into electric information
    • H04N5/33 - Transforming infra-red radiation
    • H04N5/332 - Multispectral imaging comprising at least a part of the infrared region
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 - Details of colour television systems
    • H04N9/04 - Picture signal generators
    • H04N9/045 - Picture signal generators using solid-state devices

Abstract

An image-capture device includes an image sensor. The image sensor has an array of image sensor elements including at least first and second image sensor elements. The image-capture device includes a diffractive grating array to diffract electromagnetic radiation in a first wavelength range onto the first image sensor element and to diffract electromagnetic radiation in a second, different wavelength range onto the second image sensor element.

Description

IMAGE-CAPTURE DEVICES HAVING A DIFFRACTIVE GRATING ARRAY

BACKGROUND

[0001] Image-capture devices have been proposed that separate out and capture light from different parts of the spectrum. Some such devices use a colour filter array (CFA) to separate out light into different wavelength ranges. For example, a Bayer filter may be used to separate light into red, green and blue (RGB) wavelength ranges so that colour information relating to a scene can be captured by the image-capture device.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] Various features of the present disclosure will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate certain example features, and wherein:

[0003] Figure 1 is a schematic diagram showing an image-capture device according to an example;

[0004] Figure 2 is a schematic diagram showing an image-capture device according to an example;

[0005] Figure 3 is a schematic diagram showing part of an image-capture device according to an example;

[0006] Figure 4 is a schematic diagram showing a diffractive grating array according to an example;

[0007] Figure 5 is a schematic diagram showing an image-capture device according to an example;

[0008] Figure 6 is a schematic diagram showing spectral data according to an example;

[0009] Figure 7 is a graph showing spectral sensitivity data relative to wavelength according to an example;

[0010] Figure 8 is a schematic diagram showing an additive manufacturing system according to an example;

[0011] Figure 9 is a schematic diagram showing a filter pattern according to an example;

[0012] Figure 10 is a graph showing sensitivity data relative to wavelength according to an example; and

[0013] Figure 11 is a schematic diagram showing an additive manufacturing process according to an example.

DETAILED DESCRIPTION

[0014] In the following description, for purposes of explanation, numerous specific details of certain examples are set forth. Reference in the specification to "an example" or similar language means that a particular feature, structure, or characteristic described in connection with the example is included in at least that one example, but not necessarily in other examples.

[0015] Referring to Figure 1, there is shown schematically an example of an image-capture device 100. The image-capture device 100 may capture image data relating to a scene.

[0016] In this example, the image-capture device 100 includes an image sensor 110. Image sensors may also be known as imagers. The image sensor 110 may detect and convey information that constitutes an image. Example types of image sensor 110 include, but are not limited to, complementary metal-oxide-semiconductor (CMOS) and charge-coupled device (CCD) image sensors. The image sensor 110 may comprise an array 120 of image sensor elements. The array 120 of image sensor elements is an ordered arrangement of image sensor elements. The image sensor elements may be arranged in a grid. In this example, image sensor 110 includes first and second image sensor elements 130, 140. The image sensor 110 may include at least one additional image sensor element. In some examples, the image sensor 110 includes a much larger total number of image sensor elements than two. For example, the image sensor 110 may include over a million image sensor elements.

[0017] In this example, the image-capture device 100 includes only a single image sensor, namely image sensor 110. In other examples, the image-capture device 100 includes multiple image sensors 110.

[0018] In this example, the image-capture device 100 includes an optical component 150 in the form of a diffractive grating array 150. The diffractive grating array 150 may include an array of diffractive gratings. The diffractive grating array 150 may be a diffractive grating lens array, the lenses in the array having respective diffractive gratings. The diffractive gratings may split and diffract incident electromagnetic radiation into multiple different beams travelling in different directions. The directions of the respective beams may depend on the wavelength of the electromagnetic radiation making up the diffracted beam. For example, a first beam associated with a first wavelength range may be diffracted by the diffractive grating array 150 at a first angle or angle range relative to the diffractive grating array 150 and a second beam associated with a second wavelength range may be diffracted by the diffractive grating array 150 at a second, different angle or angle range relative to the diffractive grating array 150. The image-capture device 100 may include one or more additional optical components in the form of diffractive grating arrays.
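
As a rough illustration of the wavelength-dependent diffraction directions described above, the sketch below applies the standard grating equation d * sin(theta) = m * lambda. The 2 micrometre grating period is an assumed value chosen purely for illustration and is not taken from this disclosure.

```python
import numpy as np

# Illustrative sketch only: first-order diffraction angles from the grating
# equation d * sin(theta) = m * lambda. The grating period is an assumption.
d = 2.0e-6                                   # assumed grating period (m)
m = 1                                        # first diffraction order
wavelengths = np.array([420e-9, 795e-9])     # example visible and infrared peaks (m)

angles_deg = np.degrees(np.arcsin(m * wavelengths / d))
for wavelength, angle in zip(wavelengths, angles_deg):
    print(f"{wavelength * 1e9:.0f} nm is diffracted at about {angle:.1f} degrees")
```

With these assumed numbers the visible and infrared beams leave the grating roughly 11 degrees apart, which is the kind of angular separation used to steer different wavelength ranges onto different image sensor elements.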

[0019] The diffractive grating array 150 may diffract electromagnetic radiation in a first wavelength range onto the first image sensor element 130 and diffract electromagnetic radiation in a second, different wavelength range onto the second image sensor element 140. In some examples, the first wavelength range is in the visible part of the electromagnetic spectrum and the second, different wavelength range is in the infrared part of the electromagnetic spectrum.

[0020] The diffractive grating array 150 may focus visible light on the first image sensor element 130. Visible light may be considered to be electromagnetic radiation having a wavelength in the visible part of the electromagnetic spectrum. The visible part of the electromagnetic spectrum may be considered to relate to wavelengths from around 390 nanometres (nm) to around 700 nm.

[0021] The diffractive grating array 150 may focus infrared on the second image sensor element 140. Infrared may be considered to be electromagnetic radiation having a wavelength in the infrared part of the electromagnetic spectrum. The infrared part of the electromagnetic spectrum may be considered to relate to wavelengths from around 700 nm to around 1 millimetre (mm).

[0022] In this example, the image-capture device 100 is a joint visible and infrared spectral imager. The image-capture device 100 may facilitate accurate capture of colorimetric and/or spectral data in the visible wavelength region and quantifying of infrared radiation. In this example, a single image sensor 110 senses both visible light and infrared. This is in contrast to an arrangement in which a first image sensor senses visible light and a second, different image sensor senses infrared.

[0023] As will be described in more detail below, the image-capture device 100 may be used in an additive manufacturing inspection process. This may allow both visible spectra and infrared spectra information relating to the additive manufacturing process to be captured and inspected. Captured visible spectra information may be used for color calibration, profiling, in-line geometric adjustments etc. Captured infrared spectra information may be used for heat-uniformity analysis, fusing analysis, etc.

[0024] Referring to Figure 2, there is shown schematically an example of an image-capture device 200.

[0025] In this example, the image-capture device 200 includes an image sensor 210. The image sensor 210 includes a plurality of image sensor elements. The image sensor 210 may be a photosensitive array of photosensitive elements. An example type of image sensor element is a pixel. In this specific example, the image sensor 210 is a 36 megapixel image sensor. In this specific example, the individual image sensor elements making up the image sensor 210 have a size of 4.88 micrometres (μm) x 4.88 μm.

[0026] In this example, the image-capture device 200 includes an optical component in the form of a diffractive grating array 220. In this example, the diffractive grating array 220 is a nano-grating filter array. In this example, the nano-grating filter array has an array of individual nano-grating filters. In this specific example, the individual nano-grating filters making up the nano-grating filter array have a size of 4.88 μm x 4.88 μm. In this example, the individual filters of the nano-grating filter array are aligned with respective individual image sensor elements.

[0027] In this example, the image-capture device 200 includes an optical element 230. In this example, the optical element 230 performs one or more functions in relation to incident electromagnetic radiation arriving at the optical element 230. Examples of such functions include, but are not limited to, collimating, focusing and isolating incident electromagnetic radiation. Collimating refers to reducing the angular spread of incident electromagnetic radiation. The optical element 230 may, for example, be a long focal-ratio objective lens to image the incoming electromagnetic radiation onto the diffractive grating array 220. The optical element 230 may, for example, comprise at least one collimating lens to provide collimated electromagnetic radiation to the diffractive grating array 220.

[0028] The image-capture device 200 may be designed and manufactured so that electromagnetic radiation transmitted from the diffractive grating array 220 to the image sensor 210 has one or more predetermined spectral properties. Examples of such predetermined spectral properties include, but are not limited to, predetermined peak wavelengths and predetermined widths. Width may be expressed in terms of full-width at half-maximum (FWHM).

[0029] Designing and manufacturing the image-capture device 200 may involve determining the size of individual image sensor elements of the image sensor 210. The size of individual image sensor elements may be determined in various different ways, for example based on knowing both the size of the image sensor 210 and the sensor element density within the image sensor 210. An example of a commercially available full-frame photography (e.g. 36 mm x 24 mm) image sensor has 36 megapixels, with pixels of size 4.88 μm x 4.88 μm. Another example of a commercially available full-frame image sensor has 16 megapixels, with pixels of size 7.3 μm x 7.3 μm.
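
The pixel pitch quoted above can be sanity-checked from the sensor dimensions and pixel count alone. The short sketch below is a minimal illustration of that calculation; it is not part of the disclosed device.

```python
import math

# Sketch: estimate pixel pitch from sensor dimensions and pixel count.
sensor_width_mm, sensor_height_mm = 36.0, 24.0   # full-frame sensor dimensions
pixel_count = 36e6                               # 36 megapixels

pixels_per_mm2 = pixel_count / (sensor_width_mm * sensor_height_mm)
pitch_um = 1000.0 / math.sqrt(pixels_per_mm2)
print(f"Approximate pixel pitch: {pitch_um:.2f} um")   # about 4.90 um
```

The result, about 4.90 μm, is close to the 4.88 μm pitch quoted for the commercially available 36 megapixel sensor; the small difference reflects the exact pixel counts of real sensors.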

[0030] Designing and manufacturing the image-capture device 200 may involve determining a configuration of the diffractive grating array 220. Determining the configuration of the diffractive grating array 220 may involve determining one or more patterns of spatially arranged transmission filters that modulate incoming electromagnetic radiation. The transmission filters may be narrowband transmission filters. A narrowband transmission filter may for example have a bandpass of around 2 nm to 10 nm or larger.

[0031] Designing and manufacturing the image-capture device 200 may involve superimposing the transmission filters making up the diffractive grating array 220 onto the image-capture device 200 so that individual filters are aligned with respective image sensor elements. In some examples, a transmission filter is aligned with a single image sensor element.

[0032] Individual image sensor elements may capture a signal proportional to the amount of electromagnetic radiation that is filtered through their respective transmission filters. This may allow a reading of intensity over the set of individual narrow spectral bands to be captured by the image sensor 210.

[0033] The image-capture device 200 may comprise an interpolator 240 to interpolate data obtained from the image sensor 210 to generate first interpolated image data associated with visible light and second interpolated image data associated with infrared. The interpolated data may, for example, comprise spectral reflectance and heat data, or colorimetry and heat data.

[0034] Interpolation may be performed for some or all of the individual image sensor elements. Interpolation may be used to compute one or more values for at least some other spectral bands corresponding to some or all of the other transmission filters. Interpolation may be performed for a given image sensor element to compute values for all other spectral bands corresponding to all of the transmission filters other than the transmission filter associated with the given image sensor element.

[0035] The result of the interpolation may be one or more full-resolution images, with the one or more images corresponding to one or more frequency bands or ranges. The number of frequency bands or ranges may be the same as the number of different frequency bands that are filtered using the diffractive grating array 220 over the whole area of the image sensor 210. For example, the diffractive grating array 220 may filter around nine different wavelengths and nine individual full-resolution images may be obtained, one associated with each of the nine different wavelengths.
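
A minimal sketch of this per-band interpolation is given below. It assumes the nine-band repeating mosaic discussed later (see Figure 9) and uses simple linear interpolation via SciPy; the function name and the interpolation method are illustrative assumptions rather than the actual implementation of the interpolator 240.

```python
import numpy as np
from scipy.interpolate import griddata

def interpolate_band(raw, band_mask):
    """Fill in a full-resolution image for one spectral band.

    raw: 2-D array of sensor element readings (the mosaiced capture).
    band_mask: boolean array, True where a sensor element belongs to this band.
    """
    ys, xs = np.nonzero(band_mask)                 # positions sampled by this band
    values = raw[band_mask]                        # readings at those positions
    grid_y, grid_x = np.mgrid[0:raw.shape[0], 0:raw.shape[1]]
    # Linear interpolation over the missing positions; edges fall back to 0.
    return griddata((ys, xs), values, (grid_y, grid_x),
                    method="linear", fill_value=0.0)
```

Repeating this for each of the nine bands yields nine full-resolution images, one per band, in line with paragraph [0035].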

[0036] Referring to Figure 3, there is shown schematically an example of part of an image-capture device 300. In this example, the depicted part of the image-capture device 300 is a single element (or 'unit') of the image-capture device 300. In this example, the single element of the image-capture device 300 corresponds to a single image sensor element 310 of an image sensor, a single filter 320 of a diffractive filter arrangement and a portion 330 of an optical element. The optical element may, for example, be a long focal-ratio objective lens as described above.

[0037] In this example, incident electromagnetic radiation 340 arrives at a first face of the portion 330 of the optical element. The incident electromagnetic radiation is collimated by the portion 330 of the optical element and is transmitted to the filter 320. The collimated electromagnetic radiation is filtered and diffracted by the filter 320 onto the single image sensor element 310. The filter 320 may be a narrowband filter to allow a narrow band of electromagnetic radiation to pass through to the image sensor element 310. Output data may be obtained from the image sensor element 310. The output data may be a scalar signal. The output data may be indicative of an amount of electromagnetic radiation filtered spectrally through the filter 320.
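
The scalar output of a single element can be modelled as the spectral overlap between the incoming radiation, the narrowband filter and the sensor response. The sketch below is only a toy model with made-up spectra; none of the values are taken from this disclosure.

```python
import numpy as np

# Toy model of one image sensor element reading: a sum over wavelength of
# scene spectrum x filter transmission x sensor sensitivity (1 nm spacing).
# All spectra here are illustrative placeholders.
wavelengths_nm = np.arange(400, 1201, 1.0)                           # 1 nm grid
scene = np.ones_like(wavelengths_nm)                                 # assumed flat scene spectrum
sensor_response = np.exp(-((wavelengths_nm - 650.0) / 400.0) ** 2)   # assumed broad sensor response
filter_420 = np.exp(-0.5 * ((wavelengths_nm - 420.0) / 8.5) ** 2)    # ~20 nm FWHM filter at 420 nm

reading = np.sum(scene * filter_420 * sensor_response)               # scalar element output
print(f"Simulated element reading: {reading:.2f}")
```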

[0038] Referring to Figure 4, there is shown schematically an example of a diffractive grating array 400.

[0039] The diffractive grating array 400 includes a number of diffractive grating array elements. Two diffractive grating array elements are indicated by reference signs 410, 420 in Figure 4. The diffractive grating array elements include respective diffractive gratings. Two diffractive gratings are indicated by reference signs 430, 440 in Figure 4. Diffractive grating 430 is associated with diffractive grating array element 410 and diffractive grating 440 is associated with diffractive grating array element 420. In this specific example, the diffractive grating array 400 includes sixteen diffractive grating array elements.

[0040] The diffractive grating array 400 may be implemented in different ways.

[0041] The diffractive grating array 400 may be implemented as a binary mask. The diffractive grating array 400 may include grating of a binary transmission modulation type. Binary transmission modulation is based on the same principle as a zone plate, namely using a set of opaque and transparent paths to diffract electromagnetic radiation. The diffractive grating array 400 may include grating of a binary thickness modulation type. Binary thickness modulation is based on the diffractive grating array 400 having two different thickness levels, causing a desired diffraction pattern. A diffractive grating array implemented as a binary mask may facilitate implementation as it may not need high resolution manufacturing and may be made with a single step of photolithography.
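
Below is a minimal sketch of a binary transmission (zone-plate style) mask of the kind described in the preceding paragraph. The design wavelength, focal length and mask sampling resolution are assumptions chosen for illustration, not parameters from this disclosure.

```python
import numpy as np

# Sketch of a binary transmission mask: alternating opaque and transparent zones
# whose boundaries follow the zone-plate rule r_n = sqrt(n * wavelength * focal_length).
wavelength = 550e-9        # assumed design wavelength (m)
focal_length = 50e-6       # assumed focal length of one array element (m)
element_size = 4.88e-6     # element size matching the 4.88 um pitch quoted earlier (m)
n_px = 256                 # assumed mask sampling resolution

coords = np.linspace(-element_size / 2, element_size / 2, n_px)
xx, yy = np.meshgrid(coords, coords)
zone_index = np.floor((xx**2 + yy**2) / (wavelength * focal_length)).astype(int)
mask = (zone_index % 2 == 0).astype(np.uint8)   # 1 = transparent zone, 0 = opaque zone
```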

[0042] The diffractive grating array 400 may be implemented as a multilevel mask. The diffractive grating array 400 may include grating of a multilevel thickness modulation type. Multilevel thickness modulation is based on the diffractive grating array 400 having more than two different thickness levels, causing a desired diffraction pattern. The multilevel mask may use multilevel thickness modulation for continuous phase variation. This type of diffractive grating array 400 may use a relatively large number of fabrication steps, as multiple lithographic mask alignments may be needed.

[0043] The diffractive grating array 400 may be implemented as a subwavelength resonator. The diffractive grating array 400 may be a grating of a subwavelength resonator type. The subwavelength resonator may use subwavelength resonators to provide continuous phase variation. In a subwavelength resonator, light is trapped and then released with a specific phase delay that can be tailored based on the geometry of the resonator. This type of diffractive grating array 400 may be relatively efficient to manufacture and may be made in a single step of lithography. Although the resolution of patterning used for this type of diffractive grating array 400 may be higher than that for other types, for example below 100 nm, some currently available nanofabrication techniques may achieve these resolutions at a high yield. Examples of such nanofabrication techniques include, but are not limited to, nanoimprint lithography and extreme ultraviolet (UV) step photolithography.
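
As a companion sketch, the snippet below quantises the continuous phase profile of a diffractive lens to a small number of thickness levels, which is the idea behind the multilevel thickness modulation of paragraph [0042]; with many levels it approaches the continuous phase variation targeted by the subwavelength resonator approach. All parameter values, including the refractive index, are illustrative assumptions.

```python
import numpy as np

# Sketch: quantise the ideal lens phase profile to a multilevel thickness mask.
wavelength = 550e-9        # assumed design wavelength (m)
focal_length = 50e-6       # assumed focal length of one array element (m)
n_levels = 4               # number of discrete thickness levels in the mask
n_material = 1.5           # assumed refractive index of the mask material
element_size, n_px = 4.88e-6, 256

coords = np.linspace(-element_size / 2, element_size / 2, n_px)
xx, yy = np.meshgrid(coords, coords)
# Ideal hyperbolic lens phase, wrapped into one 2*pi period.
phase = np.mod((2 * np.pi / wavelength)
               * (np.sqrt(xx**2 + yy**2 + focal_length**2) - focal_length), 2 * np.pi)
level = np.floor(phase / (2 * np.pi) * n_levels)                # 0 .. n_levels - 1
thickness = level * wavelength / ((n_material - 1) * n_levels)  # thickness per level (m)
```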

[0044] Referring to Figure 5, there is shown schematically an example of an image-capture device 500.

[0045] In this example, the image-capture device 500 includes an image sensor 510. The image sensor 510 includes an array of image sensor elements. Two image sensor elements are indicated by reference signs 520, 530 in Figure 5.

[0046] In this example, the image-capture device 500 also includes a diffractive grating lens array 540. In this example, different beams of electromagnetic radiation associated with different wavelengths focus at different points of the image sensor 510. The diffractive grating lens array 540 may diffract and focus visible light onto the first image sensor element 520 and may diffract and focus infrared onto the second image sensor element 530. In this example, the diffractive grating lens array 540 includes a number of diffractive grating lens array elements 550, 560, 570, 580.

[0047] In this example, the diffractive grating lens array 540 is aligned with the image sensor 510 by positioning the diffractive grating lens array 540 off-axis with respect to the image sensor 510. The off-axis placement of the diffractive grating lens array 540 with respect to the image sensor 510 accommodates the diffraction caused by the diffractive grating lens array 540.

[0048] In this example, the array of diffractive grating lenses 540 is placed in front of the image sensor 510. Individual lenses 550, 560, 570, 580 of the diffractive grating lens array 540 focus, and deflect at an angle, the electromagnetic radiation incident on that portion of the diffractive grating lens array 540. As a consequence of colour dispersion, different wavelengths of electromagnetic radiation focus at different points on the image sensor 510. This facilitates detection of spectral information with spatial resolution of a scene in a single shot. The focal plane may change with wavelength.
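
A rough way to see why the focal plane changes with wavelength: for a diffractive lens the focal length scales approximately as f(lam) = f0 * lam0 / lam, where f0 is the focal length at a design wavelength lam0. The sketch below uses assumed values of f0 and lam0 purely for illustration.

```python
# Sketch: chromatic focal shift of a diffractive lens, f(lam) ~ f0 * lam0 / lam.
# The reference focal length f0 and design wavelength lam0 are assumptions.
f0 = 50e-6        # assumed focal length at the design wavelength (m)
lam0 = 550e-9     # assumed design wavelength (m)

for lam in (420e-9, 645e-9, 795e-9, 1020e-9):
    focal = f0 * lam0 / lam
    print(f"{lam * 1e9:.0f} nm -> focal length about {focal * 1e6:.1f} um")
```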

[0049] The dispersed light may be collected by the matrix of sensor elements of the image sensor 510.

[0050] Examples described above provide a method of manufacturing the image-capture device 500. An image sensor 510 is provided. The image sensor 510 includes at least first and second image sensor elements 520, 530. A diffractive grating lens array 540 may be aligned with the image sensor 510 so that, in use, visible light diffracted by the diffractive grating lens array 540 is focused and directed onto the first image sensor element 520 and infrared diffracted by the diffractive grating lens array 540 is focused and directed onto the second image sensor element 530.

[0051] Referring to Figure 6, there is shown schematically an example of spectra formed on an image sensor 600. A spectrum formed on the image sensor 600 is indicated by reference sign 610 in Figure 6. Rotating the dispersion axis of the diffractive grating lens array relative to the tiling axes of the image sensor may allow the sensor area of the image sensor to be used more efficiently, resulting in more information being obtained per image sensor element.

[0052] Referring to Figure 7, there is shown a graph depicting example spectral sensitivity data. The example spectral sensitivity data includes a first curve 710 comparing sensitivity to wavelength for a CMOS sensor, a second curve 720 comparing sensitivity to wavelength for a CCD sensor and a third curve 730 comparing sensitivity to wavelength for the human eye.

[0053] CMOS and CCD sensors have sensitivity beyond the visible range. Electromagnetic radiation beyond the visible range may be filtered out in some commercially available cameras using an infrared-cut filter. In some examples, the image-capture device described herein does not include an infrared-cut filter. In some examples, the image-capture device described herein includes one or more infrared filters to allow infrared to pass through the one or more filters. This may allow the image sensor to collect data relating to wavelengths in the infrared part of the electromagnetic spectrum which may be useful in the context of inspecting an additive manufacturing process for example.

[0054] Although example CMOS and CCD sensitivity curves are depicted in Figure 7, some non-silicon based CCD sensors have sensitivities beyond near-infrared. For example, they may have sensitivities up to around 2.5 μm or 3 μm. An example of a non-silicon based CCD sensor that may exhibit sensitivities in this region is an indium gallium arsenide (InGaAs) sensor.

[0055] Referring to Figure 8, there is shown schematically an example of an additive manufacturing system 800. Additive manufacturing is commonly referred to as 3D printing. In this example, the additive manufacturing system 800 includes an image-capture device 810 and additive manufacturing equipment 820. The additive manufacturing equipment 820 may be used to produce one or more three-dimensional objects. The additive manufacturing equipment 820 is commonly referred to as a 3D printer. The image-capture device 810 may serve as an additive manufacturing inspection camera to inspect the production of the one or more three-dimensional objects by the additive manufacturing equipment 820.

[0056] Some commercially available additive manufacturing systems have no built-in imaging mechanisms. Having such imaging mechanisms may facilitate capturing of different parameters associated with the additive manufacturing process. Examples of such parameters include, but are not limited to, geometry of all or part of the object being manufactured, colour of all or part of the object being manufactured and heat associated with all or part of the manufacturing process.

[0057] Capturing geometric information may benefit from using high-resolution imaging techniques in order to capture deformations. Such deformations may be captured, extracted and compensated, for example during the manufacturing process. Capturing colour information may benefit from using high-accuracy imaging techniques in order to calibrate colour output. Colour information may be analysed using a colorimeter or spectrophotometer, for example. Capturing heat information may involve using sufficiently sensitive imaging techniques to determine the heat-uniformity of the fusing process.

[0058] Referring to Figure 9, there is shown schematically an example of a filter layout pattern 900. The filter layout pattern 900 may be repeated over an image sensor. Example peak wavelengths in nm of light diffracted onto individual sensor elements of the image sensor are shown in the filter layout pattern 900 in Figure 9. Two elements of the filter layout pattern corresponding to respective individual sensor elements of the image sensor are indicated by reference signs 910, 920 in Figure 9.

[0059] In this example, the filter is configured so that visible light having a peak wavelength of 420 nm is diffracted and focused onto the first sensor element 910, infrared having a peak wavelength of 795 nm is diffracted and focused onto the second sensor element 920, visible light having a peak wavelength of 495 nm is diffracted and focused onto a third sensor element, infrared having a peak wavelength of 870 nm is diffracted and focused onto a fourth sensor element, visible light having a peak wavelength of 570 nm is diffracted and focused onto a fifth sensor element, infrared having a peak wavelength of 945 nm is diffracted and focused onto a sixth sensor element, visible light having a peak wavelength of 645 nm is diffracted and focused onto a seventh sensor element, infrared having a peak wavelength of 1020 nm is diffracted and focused onto an eighth sensor element and visible light having a peak wavelength of 720 nm is diffracted and focused onto a ninth sensor element.
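
For concreteness, the sketch below encodes the nine peak wavelengths just listed as a repeating 3 x 3 cell and tiles it over an assumed sensor size. The spatial arrangement within the cell and the sensor dimensions are illustrative assumptions; the actual layout is the one shown in Figure 9.

```python
import numpy as np

# Sketch: a repeating 3 x 3 mosaic of filter peak wavelengths (nm), tiled over
# an assumed 3000 x 3000 element sensor. The placement within the cell is an
# illustrative assumption; Figure 9 defines the actual layout.
cell = np.array([[420, 795, 495],
                 [870, 570, 945],
                 [645, 1020, 720]])
peak_per_element = np.tile(cell, (1000, 1000))   # peak wavelength seen by each element
is_infrared = peak_per_element >= 795            # four of the nine bands are infrared
```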

[0060] In this example, a diffractive grating array is to focus visible light on a first set of non-adjacent image sensor elements in the array of image sensor elements, namely the first, third, fifth, seventh and ninth image sensor elements. In this example, a diffractive grating array is to focus infrared on a second set of non-adjacent image sensor elements in the array of image sensor elements, namely the second, fourth, sixth and eighth image sensor elements.

[0061] In this example, the diffractive grating array is to focus visible light associated with a first wavelength on a first image sensor element of the image sensor. For example, the diffractive grating array is to focus visible light associated with a wavelength of 420 nm onto the first sensor element 910. In this example, the diffractive grating array is to focus infrared associated with a second wavelength on a second image sensor element of the image sensor. For example, the diffractive grating array is to focus infrared associated with a wavelength of 795 nm onto the second sensor element 920.

[0062] In this example, the diffractive grating array is to focus electromagnetic radiation associated with at least one further wavelength on at least one further image sensor element of the image sensor. The at least one further wavelength may include a wavelength associated with visible light. In this specific example, the diffractive grating array is to focus visible light associated with a wavelength of 495 nm onto the third sensor element, visible light associated with a wavelength of 570 nm onto the fifth sensor element, visible light associated with a wavelength of 645 nm onto the seventh sensor element and visible light associated with a wavelength of 720 nm onto the ninth sensor element. The at least one further wavelength may include a wavelength associated with infrared. In this specific example, the diffractive grating array is to focus infrared associated with a wavelength of 870 nm onto the fourth sensor element, infrared associated with a wavelength of 945 nm onto the sixth sensor element and infrared associated with a wavelength of 1020 nm onto the eighth sensor element.

[0063] The filter may be designed such that accurate colorimetry and infrared information may be obtained at high spatial resolution. In this example, a 3 x 3 filter design is used such that four of the image sensor elements in a 3 x 3 square of image sensor elements are used to capture infrared information, while the remaining five image sensor elements are used for colorimetry. In this example, the peak sensitivities of the filters used to capture infrared are selected from the range of 800 nm to 1200 nm. In this example, the peak sensitivities of the filters used to capture visible light are selected from the range of 400 nm to 700 nm. This filter arrangement allows hyperspectral sampling or imaging by collecting information from across the electromagnetic spectrum. In this specific example, nine different digital readings may be obtained covering an entire spectral range of interest and allowing for an estimation of the full visible and infrared spectrum.
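
One way to understand the estimation of the full visible and infrared spectrum from nine readings is as a regularised least-squares inversion of the filter responses. The sketch below is an illustrative reconstruction under assumed Gaussian filter shapes and an assumed regularisation weight; it is not the method claimed in this disclosure.

```python
import numpy as np

# Sketch: recover a coarse 400-1200 nm spectrum from nine band readings by
# Tikhonov-regularised least squares. Filter shapes and weights are assumptions.
wavelengths_nm = np.arange(400, 1201, 10)
peaks_nm = np.array([420, 495, 570, 645, 720, 795, 870, 945, 1020])
sigma_nm = 20.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))        # 20 nm FWHM -> sigma

# 9 x N system matrix: response of each filter at each wavelength sample.
F = np.exp(-0.5 * ((wavelengths_nm[None, :] - peaks_nm[:, None]) / sigma_nm) ** 2)

readings = np.random.rand(9)                                # placeholder sensor readings
weight = 1e-2                                               # assumed regularisation weight
spectrum = np.linalg.solve(F.T @ F + weight * np.eye(wavelengths_nm.size),
                           F.T @ readings)
```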

[0064] The above-described example 3 x 3 filter design may be varied. For example, the number of image sensor element configurations included within the repeating pattern may be different from nine. A higher number of image sensor element configurations may allow for more granularity in spectral band terms. The arrangement of the filters within the repeating pattern may also be different from that described above, for example to reduce cross-talk effects. Further, an arrangement may be provided in which at least some image sensor element configurations change between different repetitions of the filter pattern.

[0065] The above-described example 3 x 3 filter design may provide full scene colorimetry and a heat-map from a single capture. It may also provide potential for a visible and infrared spectral reading, for example a scene described at individual image sensor elements by a uniform spectral sampling from 400 nm to 1200 nm.

[0066] The number of filters may impact the spatial resolution. For example, a Bayer pattern camera which relies on interpolation has a lower effective resolution than the number of its pixels. Using three spectral bands, for example, may maintain higher resolution than nine or sixteen bands.

[0067] The FWHM of the filter associated with each image sensor element may impact the signal-to-noise ratio of the filter design. The more narrowband the filters, the more care may be needed in relation to noise and/or the higher the dependence on the intensity of the incoming electromagnetic radiation.

[0068] Referring to Figure 10, there is shown a graph 1000 depicting example data relating to filter sensitivity. In this example, the example sensitivity data corresponds to sample peak sensitivities of the nine filters corresponding to the peak wavelengths shown in Figure 9 at an FWHM of 20 nm. This provides for capture of both colorimetry and infrared information.

[0069] In this example, a first curve 1010 has a peak of 420 nm and corresponds to the filter allowing visible light having a peak wavelength of 420 nm to be diffracted and focused onto the first sensor element. In this example, a second curve 1020 has a peak of 495 nm and corresponds to the filter allowing visible light having a peak wavelength of 495 nm to be diffracted and focused onto the third sensor element. In this example, a third curve 1030 has a peak of 570 nm and corresponds to the filter allowing visible light having a peak wavelength of 570 nm to be diffracted and focused onto the fifth sensor element. In this example, a fourth curve 1040 has a peak of 645 nm and corresponds to the filter allowing visible light having a peak wavelength of 645 nm to be diffracted and focused onto the seventh sensor element. In this example, a fifth curve 1050 has a peak of 720 nm and corresponds to the filter allowing visible light having a peak wavelength of 720 nm to be diffracted and focused onto the ninth sensor element. In this example, a sixth curve 1060 has a peak of 795 nm and corresponds to the filter allowing infrared having a peak wavelength of 795 nm to be diffracted and focused onto the second sensor element. In this example, a seventh curve 1070 has a peak of 870 nm and corresponds to the filter allowing infrared having a peak wavelength of 870 nm to be diffracted and focused onto the fourth sensor element. In this example, an eighth curve 1080 has a peak of 945 nm and corresponds to the filter allowing infrared having a peak wavelength of 945 nm to be diffracted and focused onto the sixth sensor element. In this example, a ninth curve 1090 has a peak of 1020 nm and corresponds to the filter allowing infrared having a peak wavelength of 1020 nm to be diffracted and focused onto the eighth sensor element.
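
The curve shapes in Figure 10 can be approximated by Gaussians centred at the nine peak wavelengths with a 20 nm FWHM. The sketch below only reproduces two such curves to illustrate the FWHM-to-sigma conversion and to check how little adjacent bands overlap; it does not model the real filters.

```python
import numpy as np

# Sketch: Gaussian approximation of two adjacent sensitivity curves (420 nm and
# 495 nm peaks, 20 nm FWHM) to gauge the overlap between neighbouring bands.
sigma_nm = 20.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))    # FWHM -> standard deviation
wavelengths_nm = np.arange(380, 1101, 1.0)
band_420 = np.exp(-0.5 * ((wavelengths_nm - 420.0) / sigma_nm) ** 2)
band_495 = np.exp(-0.5 * ((wavelengths_nm - 495.0) / sigma_nm) ** 2)

overlap = np.sum(band_420 * band_495) / np.sum(band_420 * band_420)
print(f"Relative overlap between the 420 nm and 495 nm bands: {overlap:.2e}")
```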

[0070] Certain examples target infrared of a particular type. Certain examples target shortwave near-infrared radiation.

[0071] In terms of signal-to-noise ratios, an image-capture device with narrowband filters may be beneficial where fusing lamps used in the additive manufacturing process produce very high-intensity outputs. For example, they may have powers in the range 300 Watts (W) to 750 W. This may mean the amount of energy to which an image sensor element is exposed is very high compared to normal viewing conditions. Under some conditions, this may involve filtering to protect the image sensor. However, where narrowband filters are used, such additional filtering may not be used as the filter array itself may be highly spectrally-selective and may serve to protect the image sensor itself.

[0072] Referring to Figure 11, there is shown schematically an example of an additive manufacturing process 1100 using examples of the present technology. These examples concern parts of an additive manufacturing cycle, covering infrared and/or heat applications and visible light applications. The infrared and/or heat applications may relate to intensity and/or spectrum. The visible light applications may relate to geometry and/or colorimetry applications.

[0073] Items 1110, 1120 and 1130 relate to a process performed during the additive manufacturing cycle. Item 1110 corresponds to deposition of a powder layer. Item 1120 corresponds to application of marking by agents. Item 1130 corresponds to application of a fusing mechanism.

[0074] Items 1140, 1150 and 1160 relate to activity at an example image-capture device and/or analysis of data captured by an example image-capture device. Item 1140 corresponds to capturing visible and infrared image data. The captured data may be analysed for heat uniformity and/or geometric uniformity and/or for powder uniformity. Item 1150 corresponds to capturing visible image data and analysing the captured data for agent deposition accuracy (vs. digital input) and capturing infrared image data for assessing an impact on heat uniformity. Item 1160 corresponds to capturing infrared image data for fusing uniformity and/or heat uniformity and to capturing visible image data for colorimetry, for example as a proxy for mechanical properties, and geometry.

[0075] Items 1170, 1180 and 1190 relate to triggers and/or actions. Item 1170 may correspond to triggering recoating, applying additional heat to one or more layers, extending a waiting time to allow for cooling, modifying one or more agents to be deposited, initiating lamp heat spectrum feedback etc. Item 1180 may correspond to applying additional agent amounts, modifying subsequent pass parameters for a current layer, extending a waiting time for cooling, informing one or more entities of a fusing stage, for example in terms of time and/or intensity etc. Item 1190 may correspond to modifying one or more print parameters for subsequent layers, for example speed and/or lamp intensity, modifying geometry and/or halftone for one or more subsequent layers, adjusting and/or calibrating a colour mapping etc.

[0076] With additive manufacturing techniques using multiple materials to obtain multiple object properties, it may be desirable to improve measurement and sensing both on-line and off-line. For example, colour additive manufacturing equipment may print using coloured agents and colorimetric measurement may be desirable for process control such as calibration and profiling. Such measurement differs, however, from 2D printing as the objects to be measured may not be flat surfaces and may not practically be measured by means of direct contact. For additive manufacturing systems that operate by means of applying heat, it may be desirable to analyse the uniformity of the infrared spectrum over the object as it determines both the goodness of fusing (an absolute measure) and uniformity (a relative measure). Furthermore, the ability to capture a 2D field of view of an object using the image-capture device described herein may allow for measurements to be used for dynamic in-line colour, geometric and fusing adjustment. Subsequent printed layers may get adjusted based on how the current layer corresponds to the digital content being sent to print-heads, for example, and the ideal state of fusing and colour. Adjustments may include reducing or increasing the agent coverage over a layer. For example, low coverage may mean poor fusing and high coverage may mean geometric distortion due to excessive cooling. Examples described herein may provide a high-resolution 2D field-of-view capture that results in both colorimetry and spectral data as well as fusing and/or heat-uniformity data from infrared capture.

[0077] Some examples described above may improve image-capture by providing flexibility in trading off spectral resolution and spatial resolution and the ability to design and manufacture filters to work with an existing pixel grid of a CMOS or CCD photosensitive array.

[0078] Some examples described above may improve image-capture by providing an image-capture device that may have both high spatial resolution and high spectral resolution with a usability like that of commercially available consumer cameras. This may be distinguished from other multispectral devices based on scanning or interferometry or filter wheels with exposure times orders of magnitude higher than commercially available consumer cameras.

[0079] Some examples described above may improve image-capture by providing an image-capture device that may capture both visible and near-infrared signals using the same mechanism from complex geometries with a single physical device.

[0080] With an increase in materials and/or properties in additive manufacturing, on-line mechanisms for geometric, heat and/or property feedback about the printing progress may be desirable. Facilitating capture of the information to be used by such a mechanism in one processing operation by a single image sensor may be desirable. Further, being able to design image-capture devices that can be parameterised for various levels of precision in relation to visible and/or infrared light may be desirable. Further, being able to allow colorimetry and/or spectral data to be captured in this way may be desirable.

[0081] As an alternative to providing a joint visible and infrared image-capture device, custom solutions may be used for colour and infrared (IR) in isolation, with each having additional shortcomings. RGB cameras, which may be used to obtain colour information, are not colorimeters and may not result in accurate colorimetries that can be used for color profiling. They may however be high-resolution. Infrared (or 'heat') cameras may be lower resolution. The dynamic ranges of colour and infrared capture devices may differ from each other and combining them may involve significant and non-trivial image and signal processing.

[0082] Further, some commercially available multispectral camera designs may not be suitable in some implementations. For example, filter wheels or scan devices may lead to excessive exposure times and interferometry-based devices may lead to low spatial resolution.

[0083] Capturing using a point-measurement device, as may be used in 2D printing for color calibration, may be a significantly more complex alternative owing to the need to capture colour together with geometry. For example, multiple angles of a surface of a given material may be needed to capture the correct colour. The techniques described herein may also be beneficial in relation to objects that have no flat surfaces where, for example, the use of a spectrophotometer is not possible.

[0084] Examples described above provide a device comprising an imager comprising at least first and second imager elements, and a diffractive filter array to diffract visible light onto at least part of the first imager element and to diffract infrared onto at least part of the second imager element.

[0085] The preceding description has been presented to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.

Claims

1. An image-capture device, comprising:
an image sensor comprising an array of image sensor elements including at least first and second image sensor elements; and
a diffractive grating array to diffract electromagnetic radiation in a first wavelength range onto the first image sensor element and to diffract electromagnetic radiation in a second, different wavelength range onto the second image sensor element.
2. The image-capture device of claim 1, wherein the first wavelength range is in the visible part of the electromagnetic spectrum and the second, different wavelength range is in the infrared part of the electromagnetic spectrum.
3. The image-capture device of claim 1, wherein the diffractive grating array includes grating of a binary transmission modulation type.
4. The image-capture device of claim 1, wherein the diffractive grating array includes grating of a binary thickness modulation type.
5. The image-capture device of claim 1, wherein the diffractive grating array includes grating of a multilevel thickness modulation type.
6. The image-capture device of claim 1, wherein the diffractive grating array includes grating of a subwavelength resonator type.
7. The image-capture device of claim 1, wherein the diffractive grating array is to focus visible light on a first set of non-adjacent image sensor elements in the array of image sensor elements and to focus infrared on a second set of non-adjacent image sensor elements in the array of image sensor elements, the first set of non-adjacent image sensor elements including the first image sensor element and the second set of non-adjacent image sensor elements including the second image sensor element.
8. The image-capture device of claim 1, the diffractive grating array being to focus:
visible light associated with a first wavelength on the first image sensor element of the image sensor;
infrared associated with a second wavelength on the second image sensor element of the image sensor; and
electromagnetic radiation associated with at least one further wavelength different from the first and second wavelengths on at least one further image sensor element of the image sensor different from the first and second image sensor elements.
9. The image-capture device of claim 8, wherein the at least one further wavelength includes a wavelength associated with visible light.
10. The image-capture device of claim 8, wherein the at least one further wavelength includes a wavelength associated with infrared.
11. The image-capture device of claim 1, comprising an interpolator to interpolate data obtained from the image sensor to generate first interpolated image data associated with visible light and second interpolated image data associated with infrared.
12. The image-capture device of claim 1, comprising at least one collimating lens to provide collimated electromagnetic radiation to the diffractive grating array.
13. The image-capture device of claim 1, wherein the image sensor is a complementary metal-oxide-semiconductor, CMOS, or a charge-coupled device, CCD.
14. An image-capture device, comprising:
an image sensor, the image sensor comprising an array of image sensor elements;
a diffractive grating lens array to diffract visible light onto some of the image sensor elements and to diffract infrared onto at least some others of the image sensor elements;
a collimating lens to provide collimated electromagnetic radiation to the diffractive grating lens array; and
an interpolator to interpolate data obtained from the image sensor to generate first interpolated image data associated with visible light and second interpolated image data associated with infrared.
15. A method of manufacturing an image-capture device, the method comprising:
providing an image sensor including at least first and second image sensor elements; and
aligning a diffractive grating lens array with the image sensor so that, in use, visible light diffracted by the diffractive grating lens array is focused and directed onto the first image sensor element and infrared diffracted by the diffractive grating lens array is focused and directed onto the second image sensor element.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2016/029762 WO2017188956A1 (en) 2016-04-28 2016-04-28 Image-capture devices having a diffractive grating array


Publications (1)

Publication Number Publication Date
WO2017188956A1

Family ID: 60159899

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6035089A (en) * 1997-06-11 2000-03-07 Lockheed Martin Energy Research Corporation Integrated narrowband optical filter based on embedded subwavelength resonant grating structures
JP2001186405A (en) * 1999-12-27 2001-07-06 Canon Inc Image pickup device and focusing method for the image pickup device
US20080283729A1 (en) * 2007-05-15 2008-11-20 Hajime Hosaka Apparatus and Method for Processing Video Signal, Imaging Device and Computer Program
US20120057235A1 (en) * 2010-09-03 2012-03-08 Massachusetts Institute Of Technology Method for Antireflection in Binary and Multi-Level Diffractive Elements
WO2013184556A1 (en) * 2012-06-05 2013-12-12 President And Fellows Of Harvard College Ultra-thin optical coatings and devices and methods of using ultra-thin optical coatings
US20160084708A1 (en) * 2013-04-10 2016-03-24 Bae Systems Plc Spectral imaging
