WO2023022964A1 - Multispectral and lidar detector using light field optics, and systems using same - Google Patents

Multispectral and lidar detector using light field optics, and systems using same

Info

Publication number
WO2023022964A1
Authority
WO
WIPO (PCT)
Prior art keywords
aggregated
imager
pixels
spectral
subpixels
Prior art date
Application number
PCT/US2022/040299
Other languages
French (fr)
Inventor
Shimon Maimon
Original Assignee
Netzvision, Llc
Priority date
Filing date
Publication date
Application filed by Netzvision, Llc filed Critical Netzvision, Llc
Publication of WO2023022964A1 publication Critical patent/WO2023022964A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer

Definitions

  • the system operates in two principal modes, namely a Multi-Spectral Mode (MSM) and a LIDAR mode (LM), where during the spectral mode the photodetector array integrates photocurrent from the subpixels, and during the LIDAR mode, the aggregated pixels sense the arrival of laser light from the laser-illuminated image scene, providing distance related information.
  • although some processing and fusing may still be utilized, such processing is significantly lighter than in common systems.
  • the time of flight is converted to range information and light data from corresponding subpixels in each of the plurality of aggregated pixels is collected to form individual spectral images.
  • range information of the LIDAR may require processing to form the three-dimensional map of the scene.
  • the fusing requires minimal processing.
  • the imager is capable of sensing laser light in the wavelength range between 0.9 μm and 2.5 μm, a range known as Short Wave Infra-Red (SWIR).
  • any of the spectral filters’ bandpass may be in the SWIR range and/or in the visual range.
  • Indium Gallium Arsenide (InGaAs) based infrared photodetectors are known in the art and the use of one in the imager is explicitly considered.
  • Generally utilizing any infrared photodetector in the imager allows sensing of spectral ranges which may include SWIR and/or MWIR spectral bands.
  • the LIDAR may or may not operate in the SWIR or MWIR range.
  • the LIDAR portion of the imager may be modified to utilize SWIR, MWIR, IR, and even visual wavelength range to meet design requirements by modifying the characteristics of the respective filter(s), the detector, and the laser light source.
  • Fig. 1 depicts a simplified top view of an exemplary aggregated pixel.
  • Fig. 2 depicts a transmission graph of an exemplary filter.
  • Fig. 3 depicts a simplified schematic of a multispectral light field camera similar to the camera of the US 7,433,042 Patent to Cavanaugh.
  • Fig. 4 depicts a simplified block diagram of the principal components of an exemplary combined multispectral and flash LIDAR imager, according to one embodiment.
  • Fig. 5 depicts a simplified block diagram of an optional embodiment of readout processing circuitry.
  • Fig. 6 depicts a simplified block diagram of the unit cell circuitry of a single subpixel.
  • Fig. 7 depicts a simplified block diagram of a readout circuit for one aggregated pixel.
  • Fig. 1 depicts a simplified top view of an exemplary aggregated pixel 310.
  • the aggregated pixel is a portion of photodetector 115 and is divided into subpixels 310₁, 310₂, … 310ₙ, where every subpixel forms a portion of a subpixel unit cell, and a group of subpixels that share light from a single microlens forms an aggregated pixel 310.
  • the aggregate pixel unit cell is formed of the subpixel unit cells, as well as circuitry specific to the aggregated pixel operation.
  • the aggregated pixel cell unit comprises mode switching circuitry, a time counter and/or other distance sensing related circuitry, and optionally a subpixel coalescing circuit.
  • the aggregated pixel cell unit circuitry may reside in the inter-cell area between subpixel cell units, which appear as the black portions between the subpixel unit cells in Fig. 1.
  • the aggregated pixel unit cell utilizes coalescing circuitry (S1, S2, … Sₙ, 530) to coalesce output from a plurality of subpixels to drive the distance sensing circuitry. Coalescing subpixels in this manner increases the amplitude of sensed laser light, and therefore the sensitivity of laser light detection. It is noted that the figures depict conceptual schematics, and additional circuitry may be required to improve operational characteristics.
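  • As a rough numerical illustration of why coalescing helps, the Python sketch below sums illustrative subpixel outputs of one aggregated pixel; the subpixel count, signal level, and noise level are assumptions, not values from the disclosure. With uncorrelated noise the summed signal grows n-fold but the noise only √n-fold, which is the sensitivity gain the coalescing circuitry exploits:

```python
import numpy as np

rng = np.random.default_rng(0)

n_sub = 16            # subpixels coalesced in one aggregated pixel (e.g., 4x4)
signal_per_sub = 1.0  # assumed laser photocurrent per subpixel, arbitrary units
noise_sigma = 2.0     # assumed uncorrelated per-subpixel noise, arbitrary units

# Closing S1..Sn onto a common node sums the subpixel photocurrents.
coalesced = (signal_per_sub + rng.normal(0.0, noise_sigma, n_sub)).sum()

snr_single = signal_per_sub / noise_sigma
snr_coalesced = n_sub * signal_per_sub / (noise_sigma * np.sqrt(n_sub))
print(coalesced, snr_single, snr_coalesced)  # SNR gain of sqrt(16) = 4x
```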
  • a photodetector array 115 and a ROIC 125 may be coupled by flip chip technology, where an array of pixels in the photodetector array is connected onto an array of unit cell circuitry using Indium bumps.
  • the term subpixel generally relates to a single sensing pixel element, and/or the combination of the sensing pixel element and the readout circuitry associated therewith, or stated equivalently, the pixel unit cell.
  • the ROIC 125 provides integration of photocurrent from each subpixel over an integration period.
  • Corresponding subpixels of all the aggregated pixels provide a spectral image reflective of light which passed through a single spectral filter, thus forming a single spectral image.
  • light that passes a single spectral filter falls on corresponding subpixels in each aggregated pixel of the plurality of aggregated pixels, so that sensing the magnitude of light from subpixels having similar coordinates within the respective aggregated pixel forms an image comprising light that was filtered by a single spectral filter.
  • subpixel X₂Y₂ of aggregated pixel₁, aggregated pixel₂, aggregated pixel₃, and so forth to aggregated pixelₙ are all collected to form a single spectral image.
  • the ROIC integrates all the subpixels of an aggregated pixel to increase the detected laser light intensity.
  • some, a majority, or all of the spectral filters in the spectral filter array 110 have at least dual bandpass characteristics.
  • One band would pass the illumination laser wavelength band, and the other band(s) would pass a specific portion of the spectrum as determined by the design requirements of the imager.
  • one or more of the filters may not pass the laser band, and in certain embodiments at least one of the filters has one bandpass which transfers the laser wavelength.
  • Fig. 2 depicts a transmission graph of an exemplary filter. Assuming the LIDAR laser utilized to illuminate the scene operates at 1.550 μm, that the bandpass of interest of the exemplary filter is between 1.2 and 1.3 μm, and that the vertical Y axis represents transmittance of light while the horizontal X axis represents wavelength, band 21 represents the laser light band while band 22 represents the filter-specific bandpass of 1.2–1.3 μm.
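  • A minimal sketch of such a dual-bandpass characteristic, modeled as an idealized top-hat transmittance: the 1.550 μm laser line and the 1.2–1.3 μm filter-specific band come from the example above, while the assumed laser band width and the unity peak transmission are illustrative:

```python
def dual_band_transmittance(wavelength_um: float) -> float:
    """Idealized model of the Fig. 2 filter: band 21 passes the laser line,
    band 22 passes 1.2-1.3 um. Real filters have sloped edges and <1 peak."""
    in_laser_band = 1.545 <= wavelength_um <= 1.555  # assumed band width
    in_filter_band = 1.2 <= wavelength_um <= 1.3
    return 1.0 if (in_laser_band or in_filter_band) else 0.0

for wl_um in (1.25, 1.55, 1.40):
    print(wl_um, dual_band_transmittance(wl_um))  # 1.0, 1.0, 0.0
```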
  • Various filters may be utilized in the spectral filter array, and individual filters may have other bandpass(es) of interest, which in certain embodiments may include the visual band or portions thereof.
  • polarized filters may be used as elements of the spectral filter array in any combination.
  • filters may utilize similar or different bandpasses, at different polarizations. Further, optionally at least one of the filters may be limited only to a bandpass containing the laser light wavelength. However, to provide the desired spectral separation at least one of the filters should transfer a different bandwidth and/or polarization than a second filter. It is noted that while polarization is not strictly wavelength dependent, for the purpose of this specification a polarizing filter may be considered spectrally different than a non-polarizing filter, and filters having a different polarization geometry may be considered spectrally different even if their bandpass characteristics are similar.
  • a controllable filter 150 is disposed in the light path extending between the image scene and the photodetector; the controllable filter controllably transmits light in a band containing the wavelength of the laser light while reducing or blocking transfer of light in at least one different wavelength band, or transmits a broader spectrum than the laser wavelength, which may or may not include the laser wavelength.
  • the controllable filter reduces received light in any wavelength differing from the laser wavelength while the imager operates in LIDAR mode. Such reducing or blocking of light differing from the laser light is advantageous in LIDAR mode as it reduces detected noise, and facilitates precise determination of the time a laser pulse reflected from the image scene is detected, providing more precise distance information.
  • a controllable filter may be implemented by various methods, such as a rotating filter, a liquid crystal based filter, or various light path modifiers, such as prisms, mirrors, and the like placed in the light path such that incoming light is steered through a filter with a bandpass at the laser wavelength.
  • a plurality of controllable filters may be utilized.
  • Fig. 3 depicts a simplified diagram of optical paths associated with one aggregated pixel in a multispectral camera similar to one described in the ‘042 Patent to Cavanaugh.
  • the example describes in particular an aggregated pixel 310 formed by conceptual grouping of three subpixels 310₁, 310₂, 310₃.
  • an aggregated pixel may comprise any desired plurality of subpixels.
  • Light from any given region in the image scene arrives at objective lens 105 and various spectral portions thereof pass through the spectral filter array 110, which in the drawing is disposed within the objective lens. It is noted that the filter array may be in the lens or in its vicinity.
  • the light from the region is focused onto a region-respective microlens from the microlens array 120.
  • the respective microlens projects the light onto respective subpixels 310₁, 310₂, … 310ₙ in the photodetector array 115.
  • Depicted subpixels 310₁, 310₂, 310₃ are conceptually grouped to form the three-subpixel aggregated pixel 310.
  • Digital processing is performed to form each spectral image 320 by combining the outputs captured by corresponding subpixels from the plurality of the aggregated pixels.
  • the microlens array 120 is constructed and disposed such that light arriving thereto from the objective lens 105 is projected onto the photodetector 115 as substantially parallel light rays.
  • Fig. 4 depicts a simplified diagram of certain components in an exemplary combined LIDAR and multispectral imager forming a camera.
  • a laser source 100 illuminates the image scene, depicted schematically as an object.
  • the laser source may be a flash laser that illuminates the whole scene or large portions thereof, or a scanning laser.
  • An optional laser lens 135 may also be provided to provide any desired directional and/or polarization character to the laser light as it is discharged from the camera.
  • a laser sensor 130 detects the operation of the laser, commonly known as ‘firing’.
  • Light reflected from the image scene, including laser light, is collected by objective lens 105.
  • the depicted embodiment incorporates the spectral filter array 110 in the objective lens; however, the filter array may be located adjacent to the lens.
  • the objective lens focuses light from the image scene onto the microlens array 120; the depicted drawing shows light from a specific region in the image scene being directed to a microlens which corresponds to the image scene region, and various microlenses in the microlens array correspond to respective regions in the image scene.
  • Photodetector 115 comprises a plurality of individually readable pixels, grouped into a plurality of aggregated pixels. Each microlens corresponds to an aggregated pixel and projects the light rays focused thereupon onto pixels of the photodetector 115 which act as subpixels of individual aggregated pixels, where subpixels which receive the light from one microlens of the microlens array are considered subpixels of an aggregated pixel.
  • the microlens array is constructed such that the light rays leaving each microlens are substantially parallel to each other.
  • the group of subpixels which receives light from one microlens forms one aggregated pixel.
  • each of the subpixels in an aggregated pixel corresponds to rays passed by a respective one of the plurality of spectral filters in the spectral filter array 110; however, in some embodiments more than one subpixel is associated with a single spectral filter.
  • ReadOut Integrated Circuit (ROIC) 125 receives the output of the subpixels of the photodetector and communicates the information to Readout Processing Circuitry 140.
  • Readout circuitry is colloquially known as a Read Out Integrated Circuit (ROIC); however, it is pointed out that in other embodiments such circuitry need not reside in a single integrated circuit.
  • the Readout Processing Circuitry (RPC) 140 controls numerous operations of the imager.
  • a simplified block diagram of one optional embodiment of such processing circuitry is schematically depicted in Fig. 5.
  • the RPC primarily performs tasks of control and coordination of various portions of the imager and the laser portion of the device, and optionally processes timing related information received from individual aggregated pixels into a three-dimensional map, as well as forming the individual spectral images.
  • the RPC further communicates information to external devices such as vehicle-based control devices, remote storage devices, and the like, as required by the environment in which the device operates.
  • the mode switching 710 portion of the RPC switches the mode of operation of the imager between LIDAR mode (LM) and multispectral mode (MSM).
  • LM LIDAR mode
  • MSM multispectral mode
  • a spectral control portion 715 of the RPC controls operation and reading of information from unit cells of individual subpixels, and resets the respective subpixel for the next reading cycle.
  • the spectral imaging 720 portion of the RPC composes the information from corresponding unit cells of subpixels in the aggregated pixels into respective spectral images, forming a plurality of spectral images, each reflecting light that passed through one spectral filter.
  • a LIDAR control portion 725 of the RPC controls operation of the system in LIDAR mode, including reading timing information from individual aggregated pixels, to establish range information.
  • Three- dimensional mapping portion 730 utilizes timing data obtained from individual aggregated pixels to form the three-dimensional map of the image scene.
  • RPC controller portion 740 controls the operation of the system as a whole, such as duty cycle between the different modes, communications, and other processing required for the operation of the system. It is noted that any and all of the RPC portions may be embodied in hardware, in software executed on one or more processors, or in any combination of hardware and software. It is further noted that the various portions described above represent merely an arbitrary division and that certain functions may be shared between various portions of the RPC, or be delegated to other circuitry and/or controllers, as a matter of implementation choice. The various portions may be implemented by one or more electronic processors. Common parts such as power supplies, switches, and the like may not be depicted for simplicity. Further, while the RPC 140 is depicted outside of the camera, the placement of the RPC is a matter of implementation choice, and the RPC may be installed within or outside the same enclosure as the imager and/or the laser. It is further noted that the imager 2, the laser portion, and the support circuitry may also reside in a single enclosure or in differing enclosures, in any combination.
  • A simplified block diagram of a single subpixel unit cell is depicted in Fig. 6, where photocurrent from a single subpixel 310ₙ in the photodetector 115 is collected in integration capacitor 605. Responsive to a read command received from the spectral control portion 715, the charge in the integration capacitor is either presented directly to the output port OP, or optionally sampled by sample and hold circuit 620 for later transfer to the output port OP. The integrated charge information is presented by the readout section 610 for reading by the spectral imaging portion 720 of the RPC 140. Once sampled, the integration capacitor 605 may be discharged by reset circuit 615 and start integrating photocurrent for the next reading cycle.
  • the readout section 610 may provide an analog or digital representation of the charge integrated by the integration capacitor 605.
  • a unit cell controller 600 controls and coordinates the operation of the unit cell circuitry while in other embodiments the control is directed from outside the unit cell to respective portions such as the reset, sample and hold, and the like.
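  • The integrate/read/reset cycle of the Fig. 6 unit cell can be sketched behaviorally as follows; the class and method names are illustrative, and charge is in arbitrary units:

```python
class SubpixelUnitCell:
    """Behavioral sketch of the Fig. 6 unit cell: integration capacitor 605,
    readout/transfer stage 610, and reset circuit 615 (values illustrative)."""

    def __init__(self) -> None:
        self.charge = 0.0  # charge held on integration capacitor 605

    def integrate(self, photocurrent: float, dt_s: float) -> None:
        # Photocurrent from the subpixel accumulates on the capacitor.
        self.charge += photocurrent * dt_s

    def read(self) -> float:
        # Sample-and-hold 620, then present the value at output port OP.
        return self.charge

    def reset(self) -> None:
        # Reset circuit 615 discharges the capacitor for the next cycle.
        self.charge = 0.0

cell = SubpixelUnitCell()
cell.integrate(photocurrent=2.0, dt_s=0.01)  # one integration period
print(cell.read())                           # 0.02
cell.reset()
```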
  • A simplified block diagram of an exemplary embodiment of an aggregated pixel cell unit is depicted in Fig. 7. As described elsewhere, the aggregated pixel cell unit operates in two modes, namely LM and MSM.
  • for each of the subpixels, the aggregated pixel unit cell contains corresponding changeover switching circuitry S1, S2, and S3, respectively, and subpixel unit cell circuitry UC1, UC2, and UC3, respectively.
  • photocurrent from each of the subpixels passes through the respective switch, and is integrated by the respective unit cell circuitry, as explained relative to Fig. 6.
  • the unit cell circuit presents the output at respective output ports OP1, OP2, and OP3, in response to a signal from the spectral control portion 715.
  • the RPC signals the respective unit cell circuit to reset and restart integrating the photocurrent in a new cycle.
  • the mode switching portion 710 switches the imager to MSM mode and the bus control 510 directs the output of each subpixel to its respective unit cell circuitry via the respective switch S1, S2, and S3.
  • the unit cell circuitry integrates the photocurrent of the respective subpixel until the spectral control portion 715 signals that the integration period has terminated, at which time the integration results are ready to be read.
  • the integrated photocurrent value is presented to the RPC for reading with a potential delay, until the RPC is ready to read the specific subpixel.
  • when the mode switching portion 710 switches the operating mode from MSM to LM, it asserts the mode control signal, which causes the laser controller 145 to begin arming the laser 100, and the bus control 510 to utilize switches S1, S2, and S3 to switch the output of the respective photodetector pixels 310₁, 310₂, 310₃ from the respective subpixel unit cell circuitry to the aggregated pixel unit cell circuitry.
  • the outputs of a plurality of the subpixels are coalesced and are connected in parallel by coalescing circuitry 530. It is noted that while Fig. 7 depicts a simple connection as the coalescing circuitry, various circuitries may be utilized to achieve the coalescing of the subpixel signals.
  • the laser sensor 130 senses the firing and sends a start signal to time counter 520.
  • where the LIDAR is a flash-type LIDAR, the trigger signal is delivered to the counters of all the aggregated pixels.
  • the laser light travels to the image scene and is reflected by an object therein. A portion of the reflected light is captured by the objective lens, and thereby by the aggregated pixel associated with the image scene region containing the reflecting object.
  • the threshold circuit 515 signals the counter to stop the count. Coalescing the output of a plurality of the subpixels by connecting their output in parallel, as seen in Fig. 5, increases the signal collected from the laser, and thus increases the sensitivity of the imager in LIDAR mode.
  • the outputs of a majority or all of the subpixels in the aggregated pixel are coalesced, either in parallel or in series, however, the laser pulse detection may be obtained from less than all subpixels in an aggregated pixel, and even by a single subpixel.
  • the LIDAR portion depicted in Fig. 5 utilizes Time Of Flight (TOF) to measure the distance to the respective object in the image scene which reflected the light; however, other methods of measuring distance, such as Frequency Modulation Continuous Wave (FM-CW), are known in the LIDAR art and may be utilized in various embodiments of the combined LIDAR and multispectral imager.
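  • Under the TOF assumption, the Fig. 5 timing chain reduces to a counter started by the laser-fire signal from sensor 130 and stopped when the coalesced photocurrent crosses the threshold of circuit 515; the clock period, threshold, and sample stream below are illustrative:

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_tof_s(coalesced_current, threshold, clock_period_s):
    """Return the time of flight in seconds: the count at which the
    coalesced photocurrent first crosses `threshold`, times the clock
    period; None if no echo is detected during the listening window."""
    for count, current in enumerate(coalesced_current):
        if current >= threshold:
            return count * clock_period_s
    return None

# Illustrative samples, one per 1 ns clock tick; the echo arrives at tick 5.
stream = [0.1, 0.2, 0.1, 0.3, 0.2, 4.0, 1.0]
tof = lidar_tof_s(stream, threshold=3.0, clock_period_s=1e-9)
if tof is not None:
    print(tof, C * tof / 2)  # 5 ns round trip -> ~0.75 m range
```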
  • pixels of the photodetector are grouped into aggregated pixels by reception of the light from a respective microlens, the operation of the respective portion of the ROIC, and the RPC circuitry.
  • Such pixels which form the aggregate pixel are generally referred to as subpixels.
  • Respective subpixels of aggregated pixels imply that the subpixels have similar coordinates within the respective aggregated pixels. Stated differently, by way of explanatory example, if a subpixel that resides in the first row and the second column is selected, the first row, second column subpixel of aggregated pixel 1 is the respective subpixel to the first row, second column subpixel of aggregated pixel 2, of aggregated pixel 3, and so forth.
  • Photodetector and photodetector array are equivalent.
  • Photodetector arrays are also commonly known as Focal Plane Array (FPA).
  • the number of pixels in a photodetector may vary and the description above may utilize photodetectors having any number of pixels and/or configuration which meets design requirements. Similarly, the number of subpixels forming an aggregated pixel may be varied to meet design needs.
  • spectral images and three-dimensional map relate at least to electrical representations of such images and map that are readable by a computer, and from which various physical and/or visual representations may be formed.
  • the term ‘device’ should be construed to refer to any combination of an imager and control circuitry, and may further include power supply, laser light source and control thereof, laser sensor, additional support circuitry, communications, and/or switching circuitry, as well as additional optical components such as lenses, mirrors, filters, and the like, as required by the environment and the task at hand for which the imager is employed.
  • rather than a complete camera as depicted in the figures, each of the components may be disposed separately and optionally in a plurality of enclosures.
  • An embodiment of the invention may be embedded in a larger vehicle such as an autonomous vehicle, a robot, or a manned vehicle, and utilized for driving safety, navigational, general mapping purposes, and any combination thereof, in one or more locations within the vehicle, and various components may be dispersed in different places in the vehicle.
  • the specific operating environment and the purpose for which the device is utilized may dictate various distributions of the different components.
  • the laser section may be disposed at a different location in the vehicle than the imager section.
  • an embodiment of the invention may be embodied in a personal augmented reality system, where compact packaging is highly desired, resulting in a single package being preferred.


Abstract

A combination LIDAR and multispectral imager utilizing an objective lens for receiving multispectral light from an image scene, an array of spectral filters disposed within or proximally to the objective lens, a microlens array disposed farther along the optical path than the spectral filters, and a photodetector behind the microlens array. The photodetector is conceptually divided into a plurality of aggregated pixels, each composed of a group of subpixels, and a readout circuit is capable of obtaining light magnitude information from corresponding subpixels in the aggregated pixels to generate spectral images specific to the respective spectral filters, and of obtaining timing information of laser light reflections from the image scene to form a LIDAR-based three-dimensional image of the scene. Both spectral images and range information are derived via the same objective lens; thus correlating pixels from the various spectral images with the respective range information is greatly simplified.

Description

MULTISPECTRAL AND LIDAR DETECTOR USING LIGHT FIELD OPTICS, AND SYSTEMS USING SAME
Related Applications
[0001] This application claims priority from, and incorporates by reference in its entirety as filed, United States Provisional Patent Application No. 63/233,718.
Background
[0002] US Patent No. 7,433,042 to Cavanaugh et al. discloses a hyperspectral imager that utilizes a micro-lens array (MLA) as a series of field lenses, with each lens distributing a point in the image scene received through an objective lens across an area of a pixel array forming a multispectral aggregated pixel. Spectral filtering is performed by a spectral filter array positioned at the objective lens so that each sub-pixel within an aggregated pixel receives light that has been filtered by a bandpass or other type of filter and is responsive to a different band of the image spectrum. US Patent No. 7,433,042 is incorporated herein by reference in its entirety.
[0003] Stated differently, and assuming by way of non-limiting example that every aggregated pixel comprises an array of 4 rows and 4 columns of sub-pixels, every aggregated pixel senses a point in the scene and every subpixel within the aggregated pixel detects a selected spectral band of that point in the image scene. This spectral distribution is achieved by using a cooperating 4×4 filter array located in or about the objective lens of the imager.
[0004] Continuing the non-limiting example, if a photodetector array has 1600×1600 pixels and the imager uses a 4×4 aggregated pixel structure as described above, then the aggregated pixel photodetector array size is 400×400. Utilizing this technique, every frame of the 1600×1600 photodetector array will provide 16 separate spectral images of the scene, each having a 400×400 pixel resolution, and each spectral image corresponds to a spectral band determined by the associated filter element in the filter array. As each aggregated pixel images a point in the image scene, all the spectral images are fully correlated, which means that a pixel having coordinates x,y in one spectral image senses substantially the same point in the image scene as pixel x,y in any other spectral image produced from the same frame; however, each of the spectral images senses a different spectral range.
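The demultiplexing arithmetic of this example can be made concrete with a short NumPy sketch. The frame contents are random placeholders, and mapping subpixel position (i, j) to filter index i*4 + j is an illustrative assumption; the actual ordering is fixed by the filter array layout.

```python
import numpy as np

# Simulated 1600x1600 raw frame; each 4x4 block is one aggregated pixel.
frame = np.random.rand(1600, 1600)

N = 4                     # subpixels per aggregated-pixel side (4x4 filter array)
rows, cols = frame.shape  # 1600 x 1600

# Axes 1 and 3 index the subpixel position inside each aggregated pixel;
# axes 0 and 2 index the aggregated pixel itself.
blocks = frame.reshape(rows // N, N, cols // N, N)

# spectral[k] gathers the corresponding subpixel from every aggregated pixel,
# yielding 16 co-registered 400x400 spectral images (assumed filter order i*4+j).
spectral = np.stack([blocks[:, i, :, j] for i in range(N) for j in range(N)])

assert spectral.shape == (16, 400, 400)
# Pixel (x, y) of any spectral image views substantially the same scene point
# as pixel (x, y) of every other image, so no software fusing is required.
```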
[0005] A common problem in multispectral imaging is correlating pixels in various sensors to the same point in the image scene, a process known colloquially as ‘fusing’ the various spectral images. This is commonly achieved by specialized software, which in turn requires additional hardware to process the various images; such correlation generally suffers from some inaccuracies and mandates a time lag which limits the effective frame rate. Utilizing the Cavanaugh device obviates the need for fusing software. Numerous applications, such as autonomous vehicles (AV), Advanced Driver Assistance Systems (ADAS), multispectral aerial photography, augmented reality (AR), and the like may benefit from a multispectral image without the inaccuracies and the time lag associated with common fusing software.
[0006] LIDAR (Light Detection and Ranging or Laser Imaging Detection and Ranging) is a surveying method similar to the more common Radio Detection and Ranging (RADAR), utilizing irradiation in the light spectrum instead of the much lower frequency radio spectrum. LIDAR detects the distance to a target or multiple targets by illuminating the surrounding environment with laser light of a given wavelength and measuring range information to the target. Several methods are known for generating range information from the laser light reflected from the image scene, such as, by way of example, time of flight, phase comparison, continuous wave modulation, and the like. In one such range measuring method, the time difference between the transmission of the light pulse and the detection of the reflected light is measured. This data, known as time-of-flight (TOF), allows the LIDAR to measure the distance to the reflecting target with high accuracy. A common LIDAR type, colloquially known as a scanning LIDAR, sends a plurality of laser pulses in a scanning pattern to scan an area of interest. Range information is collected from the time of flight and correlated with the respective pulse direction. A second type of LIDAR, colloquially referred to as flash LIDAR, illuminates the volume of interest by a pulse of laser light, and the LIDAR detects the reflected light by a focal plane photodetector array capable of sensing at least the wavelength of the light that was used for flooding the volume. In a flash type LIDAR, each pixel may individually capture light reflected from a respective point in the image scene. The plurality of direction and range information collected from the reflected laser light pulses can then be used to create a three-dimensional representation of objects in the area of interest.
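A minimal sketch of the time-of-flight relation just described: the range to the reflecting target is half the round-trip time multiplied by the speed of light. The function name and the 800 ns sample value are illustrative.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(t_fire_s: float, t_detect_s: float) -> float:
    """Convert pulse emission/detection times to range: the pulse travels
    to the target and back, so the one-way distance is c * dt / 2."""
    return C * (t_detect_s - t_fire_s) / 2.0

# An echo detected 800 ns after firing corresponds to roughly 120 m.
print(tof_range_m(0.0, 800e-9))  # ~119.9
```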
[0007] Conventional LIDARs use silicon avalanche photodiodes and/or photomultipliers, to detect and amplify the signal produced by the reflection of the laser pulse. LIDARs are used extensively in autonomous vehicles, robots, and the like, to form the three-dimensional representation of the environment.
[0008] US Patent Publication 2021/0190962 to Maimon et al. and US Patent No. 7,433,042 to Cavanaugh are incorporated by reference herein in their entirety.
Summary
[0009] It is an object of the present invention to create an imaging system capable of forming a three-dimensional map of an image scene, combined with multispectral information. It is desired that the time lag of fusing the image scene information be minimized, and ideally eliminated. To that end there is provided a multi-spectral and LIDAR imager 2 responsive to a spectrum encompassing at least a laser light wavelength and at least one more wavelength, the imager comprising: an objective lens 105; a plurality of spectral filters 110 disposed within the objective lens or proximal thereto along an optical path of light transferred by the objective lens, at least one of the spectral filters 110 being transmissive of the laser light wavelength; a microlens array 120 having a plurality of microlenses, the microlens array disposed along the optical path; and a photodetector array 115 disposed farther away from the objective lens along the optical path than the microlens array, the photodetector array comprising a plurality of pixels. The imager is characterized by a plurality of aggregated pixels 310, each formed of a group of pixels 310₁, 310₂, … 310ₙ from the plurality of pixels of the photodetector array 115, the group of pixels acting as subpixels to the respective aggregated pixel, each aggregated pixel associated with one of the plurality of microlenses, and each subpixel receiving light passing through a respective one of the plurality of spectral filters, at least one of the subpixels having a different spectral filter associated therewith from the spectral filter associated with a neighboring subpixel; and the imager is further characterized by readout circuitry 125 capable of accumulating photocurrent from each of the subpixels, and providing integrated photocurrent magnitude information and/or timing information to readout processing circuitry.
[0010] Optionally, the readout circuitry 125 provides timing information relating to time difference between a first time associated with illuminating an image scene by a laser light source 100 emitting light at the laser light wavelength, and a second time associated with detection of the laser light reflected from an area in the image scene by a respective aggregated pixel.
[0011] Optionally, the readout circuitry 125 is further characterized by coalescing circuitry 530 operative to coalesce photocurrent from at least a portion of the group of detectors of at least one aggregated pixel to facilitate determining the second time.
[0012] In certain embodiments a base material of the photodetector array 115 is Indium Gallium Arsenide (InGaAs).
[0013] In certain embodiments, the number of subpixels in at least one of the aggregated pixels 310 is an integer multiple of the number of spectral filters 110. The integer may be 1 or greater than 1.
[0014] Optionally, at least one of the plurality of spectral filters 110 transfers light in a bandpass containing a wavelength of the laser light and in a filter-specific bandpass which passes light having a wavelength differing from the laser light wavelength.
[0015] In some embodiments a majority of the plurality of the spectral filters 110 each transfers light in a bandpass containing the wavelength of the laser light and in a filter-specific bandpass which passes light having a wavelength differing from the laser light wavelength.
[0016] Optionally, the imager further comprises a controllable filter 150 disposed in a light path extending between the image scene and the plurality of spectral filters 110; the controllable filter controllably switches between at least a first and a second state, wherein, when in the first state, the controllable filter transfers light in a bandpass containing the wavelength of the laser light and reduces transfer of light in at least one different wavelength bandpass. During LIDAR mode operation, such a filter reduces light different than the reflected laser light to reduce noise and improve signal to noise ratio, as sketched below.
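As a hedged illustration of this two-state behavior, the sketch below models the controllable filter 150 as an idealized switchable bandpass; the state names and band edges are assumptions, not values from the disclosure.

```python
from enum import Enum

class FilterState(Enum):
    LASER_BAND = 1  # first state: pass only the band containing the laser line
    BROADBAND = 2   # second state: pass the wider imaging spectrum

# Assumed band edges in micrometres; the disclosure does not fix these values.
LASER_BAND_UM = (1.545, 1.555)
BROAD_BAND_UM = (0.9, 2.5)

def transmits(state: FilterState, wavelength_um: float) -> bool:
    """Idealized controllable filter 150: in the first (LIDAR-mode) state only
    the laser band passes, cutting out-of-band noise; otherwise broadband."""
    lo, hi = LASER_BAND_UM if state is FilterState.LASER_BAND else BROAD_BAND_UM
    return lo <= wavelength_um <= hi

print(transmits(FilterState.LASER_BAND, 1.31))  # False: blocked in LIDAR mode
print(transmits(FilterState.BROADBAND, 1.31))   # True
```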
[0017] The readout circuitry 125 associated with at least one aggregated pixel 310 is optionally capable of selectively outputting image information from a plurality of subpixels in the pixel group forming the at least one aggregated pixel. Additionally in certain embodiments the imager operates in at least a multi-spectral mode wherein the readout circuitry 125 integrates and outputs information reflective of the magnitude of light impinging during an integration period on individual detectors of the plurality of detectors of the photodetector array 115, and in a LIDAR mode wherein the readout circuitry outputs timing information for individual aggregated pixels from the plurality of aggregated pixels, the timing information relating to time difference between the first time associated with illuminating an image scene by the laser light source 100, and the second time associated with detection of the laser light reflected from an area in the image scene by a respective aggregated pixel 310.
[0018] Optionally, when operated in LIDAR mode the readout circuitry 125 may further comprise coalescing circuitry 530 operative to coalesce photocurrent accumulated from a plurality of subpixels of a respective aggregated pixel 310 during determining the second time.
[0019] In certain embodiments the laser light is of a wavelength between 0.9 μm and 2.5 μm.
[0020] Optionally, the readout circuitry 125 associated with at least one of the aggregated pixels comprises, for each of the plurality of subpixels forming the aggregated pixel 310, a photocurrent integrator 605, a transfer control circuit 610, and an integrator reset circuit 615; a timing circuit (by way of example, 515 and 520 combined) capable of detecting the time difference between a first time associated with illuminating an image scene by the laser light source 100 and a second time associated with detection of the laser light reflected from an area in the image scene by at least one subpixel of the aggregated pixel 310; and mode switching circuitry 710 capable of switching between a first operating mode wherein the readout circuitry 125 outputs to the readout processing circuitry 140 information related to the integrated photocurrent, and a second operating mode wherein the readout circuitry outputs to the readout processing circuitry information related to the time difference.
[0021] Optionally, the readout circuitry 125 aggregates photocurrent from a plurality of subpixels of the aggregated pixel to facilitate determining the second time.
[0022] There is also provided, in an aspect of the invention, a camera comprising a multispectral imager 2 as described herein, wherein the readout processing circuitry is characterized by being capable at least of: controlling a plurality of the readout circuitries 125 associated with a plurality of aggregated pixels between outputting integrated photocurrent magnitude information for subpixels of the aggregated pixels 310 and outputting timing information; reading the timing information outputted by the individual readout circuitries associated with the aggregated pixels; forming a three-dimensional map of the image scene responsive to the timing information outputted from the plurality of aggregated pixels; and forming a plurality of spectral images, each of the spectral images reflecting the integrated photocurrent magnitude information received from respective subpixels in the plurality of aggregated pixels.
[0023] The camera as described may have a readout circuitry of individual aggregated pixels and a coalescing circuitry operative to coalesce photocurrent from a plurality of subpixels associated with the aggregated pixel to facilitate determining the timing information.
[0024] The camera readout processing circuitry may further comprise at least one processor, wherein the processor is operated by software to perform at least a portion of the formation of the three-dimensional map, and/or the spectral images.
[0025] The laser light wavelength is selected to match the wavelength of a specific laser light source capable of illuminating an image scene of interest, the operation of the laser light source being coordinated with the operation of the readout circuit.
[0026] In an aspect of the invention, there is provided a LIDAR and multispectral combined imager 1 for imaging and for forming a three-dimensional map of an image scene, the combined imager comprising a laser source 100 disposed to illuminate an image scene, an objective lens 105 for producing a focused image of the scene at a focal plane, an array of spectral filters 110 placed at or about the objective lens, a photodetector array 115 comprising a plurality of aggregated pixels, each aggregated pixel comprising a plurality of subpixels, and a microlens array 120 comprising a plurality of microlenses positioned between the objective lens and the photodetector array, disposed and constructed to produce multispectral information projected onto a respective aggregated pixel. Individual filters of the filter array pass at least one band of light and a band containing the laser light. All the subpixels of an aggregated pixel 310 receive substantially the same image information, except that each subpixel of one aggregated pixel receives light that passed through one element of the spectral filter array. The imager further comprises readout circuitry 125 capable of accumulating photocurrent from each of the subpixels and providing information associated with the detection of the laser light by the respective subpixel. In an optional Time-of-Flight embodiment the readout circuitry is also capable of providing information of the time difference between a first time associated with illuminating the image scene, and a second time associated with detection of the laser light by the respective pixel. Thus, by way of example, illuminating the image scene or a portion thereof with laser light, either as a single pulse or by scanning, may be done contemporaneously with sending a signal to the readout circuitry, and the circuitry determines the time lag between the illumination and the detection of the laser light reflected from a region of the scene by each aggregated pixel. The time lag information allows sensing a distance between the region in the image scene reflecting the laser light and the imager. The laser illuminating the image scene may be incorporated in the imager, adjacent to the imager, or separate therefrom. Utilizing the laser illumination and sensing distance to scene regions forms a LIDAR, incorporated with the multispectral-capable imager.
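As a hedged illustration only (the disclosure does not specify numeric parameters), the time lag sensed per aggregated pixel maps to distance by the familiar time-of-flight relation; the Python sketch below uses assumed names and values.

```python
# A minimal sketch of the time-lag-to-distance relation described above;
# names and values are illustrative assumptions, not the disclosure's circuitry.
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(time_lag_s: float) -> float:
    """One-way range from a round-trip time lag: the laser light travels
    to the reflecting region and back, so the distance is c * t / 2."""
    return C * time_lag_s / 2.0

# Example: a reflection detected 2 microseconds after firing
# corresponds to a region roughly 300 m from the imager.
print(tof_range_m(2e-6))  # ~299.79
```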
[0027] An aggregated pixel 310 comprises a plurality of individual pixels of the photodetector array. A group of pixels are conceptually grouped together to form the aggregated pixel, and each of the individual pixels acts as a subpixel within the aggregated pixel. An aggregated pixel is generally associated with one lenslet of the MLA. Optionally the number of subpixels in an aggregated pixel corresponds to the number of spectral filters in the filter array, however, in some embodiments the number of subpixels is a multiple of the number of spectral filters.
[0028] Generally, each filter of the plurality of filters in the spectral filter array has at least one bandpass for a portion of the spectrum sensed by the imager and a bandpass for the laser light; however, certain embodiments may utilize one or more filters which do not pass the laser light, and/or filter(s) which pass only the laser light band. The readout circuitry is capable of providing sub-spectrum specific light magnitude information and the time difference between laser illumination of the scene and detection of laser light reflected from the image scene, for each respective subpixel. Thus, the imager produces a plurality of spectral images 320, each reflecting a portion of the spectrum determined by the bandpass of a single spectral filter, and distance information of various points in the same image scene, allowing forming a three-dimensional map of the scene or a portion thereof. Fusing the various spectral images is largely obviated, as the same optical system and the same aggregated pixels are utilized for sensing the various spectral components, and as differing spectral images are received from adjacent pixels.
[0029] The system operates in two principal modes, namely a Multi-Spectral Mode (MSM) and a LIDAR mode (LM), where during the spectral mode the photodetector array integrates photocurrent from the subpixels, and during the LIDAR mode, the aggregated pixels sense the arrival of laser light from the laser-illuminated image scene, providing distance related information.
[0030] While some processing and fusing may still be utilized, such processing is significantly lower than in common systems. Primarily, the time of flight is converted to range information, and light data from corresponding subpixels in each of the plurality of aggregated pixels is collected to form individual spectral images. Depending on the internal representation of the data, range information of the LIDAR may require processing to form the three-dimensional map of the scene. However, as the various spectral data and the range data related to a region in the image scene are derived from the same aggregated pixel, the fusing requires minimal processing. It is noted that since the system switches between spectral and LIDAR modes, the spectral images and the range information obtained from an aggregated pixel are not contemporaneous; however, commonly the spectral integration requires several milliseconds, while a LIDAR image is taken within the order of a few microseconds. Therefore, the temporal difference between the acquisition of the spectral data and the LIDAR range data may be considered negligible.
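A brief numeric sketch, under assumed example values, shows why this temporal difference is negligible: the round-trip flight time for a scene a few hundred meters deep is orders of magnitude shorter than a typical spectral integration period.

```python
# Hedged numeric illustration of the negligible-temporal-difference
# argument above; the range and integration period are assumptions.
C = 299_792_458.0  # speed of light, m/s

max_range_m = 300.0                         # assumed scene depth
lidar_window_s = 2.0 * max_range_m / C      # round trip: ~2 microseconds
spectral_integration_s = 5e-3               # assumed "several milliseconds"

print(lidar_window_s)                           # ~2.0e-06 s
print(lidar_window_s / spectral_integration_s)  # ~4e-04, a negligible fraction
```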
[0031] Optionally, the imager is capable of sensing laser light in the wavelength range between 0.9 µm and 2.5 µm, a range known as Short Wave Infra-Red (SWIR).
Furthermore, any of the spectral filters’ bandpasses may be in the SWIR range and/or in the visual range. Indium Gallium Arsenide (InGaAs) based infrared photodetectors are known in the art, and the use of one in the imager is explicitly considered. Generally, utilizing any infrared photodetector in the imager allows sensing of spectral ranges which may include SWIR and/or MWIR spectral bands. Notably, the LIDAR may or may not operate in the SWIR or MWIR range. The LIDAR portion of the imager may be modified to utilize the SWIR, MWIR, IR, and even visual wavelength ranges to meet design requirements by modifying the characteristics of the respective filter(s), the detector, and the laser light source.
Brief Description of the Drawings
[0032] Fig. 1 depicts a simplified top view of an exemplary aggregated pixel.
[0033] Fig. 2 depicts a transmission graph of an exemplary filter.
[0034] Fig. 3 depicts a simplified schematic of a multispectral light field camera similar to the camera of the US 7,433,042 Patent to Cavanaugh.
[0035] Fig. 4 depicts a simplified block diagram of an exemplary combined multispectral and flash LIDAR imager principal components, according to one embodiment.
[0036] Fig. 5 depicts a simplified block diagram of an optional embodiment of readout processing circuitry.
[0037] Fig. 6 depicts a simplified block diagram of a unit cell circuitry of a single subpixel.
[0038] Fig. 7 depicts a simplified block diagram of a readout circuit for one aggregated pixel.
Detailed description of select embodiments
[0039] Fig. 1 depicts a simplified top view of an exemplary aggregated pixel 310. The aggregated pixel is a portion of photodetector 115 and is divided into subpixels 310₁, 310₂, ... 310ₙ, where every subpixel forms a portion of a subpixel unit cell, and a group of subpixels that share light from a single microlens forms an aggregated pixel 310. The aggregated pixel unit cell is formed of the subpixel unit cells, as well as circuitry specific to the aggregated pixel operation. In addition to the portion of the photodetector which forms the sensing pixel itself, a subpixel unit cell circuitry UC₁, UC₂, ... UCₙ commonly comprises a photocurrent integrator, a sample and hold (S&H) circuit, a transfer control circuit, and a control circuit which may also act to reset the integrator. In addition to the unit cells of the subpixels which form the aggregated pixel, the aggregated pixel unit cell comprises mode switching circuitry, a time counter and/or other distance sensing related circuitry, and optionally a subpixel coalescing circuit. The aggregated pixel unit cell circuitry may reside in the inter-cell area between subpixel unit cells, which appears as the black portions between the subpixel unit cells in Fig. 1. Optionally, the aggregated pixel unit cell utilizes coalescing circuitry (S1, S2, ... Sn, 530) to coalesce output from a plurality of subpixels to drive the distance sensing circuitry. Coalescing subpixels in this manner increases the amplitude of sensed laser light, and therefore the sensitivity of laser light detection. It is noted that the figures depict conceptual schematics, and additional circuitry may be required to improve operational characteristics.
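The cell hierarchy just described can be summarized structurally; the following Python sketch is a conceptual model only, with assumed field names that are not the disclosure's terms.

```python
# Structural sketch of the aggregated pixel unit cell of Fig. 1;
# all field names are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class SubpixelUnitCell:             # UC1 ... UCn
    integrator_charge: float = 0.0  # photocurrent integrator state
    held_sample: float = 0.0        # sample and hold (S&H) output

@dataclass
class AggregatedPixelUnitCell:
    subcells: List[SubpixelUnitCell]  # one per subpixel 310_1 ... 310_n
    mode: str = "MSM"                 # mode switching: "MSM" or "LM"
    time_count: int = 0               # time counter / distance sensing
    coalesced: bool = False           # subpixel coalescing switch state

def make_aggregated_pixel(n_subpixels: int = 9) -> AggregatedPixelUnitCell:
    return AggregatedPixelUnitCell(
        subcells=[SubpixelUnitCell() for _ in range(n_subpixels)])
```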
[0040] Optionally a photodetector array 115 and a ROIC 125 may be coupled by flip chip technology, where an array of pixels in the photodetector array is connected onto an array of unit cell circuitry using Indium bumps. The term subpixel generally relates to a single sensing pixel element, and/or the combination of the sensing pixel element and the readout circuitry associated therewith, or stated equivalently, the pixel unit cell.
[0041] The ROIC 125 provides integration of photocurrent from each subpixel over an integration period. Corresponding subpixels of all the aggregated pixels provide a spectral image reflective of light which passed through a single spectral filter, thus forming a single spectral image. Stated differently, light that passes a single spectral filter falls on corresponding subpixels in each aggregated pixel of the plurality of aggregated pixels, so that sensing the magnitude of light from subpixels having similar coordinates within the respective aggregated pixel forms an image comprising light that was filtered by a single spectral filter. Thus, by way of example, subpixel X2 Y2 of aggregated pixel₁, aggregated pixel₂, aggregated pixel₃, and so forth to aggregated pixelₙ are all collected to form a single spectral image. Optionally, as stated above, when operating in LIDAR mode, the ROIC integrates all the subpixels of an aggregated pixel to increase laser light detected intensity.
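For illustration, assuming the integrated charges are available as a four-dimensional array (a hypothetical layout, not mandated by the disclosure), gathering corresponding subpixels into spectral images is a simple reindexing:

```python
# Sketch of forming spectral images from corresponding subpixels.
# Assumed layout: frame[ay, ax, sy, sx] holds the integrated charge of
# subpixel (sy, sx) within aggregated pixel (ay, ax).
import numpy as np

def spectral_images(frame: np.ndarray) -> np.ndarray:
    ay, ax, sy, sx = frame.shape
    # Each (sy, sx) coordinate maps to one spectral filter; collecting it
    # across all aggregated pixels yields one spectral image.
    return frame.transpose(2, 3, 0, 1).reshape(sy * sx, ay, ax)

frame = np.random.rand(240, 320, 3, 3)  # 240x320 aggregated pixels, 3x3 subpixels
images = spectral_images(frame)
print(images.shape)                     # (9, 240, 320): nine spectral images
```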
[0042] Optionally, some, a majority, or all of the spectral filters in the spectral filter array 110 have at least dual bandpass characteristics. One band would pass the illumination laser wavelength band, and the other band(s) would pass a specific portion of the spectrum as determined by the design requirements of the imager. However, in certain embodiments, one or more of the filters may not pass the laser band, and in certain embodiments at least one of the filters has one bandpass which transfers the laser wavelength.
[0043] Fig. 2 depicts a transmission graph of an exemplary filter. Assume the LIDAR laser utilized to illuminate the scene operates at 1.550 µm, that the bandpass of interest of the exemplary filter is between 1.2 and 1.3 µm, and that the vertical Y axis represents transmittance of light while the horizontal X axis represents wavelength; band 21 then represents the laser light band, while band 22 represents the filter-specific bandpass of 1.2-1.3 µm. Various filters may be utilized in the spectral filter array, and individual filters may have other bandpass(es) of interest, which in certain embodiments may include the visual band or portions thereof. Optionally, polarized filters may be used as elements of the spectral filter array in any combination. Optionally, filters may utilize similar or different bandpasses, at different polarizations. Further, optionally, at least one of the filters may be limited only to a bandpass containing the laser light wavelength. However, to provide the desired spectral separation, at least one of the filters should transfer a different bandwidth and/or polarization than a second filter. It is noted that while polarization is not strictly wavelength dependent, for the purpose of this specification a polarizing filter may be considered spectrally different than a nonpolarizing filter, and filters having a different polarization geometry may be considered spectrally different even if their bandpass characteristics are similar.
[0044] Optionally, a controllable filter 150 is disposed in the light path extending between the image scene and the photodetector, the controllable filter controllably transmitting light in a band containing the wavelength of the laser light while reducing or blocking transfer of light in at least one different wavelength band, or transmitting a broader spectrum than the laser wavelength, which may or may not include the laser wavelength. The controllable filter reduces received light in any wavelength differing from the laser wavelength while the imager operates in LIDAR mode. Such reducing or blocking of light differing from the laser light is advantageous in LIDAR mode as it reduces detected noise and facilitates precise determination of the time a laser pulse reflected from the image scene is detected, providing more precise distance information. Such a controllable filter may be implemented by various methods, such as a rotating filter, a liquid crystal based filter, and various light path modifiers, such as placing prisms, mirrors, and the like in the light path such that incoming light is steered through a filter with a bandpass at the laser wavelength. A plurality of controllable filters may be utilized.
[0045] Fig. 3 depicts a simplified diagram of optical paths associated with one aggregated pixel in a multispectral camera similar to one described in the ‘042 Patent to Cavanaugh. The example describes in particular an aggregated pixel 310 formed by conceptual grouping of three subpixels 310₁, 310₂, 310₃. As mentioned above, while the drawing depicts only three subpixels, an aggregated pixel may comprise any desired plurality of subpixels. Light from any given region in the image scene arrives at objective lens 105, and various spectral portions thereof pass through the spectral filter array 110, which in the drawing is disposed within the objective lens. It is noted that the filter array may be in the lens or in its vicinity. The light from the region is focused onto a region-respective microlens from the microlens array 120. The respective microlens projects the light onto respective subpixels 310₁, 310₂, ... 310ₙ in the photodetector array 115. Depicted subpixels 310₁, 310₂, 310₃ are conceptually grouped to form the three-subpixel aggregated pixel 310. Digital processing is performed to form each spectral image 320 by combining the outputs captured by corresponding subpixels from the plurality of the aggregated pixels.
[0046] Preferably the microlens array 120 is constructed and disposed such that light arriving thereto from the objective lens 105 is projected onto the photodetector 115 as substantially parallel light rays.
[0047] Fig. 4 depicts a simplified diagram of certain components in an exemplary combined LIDAR and multispectral imager forming a camera. A laser source 100 illuminates the image scene, depicted schematically as an object. The laser source may be a flash laser that illuminates the whole scene or large portions thereof, or a scanning laser. An optional laser lens 135 may also be provided to impart any desired directional and/or polarization character to the laser light as it is discharged from the camera. In the depicted embodiment a laser sensor 130 detects the operation of the laser, commonly known as ‘firing’. Light reflected from the image scene, including laser light, is collected by objective lens 105. The depicted embodiment incorporates the spectral filter array 110 in the objective; however, the filter array may be located adjacent to the lens. The objective lens focuses light from the image scene onto the microlens array 120. The depicted drawing shows light from a specific region in the image scene being directed to a microlens which corresponds to the image scene region, and various microlenses in the microlens array correspond to respective regions in the image scene.
[0048] Photodetector 115 comprises a plurality of individually readable pixels, grouped into a plurality of aggregated pixels. Each microlens corresponds to an aggregated pixel and projects the light rays focused thereupon onto pixels of the photodetector 115 which act as subpixels of individual aggregated pixels, where subpixels which receive the light from one microlens of the microlens array are considered subpixels of an aggregated pixel. Preferably the microlens array is constructed such that the light rays leaving each microlens are substantially parallel to each other. However, certain minor angles formed between the light rays leaving the microlens may be tolerated as long as the majority of light rays transferred by the microlens impinge on the desired subpixel. The group of subpixels which receives light from one microlens forms one aggregated pixel. Generally, one of the subpixels in an aggregated pixel corresponds to rays passed by a respective one of the plurality of spectral filters in the spectral filter array 110; however, in some embodiments more than one subpixel is associated with a single spectral filter. ReadOut Integrated Circuit (ROIC) 125 receives the output of the subpixels of the photodetector and communicates the information to Readout Processing Circuitry 140. Readout circuitry is colloquially known as a Read Out Integrated Circuit (ROIC); however, it is pointed out that in other embodiments such circuitry need not reside in a single integrated circuit.
[0049] The Readout Processing Circuitry (RPC) 140 controls numerous operations of the imager. A simplified block diagram of one optional embodiment of such processing circuitry is schematically depicted in Fig. 5. The RPC primarily performs tasks of control and coordination of various portions of the imager and the laser portion of the device, and optionally processes timing related information received from individual aggregated pixels into a three-dimensional map, as well as forming the individual spectral images. The RPC further communicates information to external devices such as vehicle-based control devices, remote storage devices, and the like, as required by the environment in which the device operates. Amongst others, the mode switching 710 portion of the RPC switches the mode of operation of the imager between LIDAR mode (LM) and multispectral mode (MSM). A spectral control portion 715 of the RPC controls operation and reading of information from unit cells of individual subpixels, and resets the respective subpixel for the next reading cycle. The spectral imaging 720 portion of the RPC composes the information from corresponding unit cells of subpixels in the aggregated pixels into respective spectral images, forming a plurality of spectral images, each reflecting light that passed through one spectral filter. A LIDAR control portion 725 of the RPC controls operation of the system in LIDAR mode, including reading timing information from individual aggregated pixels, to establish range information. Three-dimensional mapping portion 730 utilizes timing data obtained from individual aggregated pixels to form the three-dimensional map of the image scene. RPC controller portion 740 controls the operation of the system as a whole, such as the duty cycle between the different modes, communications, and other processing required for the operation of the system. It is noted that any and all of the RPC portions may be embodied in hardware, in software executed on one or more processors, or in any combination of hardware and software. It is further noted that the various portions described above represent merely an arbitrary division, and that certain functions may be shared between various portions of the RPC, or be delegated to other circuitry and/or controllers, as a matter of implementation choice. The various portions may be implemented by one or more electronic processors. Common parts such as power supply, switches, and the like may not be depicted for simplicity. Further, while in Fig. 4 the RPC 140 is depicted outside of the camera, the placement of the RPC is a matter of implementation choice, and the RPC may be installed within or outside the same enclosure as the imager and/or the laser. It is further noted that the imager 2, the laser portion, and the support circuitry may reside in a single enclosure or in differing enclosures in any combination.
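A compact control-flow sketch of the RPC's interleaving of the two modes follows; the method names, the hardware interfaces, and the ordering within a frame are assumptions for illustration only.

```python
# Hedged sketch of an RPC frame cycle alternating MSM and LM;
# `imager` and `laser` stand in for hardware interfaces (assumed API).
class ReadoutProcessingCircuitry:
    def __init__(self, imager, laser):
        self.imager = imager
        self.laser = laser

    def frame_cycle(self):
        # Multispectral mode: integrate, then read all subpixel charges
        # (spectral control 715 / spectral imaging 720).
        self.imager.set_mode("MSM")
        spectral_data = self.imager.read_integrated_charges()

        # LIDAR mode: arm and fire the laser, then read the per-pixel
        # time counts (LIDAR control 725 / three-dimensional mapping 730).
        self.imager.set_mode("LM")
        self.laser.fire()
        timing_data = self.imager.read_time_counts()
        return spectral_data, timing_data
```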
[0050] A simplified block diagram of a single subpixel unit cell is depicted in Fig. 6, where photocurrent from a single subpixel 310ₙ in the photodetector 115 is collected in integration capacitor 605. Responsive to a read command received from the spectral control portion 715, the charge in the integration capacitor is either presented directly to the output port OP, or optionally sampled by sample and hold circuit 620 for later transfer to the output port OP. The integrated charge information is presented by the readout section 610 for reading by the spectral imaging portion 720 of the RPC 140. Once sampled, the integration capacitor 605 may be discharged by reset circuit 615 and start integrating photocurrent for the next reading cycle. The readout section 610 may provide an analog or digital representation of the charge integrated by the integration capacitor 605. In certain embodiments, a unit cell controller 600 controls and coordinates the operation of the unit cell circuitry, while in other embodiments the control is directed from outside the unit cell to respective portions such as the reset, sample and hold, and the like.

[0051] A simplified block diagram of an exemplary embodiment of an aggregated pixel unit cell is depicted in Fig. 7. As described elsewhere, the aggregated pixel unit cell operates in two modes, namely LM and MSM.
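Before turning to the aggregated pixel unit cell of Fig. 7, the Fig. 6 cycle just described can be sketched behaviorally; the photocurrent, integration period, and class names below are illustrative assumptions.

```python
# Behavioral sketch of one integrate/sample/reset cycle of the Fig. 6
# subpixel unit cell; all numeric values are assumed.
class IntegrationCapacitor:
    def __init__(self) -> None:
        self.charge_c = 0.0                      # accumulated charge, coulombs

    def integrate(self, photocurrent_a: float, dt_s: float) -> None:
        self.charge_c += photocurrent_a * dt_s   # integrator 605

    def reset(self) -> None:
        self.charge_c = 0.0                      # reset circuit 615

cap = IntegrationCapacitor()
cap.integrate(photocurrent_a=1e-9, dt_s=5e-3)    # 1 nA over a 5 ms period
held = cap.charge_c                              # sample and hold 620
cap.reset()                                      # ready for the next cycle
print(held)                                      # 5e-12 C, presented at OP
```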
[0052] For each of the subpixels 310₁, 310₂, 310₃ of an exemplary aggregated pixel, the aggregated pixel unit cell contains corresponding changeover switching circuitry S1, S2, and S3 respectively, and subpixel unit cell circuitry UC1, UC2, and UC3 respectively. During MSM operation, photocurrent from each of the subpixels passes through the respective switch and is integrated by the respective unit cell circuitry, as explained relative to Fig. 6. The unit cell circuit presents the output at respective output ports OP1, OP2, and OP3, in response to a signal from the spectral control portion 715. Once the integrated photocurrent is sampled, the RPC signals the respective unit cell circuit to reset and restart integrating the photocurrent in a new cycle.
[0053] During MSM operation the mode switching portion 710 switches the imager to MSM mode, and the bus control 510 directs the output of each subpixel to its respective unit cell circuitry via the respective switch S1, S2, and S3. The unit cell circuitry integrates the photocurrent of the respective subpixel until the spectral control portion 715 signals that the integration period has terminated, at which time the integration results are ready to be read. The integrated photocurrent value is presented to the RPC for reading with a potential delay, until the RPC is ready to read the specific subpixel.
[0054] When the mode switching 710 portion switches the operating mode from MSM to LM, it asserts the mode control signal, which causes the laser controller 145 to begin arming the laser 100, and the bus control 510 to utilize switches S1, S2, and S3 to switch the output of the respective photodetector pixels 310₁, 310₂, 310₃ from the respective subpixel unit cell circuitry to the aggregated pixel unit cell circuitry. Optionally, in the aggregated pixel unit cell circuitry, the outputs of a plurality of the subpixels are coalesced and connected in parallel by coalescing circuitry 530. It is noted that while Fig. 7 depicts a simple connection as the coalescing circuitry, various circuitries may be utilized to achieve the coalescing of the subpixel signals.

[0055] When laser 100 fires, the laser sensor 130 senses the firing and sends a start signal to time counter 520. Generally, if the LIDAR is a flash-type LIDAR, the trigger signal is delivered to the counters of all the aggregated pixels. The laser light travels to the image scene and is reflected by an object therein. A portion of the reflected light is captured by the objective lens, and thereby by the aggregated pixel associated with the image scene region containing the reflecting object. When the laser light sensed by the aggregated pixel subpixels reaches a predetermined threshold, the threshold circuit 515 signals the counter to stop the count. Coalescing the output of a plurality of the subpixels by connecting their outputs in parallel, as seen in Fig. 5, increases the signal collected from the laser, and thus increases the sensitivity of the imager in LIDAR mode.
Optionally, the outputs of a majority or all of the subpixels in the aggregated pixel are coalesced, either in parallel or in series; however, the laser pulse detection may be obtained from fewer than all subpixels in an aggregated pixel, and even from a single subpixel.
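A hedged sketch of the start/stop counting scheme follows; the clock rate, threshold, and echo model are assumptions, and a real implementation would be in-pixel hardware rather than software.

```python
# Illustrative model of the time counter 520 and threshold circuit 515;
# clock rate and threshold are assumed values.
C = 299_792_458.0          # speed of light, m/s
CLOCK_HZ = 500e6           # assumed counter clock

def lidar_range_m(coalesced_signal, threshold):
    """Count clock ticks from the laser-firing start signal until the
    coalesced subpixel signal crosses the threshold, then convert the
    tick count to a one-way range."""
    for ticks, level in enumerate(coalesced_signal):
        if level >= threshold:             # threshold circuit stops the count
            return C * (ticks / CLOCK_HZ) / 2.0
    return None                            # no return pulse detected

echo = [0.0] * 1000 + [1.0]                # echo arrives 1000 ticks (2 us) in
print(lidar_range_m(echo, 0.5))            # ~299.79 m
```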
[0056] The LIDAR portion depicted in Fig. 5 utilizes Time Of Flight (TOF) to measure the distance to the respective object in the image scene which reflected the light; however, other methods of measuring distance are known in the LIDAR art and may be utilized in various embodiments of the combined LIDAR and multispectral imager. By way of example, phase comparison or Frequency Modulation Continuous Wave (FM-CW) ranging may be utilized.
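For comparison, the textbook FM-CW relation converts a measured beat frequency to range; the sketch below uses assumed sweep parameters and is not part of the disclosed embodiment.

```python
# Hedged sketch of linear-chirp FM-CW ranging: R = c * f_beat * T / (2 * B).
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

# Assumed example: a 1 MHz beat, a 1 GHz sweep bandwidth, and a 100 us
# sweep time correspond to a range of about 15 m.
print(fmcw_range_m(1e6, 1e9, 100e-6))
```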
[0057] It is noted that generally pixels of the photodetector are grouped into aggregated pixels by reception of the light from a respective microlens, the operation of the respective portion of the ROIC, and the RPC circuitry. Such pixels which form the aggregated pixel are generally referred to as subpixels. Respective subpixels of aggregated pixels imply that the subpixels have similar coordinates within the respective aggregated pixels. Stated differently by way of explanatory example, if a subpixel that resides in the first row and the second column is selected for the sake of the example, the first row, second column subpixel of aggregated pixel 1 is the respective subpixel to the first row, second column subpixel of aggregated pixel 2, of aggregated pixel 3, and so forth. It is further noted that the terms photodetector and photodetector array are equivalent. Photodetector arrays are also commonly known as a Focal Plane Array (FPA). The number of pixels in a photodetector may vary, and the description above may utilize photodetectors having any number of pixels and/or any configuration which meets design requirements. Similarly, the number of subpixels forming an aggregated pixel may be varied to meet design needs.
[0058] Those skilled in the art would readily recognize that the terms spectral images and three-dimensional map relate at least to electrical representations of such images and maps that are readable by a computer, and from which various physical and/or visual representations may be formed. The term ‘device’ should be construed to refer to any combination of an imager and control circuitry, and may further include a power supply, a laser light source and control thereof, a laser sensor, additional support circuitry, communications, and/or switching circuitry, as well as additional optical components such as lenses, mirrors, filters, and the like, as required by the environment and the task at hand for which the imager is employed. By way of example, a complete camera as depicted in Fig. 4 may contain the laser light source, the RPC, and supporting circuitry and optics in a single enclosure. Alternatively, each of the components may be disposed separately, and optionally in a plurality of enclosures. An embodiment of the invention may be embedded in a larger vehicle such as an autonomous vehicle, a robot, or a manned vehicle, and utilized for driving safety, navigational, or general mapping purposes, and any combination thereof, in one or more locations within the vehicle, and various components may be dispersed in different places in the vehicle. The specific operating environment and the purpose for which the device is utilized may dictate various distributions of the different components. Thus, by way of example, in a vehicle the laser section may be disposed at a different location in the vehicle than the imager section. Conversely, an embodiment of the invention may be embodied in a personal augmented reality system, where compact packaging is highly desired, resulting in a single package being preferred.
[0059] It is further noted that while the behavior of both light and electrical signals has been generally described in ideal terms, the presence of noise and minor discrepancies between components does not detract from the invention. By way of example, a portion of light transferred by one microlens may fall onto subpixels in another aggregated pixel, some light transferred from one filter may impinge on a subpixel associated with another filter, a pixel photocurrent may be exposed to electrical noise, and the like. However, it is recognized that such minor deviations in devices generally embodying the claimed aspects of the invention, whether intentional or unintentional, fall under the claims.
[0060] For simplicity, in certain drawings and specification passages, only three subpixels and associated circuitry are depicted and/or described. However, it is clarified that any number of subpixels may be utilized in an aggregated pixel by adding respective switches, unit cells, and other associated circuitry as required for the selected number of subpixels. A specific smaller number of elements is depicted to facilitate understanding of the operational information paths, which are implemented by electronic circuitry. Numerous electronic circuits to implement an electronic switch are known in the art.
[0061] In these specifications, reference is often made to the accompanying drawings which form a part of the disclosure, and in which are shown, by way of illustration and not of limitation, exemplary implementations and embodiments. Further, it should be noted that while the description provides various exemplary embodiments, as described and as illustrated in the drawings, this disclosure is not limited to the implementations described and illustrated herein, but can extend to other embodiments as would be known or as would become known to those skilled in the art, in light of this disclosure. Reference in the specification to "one embodiment", "this embodiment", "these embodiments", “several embodiments”, “selected embodiments” or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment(s) may be included in one or more implementations, and the appearances of these phrases in various places in the specification are not necessarily all referring to the same embodiment(s). Additionally, in the description, numerous specific details are set forth in order to provide a thorough disclosure, guidance, and/or to facilitate understanding of the invention or features thereof. However, it will be apparent to one of ordinary skill in the art that these specific details may not all be needed in each implementation. In certain embodiments, well-known structures, materials, circuits, and interfaces have not been described in detail, and/or may be illustrated schematically or in block diagram form, so as to not unnecessarily obscure the disclosure.
[0062] Terms such as “about”, “substantially”, “approximately”, and “adjacent” in the context of configuration relate generally to disposition, location, or configuration that is either exact or sufficiently close to the location, disposition, or configuration of the relevant element to preserve operability of the element within the invention, and which does not materially modify the invention. Similarly, unless specifically disclosed or clear from its context, numerical values should be construed to include certain tolerances that those skilled in the art would recognize as having negligible importance, as they do not materially change the operability of the invention.

Claims

CLAIMS

1. A multi-spectral and LIDAR imager 2 responsive to a spectrum encompassing at least a laser light wavelength and one more wavelength, the imager comprising: an objective lens 105; a plurality of spectral filters 110 disposed within the objective lens or proximal thereto along an optical path of light transferred by the objective lens, at least one of the spectral filters 110 being transmissive of the laser light wavelength; a microlens array 120 having a plurality of microlenses, the microlens array disposed along the optical path; a photodetector array 115 disposed farther away from the objective lens along the optical path than the microlens array, the photodetector array comprising a plurality of pixels; the imager being characterized by: a plurality of aggregated pixels 310, each formed of a group of pixels from the plurality of pixels of the photodetector array 115, the group of pixels acting as subpixels to the respective aggregated pixel, each aggregated pixel associated with one of the plurality of microlenses, and each subpixel receiving light passing through a respective one of the plurality of spectral filters, at least one of the subpixels having a different spectral filter associated therewith from the spectral filter associated with a neighboring subpixel; the imager being further characterized by readout circuitry 125 capable of accumulating photocurrent from each of the subpixels, and providing integrated photocurrent magnitude information and/or timing information to a readout processing circuitry.

2. An imager as claimed in claim 1, wherein the readout circuitry 125 provides timing information relating to the time difference between a first time associated with illuminating an image scene by a laser light source 100 emitting light at the laser light wavelength, and a second time associated with detection of the laser light reflected from an area in the image scene by a respective aggregated pixel.

3. An imager as claimed in claim 2, wherein the readout circuitry 125 is further characterized by coalescing circuitry 530 operative to coalesce photocurrent from at least a portion of the group of detectors of at least one aggregated pixel to facilitate determining the second time.

4. An imager as claimed in any preceding claim, wherein a base material of the photodetector array 115 is Indium Gallium Arsenide (InGaAs).

5. An imager as claimed in any preceding claim, wherein the number of subpixels in at least one of the aggregated pixels 310 is an integer multiple of the number of spectral filters 110.

6. An imager as claimed in claim 5, wherein the integer is greater than 1.

7. An imager as claimed in any preceding claim, wherein at least one of the plurality of spectral filters 110 transfers light in a bandpass containing a wavelength of the laser light and in a filter-specific bandpass which passes light having a wavelength differing from the laser light wavelength.

8. An imager as claimed in any preceding claim, wherein a majority of the plurality of the spectral filters 110 each transfers light in a bandpass containing the wavelength of the laser light and in a filter-specific bandpass which passes light having a wavelength differing from the laser light wavelength.

9. An imager as claimed in any preceding claim, further comprising a controllable filter 150 disposed in a light path extending between the image scene and the plurality of spectral filters 110, the controllable filter controllably switching between at least a first and a second state, wherein, when in the first state, the controllable filter transfers light in a bandpass containing the wavelength of the laser light and reduces transfer of light in at least one different wavelength bandpass.

10. An imager as claimed in any preceding claim, wherein the readout circuitry 125 associated with at least one aggregated pixel 310 is capable of selectively outputting image information from a plurality of subpixels in the pixel group forming the at least one aggregated pixel.

11. An imager as claimed in any preceding claim, wherein the imager operates in at least a multi-spectral mode wherein the readout circuitry 125 integrates and outputs information reflective of the magnitude of light impinging during an integration period on individual detectors of the plurality of detectors of the photodetector array 115, and in a LIDAR mode wherein the readout circuitry outputs timing information for individual aggregated pixels from the plurality of aggregated pixels, the timing information relating to the time difference between the first time associated with illuminating an image scene by the laser light source 100, and the second time associated with detection of the laser light reflected from an area in the image scene by a respective aggregated pixel 310.

12. An imager as claimed in claim 11, wherein when operated in LIDAR mode the readout circuitry 125 further comprises coalescing circuitry 530 operative to coalesce photocurrent accumulated from a plurality of subpixels of a respective aggregated pixel 310 during determining the second time.

13. An imager as claimed in any preceding claim, wherein the laser light is of a wavelength between 0.9 µm and 2.5 µm.

14. An imager as claimed in any preceding claim, wherein the readout circuitry 125 associated with at least one of the aggregated pixels comprises: for each of the plurality of subpixels forming the aggregated pixel 310, a photocurrent integrator 605, a transfer control circuit 610, and an integrator reset circuit 615; a timing circuit 515, 520 capable of detecting the time difference between a first time associated with illuminating an image scene by the laser light source 100, and a second time associated with detection of the laser light reflected from an area in the image scene by at least one subpixel of the aggregated pixel 310; and mode switching circuitry 710 capable of switching between a first operating mode wherein the readout circuitry 125 outputs to the readout processing circuitry 140 information related to the integrated photocurrent, and a second operating mode wherein the readout circuitry outputs to the readout processing circuitry information related to the time difference.

15. An imager as claimed in claim 14, wherein the readout circuitry 125 aggregates photocurrent from a plurality of subpixels of the aggregated pixel to facilitate determining the second time.

16. A camera comprising a multispectral imager 2 as claimed in any preceding claim, wherein the readout processing circuitry is characterized by being capable at least of: a. controlling a plurality of the readout circuitries 125 associated with a plurality of aggregated pixels between outputting integrated photocurrent magnitude information for subpixels of the aggregated pixels 310, and outputting timing information; b. reading the timing information outputted by the individual readout circuitries associated with the aggregated pixels; c. forming a three-dimensional map of the image scene responsive to the timing information outputted from the plurality of aggregated pixels; and d. forming a plurality of spectral images, each of the spectral images reflecting the integrated photocurrent magnitude information received from respective subpixels in the plurality of aggregated pixels.

17. The camera as claimed in claim 16, wherein the readout circuitry of individual aggregated pixels further comprises coalescing circuitry operative to coalesce photocurrent from a plurality of subpixels associated with the aggregated pixel to facilitate determining the timing information.

18. The camera as claimed in claim 16 or 17, wherein the readout processing circuitry comprises at least one processor, and wherein the processor is operated by software to perform at least a portion of the formation of the three-dimensional map, and/or the spectral images.
PCT/US2022/040299 2021-08-16 2022-08-15 Multispectral and lidar detector using light field optics, and systems using same WO2023022964A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163233718P 2021-08-16 2021-08-16
US63/233,718 2021-08-16

Publications (1)

Publication Number Publication Date
WO2023022964A1 true WO2023022964A1 (en) 2023-02-23

Family

ID=85240937

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/040299 WO2023022964A1 (en) 2021-08-16 2022-08-15 Multispectral and lidar detector using light field optics, and systems using same

Country Status (1)

Country Link
WO (1) WO2023022964A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070176082A1 (en) * 2006-02-02 2007-08-02 Rockwell Scientific Licensing, Llc Microlensed focal plane array (FPA) using sub-pixel de-selection for improved operability
US20120170024A1 (en) * 2009-09-22 2012-07-05 Medhat Azzazy Long Range Acquisition and Tracking SWIR Sensor System Comprising Micro-Lamellar Spectrometer
WO2014007869A2 (en) * 2012-06-05 2014-01-09 Hypermed Imaging, Inc. Methods and apparatus for coaxial imaging of multiple wavelengths
US20190349573A1 (en) * 2015-04-15 2019-11-14 Google Llc Image capture for virtual reality displays
US20210190962A1 (en) * 2018-07-17 2021-06-24 Netzvision Llc Multi-wavelength lidar and thermal imager


Similar Documents

Publication Publication Date Title
US11598857B2 (en) Integrated lidar image-sensor devices and systems and related methods of operation
US11467286B2 (en) Methods and systems for high-resolution long-range flash lidar
JP7429274B2 (en) Optical imaging transmitter with enhanced brightness
US10739189B2 (en) Multispectral ranging/imaging sensor arrays and systems
AU2018297291B2 (en) Light ranging device with electronically scanned emitter array and synchronized sensor array
US10908266B2 (en) Time of flight distance sensor
US7858939B2 (en) FPA combining SAL and imaging
US20200116836A1 (en) Subpixel apertures for channels in a scanning sensor array
US5870180A (en) Time measurement device and method useful in a laser range camera
US8681260B2 (en) Dual site imaging camera
US6864965B2 (en) Dual-mode focal plane array for missile seekers
US8975594B2 (en) Mixed-material multispectral staring array sensor
US20220163634A1 (en) Active illumination systems for changing illumination wavelength with field angle
Hirsh et al. Hybrid dual-color MWIR detector for airborne missile warning systems
WO2023022964A1 (en) Multispectral and lidar detector using light field optics, and systems using same
Goldberg et al. Multispectral, hyperspectral, and three-dimensional imaging research at the US Army research laboratory
Pollehn et al. Multidomain smart sensors
US8153978B1 (en) Dual color/dual function focal plane
US20230130993A1 (en) Systems and Methods for Spatially-Stepped Imaging
MARINO et al. A photon counting 3-D imaging laser radar for advanced discriminating interceptor seekers
Johnson et al. Adaptive LaDAR receiver for multispectral imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22858985

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE