CN116736463A - Snap fit lens barrel system and method


Info

Publication number: CN116736463A
Application number: CN202310229747.3A
Authority: CN (China)
Prior art keywords: body portion, lens element, lens, imaging device, image
Legal status: Pending
Original language: Chinese (zh)
Inventors: W. J. Hauer, David Ovrutsky, J. Gali
Assignee (original and current): Teledyne FLIR Commercial Systems
Priority: claimed from U.S. application Ser. No. 18/176,755 (published as US20230288784A1)

Classifications

    • G02B7/021: Mountings, adjusting means, or light-tight connections for optical elements: for lenses, for more than one lens
    • G02B13/18: Optical objectives specially designed with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
    • G02B7/026: Mountings, adjusting means, or light-tight connections for optical elements: for lenses, using retaining rings or springs

Abstract

Techniques are provided to facilitate wide field of view (FOV) imaging systems and methods. In one example, an imaging device includes a lens barrel. The lens barrel includes a first body portion with a first lens element at least partially disposed therein, a second body portion with a second lens element and a third lens element at least partially disposed therein, and a snap-fit mechanism. The first, second, and third lens elements form a lens system configured to transfer electromagnetic radiation from a scene to an image capture component. The snap-fit mechanism includes a plurality of finger members extending from the first body portion and a plurality of complementary notches in the second body portion. The finger members are configured to engage the notches to releasably secure the first body portion to the second body portion. Related methods and systems are also provided.

Description

Snap fit lens barrel system and method
Technical Field
One or more embodiments relate generally to optical components and, more particularly, for example, to wide field of view imaging systems and methods.
Background
The imaging system may include an array of detectors arranged in rows and columns, with each detector serving as a pixel to produce a portion of a two-dimensional image. For example, individual detectors of the detector array capture associated pixel values. There are a variety of image detectors, such as visible light image detectors, infrared image detectors, or other types of image detectors that may be disposed in an image detector array for capturing images. As an example, a plurality of sensors may be provided in an image detector array to detect Electromagnetic (EM) radiation of a desired wavelength. In some cases, for example for infrared imaging, readout of image data captured by the detector may be performed by a readout integrated circuit (ROIC) in a time division multiplexed manner. The read-out image data may be transferred to other circuitry, for example for processing, storage and/or display. In some cases, the combination of the detector array and the ROIC may be referred to as a Focal Plane Array (FPA). Advances in processing technology for FPA and image processing have resulted in increased capabilities and complexity of the resulting imaging systems.
Disclosure of Invention
In one or more embodiments, an imaging device includes a lens barrel including a first body portion, a second body portion, and a snap-fit mechanism. The first body portion includes a first lens element disposed at least partially therein. The second body portion includes a second lens element and a third lens element disposed at least partially therein. The first, second, and third lens elements form a lens system configured to transfer electromagnetic radiation from a scene to an image capture component. The snap-fit mechanism includes a plurality of finger members extending from the first body portion and a plurality of complementary notches in the second body portion. The finger members are configured to engage the notches to releasably secure the first body portion to the second body portion.
In one or more embodiments, a method of manufacturing an imaging device includes: the first lens element is disposed at least partially within the first body portion. The method further includes disposing the second lens element and the third lens element at least partially within the second body portion. The method further includes coupling the second body portion to a housing. The method further includes performing a calibration of the second lens element and the third lens element. The method further includes coupling the first body portion to the second body portion after calibration.
In one or more embodiments, a method of manufacturing an imaging device includes: a first manufacturing operation is performed when the first body portion and the second body portion are connected. The method further includes breaking the snap-fit mechanism to separate the first body portion from the second body portion. The method further includes performing a second manufacturing operation when the first body portion is separated from the second body portion.
In one or more embodiments, a method includes providing a lens barrel. The lens barrel includes a first body portion including a first lens element at least partially disposed therein. The lens barrel includes a second body portion including a second lens element and a third lens element at least partially disposed therein. The first, second, and third lens elements form a lens system configured to transfer electromagnetic radiation from a scene to an image capture component. The lens barrel includes a snap-fit mechanism to secure the first body portion to the second body portion. The snap-fit mechanism includes a plurality of finger members extending from the first body portion and a plurality of complementary notches in the second body portion. The method further includes securing the first body portion to the second body portion using the snap-fit mechanism. The finger members are configured to engage the notches to releasably secure the first body portion to the second body portion.
The scope of the present disclosure is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the present disclosure, as well as a realization of additional advantages thereof, will be afforded to those skilled in the art by a consideration of the following detailed description of one or more embodiments. Reference will be made to the accompanying drawings, which will first be briefly described.
Drawings
Fig. 1 illustrates a block diagram of an imaging device in accordance with one or more embodiments of the present disclosure.
Fig. 2 illustrates a perspective view of an imaging device in accordance with one or more embodiments of the present disclosure.
Fig. 3 illustrates a cross-sectional view of an optical system in accordance with one or more embodiments of the present disclosure.
Fig. 4 illustrates a field of view associated with the optical system of fig. 3 in accordance with one or more embodiments of the present disclosure.
Fig. 5 illustrates a field of view associated with a rear lens group of the optical system of fig. 3 in accordance with one or more embodiments of the present disclosure.
Fig. 6 illustrates a diagram having relative illumination curves associated with a lens system including a front lens group and a rear lens group and relative illumination curves associated with only the rear lens group, in accordance with one or more embodiments of the present disclosure.
Fig. 7A illustrates a graph showing a modulation transfer function versus radius associated with a surface of a lens element in accordance with one or more embodiments of the present disclosure.
Fig. 7B illustrates a graph showing a relationship of a modulation transfer function and thickness of a lens element and a distance between two lens groups in accordance with one or more embodiments of the present disclosure.
Fig. 7C illustrates a graph showing the relationship of relative illumination and thickness of lens elements and distance between two lens groups in accordance with one or more embodiments of the present disclosure.
Fig. 7D illustrates a graph showing a relationship of a field of view and a radius associated with a surface of a lens element, in accordance with one or more embodiments of the present disclosure.
Fig. 7E illustrates a graph showing a relationship of a field of view and a thickness of a lens element and a distance between two lens groups, in accordance with one or more embodiments of the present disclosure.
Fig. 7F illustrates a graph showing relative illumination and radius associated with a surface of a lens element, in accordance with one or more embodiments of the present disclosure.
Fig. 8 illustrates a cross-sectional view of an imaging device in accordance with one or more embodiments of the present disclosure.
Fig. 9 shows a flowchart of an exemplary process for manufacturing the imaging device of fig. 8, in accordance with one or more embodiments of the present disclosure.
Figs. 10A, 10B, 10C, and 10D illustrate perspective views associated with manufacturing the imaging device of fig. 8 in accordance with one or more embodiments of the present disclosure.
Fig. 11 shows a flowchart of an exemplary process for using the imaging device of fig. 8 in accordance with one or more embodiments of the present disclosure.
Fig. 12 illustrates a cross-sectional view of an optical system having two front lens elements in accordance with one or more embodiments of the present disclosure.
Fig. 13 illustrates a cross-sectional view of an optical system having three rear lens elements in accordance with one or more embodiments of the present disclosure.
Fig. 14 illustrates a block diagram of an exemplary imaging system in accordance with one or more embodiments of the present disclosure.
Fig. 15 illustrates a block diagram of an exemplary image sensor assembly in accordance with one or more embodiments of the present disclosure.
Fig. 16 illustrates a perspective view of an additional imaging device in accordance with one or more embodiments of the present disclosure.
Fig. 17 illustrates a cross-sectional view of an upper lens assembly of the imaging device of fig. 16 in accordance with one or more embodiments of the present disclosure.
Fig. 18 illustrates a cross-sectional view of a lower lens assembly of the imaging device of fig. 16 in accordance with one or more embodiments of the present disclosure.
Fig. 19 shows a flowchart of an exemplary process for manufacturing the imaging device of fig. 16, in accordance with one or more embodiments of the present disclosure.
Embodiments of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be noted that the dimensions of the various components and the distances between these components are not drawn to scale in the figures. It should be appreciated that like reference numerals are used to identify like elements shown in one or more of the figures.
Detailed Description
The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The accompanying drawings are incorporated herein and constitute a part of this detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, it will be clear and apparent to one skilled in the art that the subject technology is not limited to the specific details set forth herein and may be practiced using one or more embodiments. In one or more instances, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology. One or more embodiments of the subject disclosure are illustrated by and/or described in connection with one or more of the figures and are set forth in the claims.
In one or more embodiments, wide field of view imaging systems and methods are provided. In some aspects, such systems and methods may be used for infrared imaging, such as thermal infrared imaging. In one embodiment, the imaging device includes a detector array, optical element(s) to direct electromagnetic radiation associated with a scene to the detector array, and a lens barrel within which the optical element(s) are disposed and held/fixed. The imaging device may include a housing coupled to the lens barrel. The housing may include (e.g., enclose) the detector array. In some cases, the housing may include logic to process image data from the detector array, memory to store raw image data and/or processed image data, a battery, and/or other components to facilitate operation of the imaging device. As non-limiting examples, the optical elements may include lens elements, windows, mirrors, beam splitters, beam couplers, and/or other components. In one aspect, an imaging device includes a lens system including a front lens group (e.g., also referred to as a front focus group) and a rear lens group (e.g., also referred to as a rear focus group). In some cases, the imaging device may also include other optical elements upstream of the lens groups, downstream of the lens groups, and/or interspersed between the two lens groups.
The detector array may receive electromagnetic radiation directed (e.g., projected, transmitted) onto the detector array by the lens element(s). In this regard, the electromagnetic radiation may be considered image data. The detector array may generate an image based on the electromagnetic radiation. The lens element(s) and/or other optical element(s) of the imaging device may transmit electromagnetic radiation within a wavelength band that depends on the desired application. In one aspect, the imaging device may be an infrared imaging device for facilitating capture of a band of wavelengths including at least a portion of the thermal infrared spectrum, such as the Long Wave Infrared (LWIR) spectrum. In infrared imaging applications, the detector array may include an array of microbolometers and/or an array of other types of infrared detectors. As non-limiting examples, the lens element may include silicon, germanium, chalcogenide glass (e.g., As40Se60), germanium arsenic selenide (GeAsSe), Ge22As20Se58, Ge33As12Se55, zinc selenide, organic materials such as polyethylene and poly(4-methyl-1-pentene) (TPX), and/or any lens material generally suitable for infrared applications. Accordingly, the lens material used to fabricate the lens element(s) is generally based on the desired application. For example, the lens material may be selected to allow a desired transmission band of the lens element.
In some embodiments, a wide field of view imaging device, such as an ultra-wide field of view (UWFOV) imaging device, may include lens elements formed using a Wafer Level Optics (WLO) fabrication process. In some aspects, the imaging device may be an LWIR camera. As a non-limiting range, the UWFOV imaging device may provide a field of view (FOV) of between about 110° and about 220° (e.g., about 120° to about 160° in some cases). In this regard, in some cases, the lens system of the imaging device may be designed to provide a FOV exceeding 180° to allow the imaging device to capture scene data (e.g., image data in the form of electromagnetic radiation) behind the imaging device.
WLO fabrication processes (e.g., forming a polymer structure on a substrate followed by transfer etching) are typically associated with lower costs than other fabrication processes, and thus the option of having lens elements formed using WLO fabrication processes in an imaging device allows for cost savings. The lens elements formed as part of the wafer level process may then be singulated to obtain individual lens elements that may be disposed in an imaging device. The lens shapes used in UWFOV applications are typically outside the WLO manufacturing design rule range. In this regard, WLO processes impose limitations on the lens shapes that can be produced. As a typical example of WLO manufacturing design rules, the maximum lens sagittal height should not exceed 0.3mm and the maximum slope along the curvature should not exceed about 15 ° or 16 °.
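As a rough numeric illustration of these design rules, the short Python sketch below checks a candidate spherical surface against the sag and slope limits quoted above; the radius of curvature, the semi-aperture, the choice of the 16° limit, and the function names are assumptions for illustration, not values from this disclosure.

```python
import math

# Illustrative check of a candidate spherical surface against the WLO
# design rules quoted above. The radius and semi-aperture below are
# hypothetical values, not from this disclosure.
MAX_SAG_MM = 0.3
MAX_SLOPE_DEG = 16.0

def spherical_sag_mm(radius_mm: float, semi_aperture_mm: float) -> float:
    """Sag of a spherical cap: z = R - sqrt(R^2 - h^2)."""
    return radius_mm - math.sqrt(radius_mm**2 - semi_aperture_mm**2)

def max_slope_deg(radius_mm: float, semi_aperture_mm: float) -> float:
    """Steepest slope of a spherical surface occurs at its edge: asin(h / R)."""
    return math.degrees(math.asin(semi_aperture_mm / radius_mm))

R, h = 6.0, 1.5  # hypothetical radius of curvature and semi-aperture, mm
sag, slope = spherical_sag_mm(R, h), max_slope_deg(R, h)
print(f"sag {sag:.3f} mm vs limit {MAX_SAG_MM} mm -> {'OK' if sag <= MAX_SAG_MM else 'out of rule'}")
print(f"slope {slope:.1f} deg vs limit {MAX_SLOPE_DEG} deg -> {'OK' if slope <= MAX_SLOPE_DEG else 'out of rule'}")
```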
To provide ultra-wide FOV imaging capability, an imaging device includes a front lens group and a rear lens group, where each lens group includes one or more lens elements. The rear lens group may be formed/produced using a WLO process (e.g., due to its lower cost relative to other manufacturing processes). Thus, the lens element(s) of the rear lens group are designed to meet the rules associated with WLO manufacturing and thus have a lower sagittal height and lower slope. In some cases, the lens element(s) of the rear lens group are aspheric lens element(s). The front lens group may be formed/produced using a grinding process, a polishing process, a diamond turning process, and/or a molding process. In some aspects, the two curvatures of the lens element(s) of the front lens group may be designed as spherical surfaces. A grinding/polishing process may be used to create the curvatures to provide a double-sided polished spherical lens element. In such aspects, given the spherical shape of the spherical lens element, the spherical lens element(s) of the front lens group may be formed/produced using a grinding/polishing process, which is generally less expensive than diamond point turning or molding processes. Thus, depending on the application, a cost-effective process may be used to form the front lens group and/or the rear lens group.
The lens element(s) of the front lens group may have a greater sagittal height and a steeper slope to collectively provide a UWFOV. In this regard, the front lens group may be designed to provide a desired FOV for the imaging device. In one aspect, the front lens group may be referred to as a front fisheye group. The lens element(s) of the rear lens group may collectively provide a narrower FOV than that provided by the front lens group. Thus, using various embodiments, the lower costs associated with WLO fabrication may be achieved together with ultra-wide FOV imaging by designing the lens element(s) of the rear lens group to meet the rules associated with WLO fabrication, while designing the lens element(s) of the front lens group with the larger sagittal height and steeper slope suitable to allow image capture of an ultra-wide FOV scene.
In some embodiments, the use of one or more spherical lens elements to form the front lens group allows mitigation of calibration and gain correction process challenges that may be associated with a UWFOV. By using spherical lens element(s), the calibration and gain correction process may be performed using only the rear lens group (e.g., without the front lens group). The front lens group may then be mounted in front of the rear lens group after such calibration and gain correction procedures. As further described herein, when the front lens group is formed of one or more spherical lens elements, the Relative Illumination (RI) curve associated with only the rear lens group is substantially the same as the RI curve associated with the front and rear lens groups together. Because the spherical shape of the lens element(s) of the front lens group does not change the resulting RI curve (e.g., has minimal or no effect on it), the calibration and gain correction process can be performed with the rear lens group alone. The gain map may be determined based at least in part on the RI curve determined using only the rear lens group. In this regard, after the rear lens group is calibrated, the front lens group may be positioned in front of the rear lens group without changing the gain map determined from the calibration using only the rear lens group.
Using only the rear lens group allows for a calibration setup that is generally easier to implement than using the front and rear lens groups together. The calibration setup for a wide-FOV lens element such as the front lens element(s) involves capturing images of a correspondingly large blackbody (e.g., a large flat uniform blackbody of known temperature) that subtends the entire FOV produced by the wide-FOV lens element, whereas the calibration setup associated with using only the rear lens group involves capturing images of a smaller blackbody (e.g., more readily available and/or more readily implemented than a larger blackbody) that subtends the smaller FOV produced by the rear lens group (e.g., compared to the FOV produced by the rear lens group and the front lens group together). Thus, using the various embodiments, the arrangement of lens elements set forth in accordance with the various embodiments may allow for beneficial cost and process (e.g., manufacturing process, gain calibration process) characteristics associated with imaging device (e.g., UWFOV LWIR camera) manufacture and operation.
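To make the size difference concrete: a flat blackbody that subtends a full FOV of θ at a standoff distance d must be roughly 2·d·tan(θ/2) wide. The Python sketch below runs this arithmetic; the 100 mm standoff and the function name are assumptions for illustration, while the ~60° and ~160° FOVs are the example values given later for the rear lens group alone and for the full lens system.

```python
import math

def flat_target_width_mm(distance_mm: float, fov_deg: float) -> float:
    """Width of a flat target that subtends a given full FOV at a
    given standoff distance: w = 2 * d * tan(FOV / 2)."""
    return 2.0 * distance_mm * math.tan(math.radians(fov_deg / 2.0))

d = 100.0  # hypothetical standoff distance in mm
print(f"rear lens group only (~60 deg FOV): ~{flat_target_width_mm(d, 60.0):.0f} mm wide")
print(f"front + rear lens groups (~160 deg FOV): ~{flat_target_width_mm(d, 160.0):.0f} mm wide")
# Note: a FOV of 180 deg or more cannot be subtended by any flat target.
```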
Although the various embodiments are described primarily with respect to infrared imaging, the methods and systems disclosed herein may be used in connection with a variety of devices and systems, such as imaging systems having visible and infrared imaging capabilities, mid-wave infrared (MWIR) imaging systems, short-wave infrared (SWIR) imaging systems, light detection and ranging (LIDAR) imaging systems, radio detection and ranging (RADAR) imaging systems, millimeter wavelength (MMW) imaging systems, ultrasound imaging systems, X-ray imaging systems, microscope systems, mobile digital cameras, video surveillance systems, video processing systems, or other systems or devices that may need to acquire image data in one or more portions of the EM spectrum.
Referring now to the drawings, fig. 1 illustrates a block diagram of an imaging device 100 in accordance with one or more embodiments of the present disclosure. In an embodiment, the imaging device 100 may be an infrared imaging device. The imaging device 100 may be used to capture and process image frames. The imaging device 100 includes an optical component 105, an image capture component 110, an image capture interface component 115, and an optional shutter component 120.
The optical component 105 may receive electromagnetic radiation through the aperture 125 of the imaging device 100 and pass the electromagnetic radiation to the image capture component 110. For example, the optical component 105 may direct and/or focus electromagnetic radiation onto the image capture component 110. The optical component 105 may include one or more windows, lenses, mirrors, beam splitters, beam couplers, and/or other components. In embodiments, the optical component 105 may include one or more chalcogenide lenses, e.g., lenses made of As40Se60, that allow imaging in a broad infrared spectrum. Other materials may be utilized, such as silicon, germanium, and GeAsSe. The optical component 105 may include components that are each formed of a material and appropriately arranged according to desired transmission characteristics (e.g., desired transmission wavelengths and/or light transmission characteristics).
In one embodiment, the image capture component 110 includes one or more sensors (e.g., visible light sensors, infrared sensors, or other types of detectors) for capturing image signals representative of an image of the scene 130. The image capturing component 110 can capture (e.g., detect, sense) infrared radiation having a wavelength in the range of about 700nm to about 1mm or a portion thereof. For example, in some aspects, the image capture component 110 may include one or more sensors that are sensitive (e.g., better detect) to thermal infrared wavelengths, including LWIR radiation (e.g., electromagnetic radiation having a wavelength of 7-14 μm). The sensor(s) of the image capture component 110 can represent (e.g., convert) or facilitate representing the captured thermal image signal of the scene 130 as digital data (e.g., via an analog-to-digital converter).
The image capture interface component 115 may receive image data captured at the image capture component 110 and may transmit the captured image data to other components or devices, for example, via wired and/or wireless communication. In various embodiments, the imaging device 100 may capture image frames of, for example, the scene 130.
In some embodiments, the optical component 105, the image capture component 110, and the image capture interface component 115 may be housed in a protective enclosure. In one instance, the protective enclosure may include a lens barrel (e.g., also referred to as a lens housing) that houses the optical component 105 and a housing that houses the image capture component 110 and/or the image capture interface component 115. In this case, the lens barrel may be coupled to the housing. In one aspect, the protective enclosure may be represented by a solid line box with an aperture 125 in fig. 1. For example, aperture 125 may be an opening defined in the protective housing that allows electromagnetic radiation to reach optical component 105. In some cases, aperture 125 may be an aperture stop of imaging device 100.
Each optical element (e.g., lens element) may include at least one mating feature (e.g., also referred to as a mounting feature). The lens barrel may have mating feature(s) coupled to the optical element(s) to receive and secure the corresponding mating feature(s) of the optical element(s). In this regard, each mating feature of the optical element may be coupled to a corresponding mating feature of the lens barrel to couple the optical element to the lens barrel. In one example, the mating features of the optical element may include a first surface and a second surface at an angle (e.g., 90 ° angle, obtuse angle, or acute angle) relative to the first surface, and the mating features of the lens barrel may have corresponding surfaces to couple to the first surface and the second surface. In another example, the mating feature of the optical element may include a pin portion, and the mating feature of the lens barrel may include a slot portion for receiving the pin portion, and/or vice versa. More generally, the mating feature(s) of the optical element and the corresponding mating feature(s) of the lens barrel may be any structure (e.g., a notch, hole, pin, or other structure) that facilitates coupling of the optical element with the lens barrel.
In some cases, the mating features of the lens elements may be adapted to facilitate rotation and/or other movement of the lens elements. In some cases, mating features may be used to facilitate alignment of lens elements during molding, processing, and/or assembly, such as through pattern recognition. For example, one or more mating features on a surface of a lens element may be positioned (e.g., using pattern recognition to scan the surface) to facilitate processing different surfaces of the lens element according to a desired design. As another example, mating feature(s) of surface(s) of the first lens element and/or mating feature(s) of surface(s) of the second lens element may be used to facilitate alignment of the first lens element relative to the second lens element.
The shutter component 120 may be operated to be selectively inserted into the optical path between the scene 130 and the optical component 105 to expose or block the aperture 125. In some cases, the shutter component 120 may be moved (e.g., slid, rotated, etc.) manually (e.g., by a user of the imaging device 100) and/or via an actuator (e.g., controllable by a logic device in response to user input or autonomously, such as when the logic device autonomously decides to perform a calibration of the imaging device 100). When the shutter component 120 is out of the optical path so that the aperture 125 is exposed, electromagnetic radiation from the scene 130 may be received by the image capture component 110 (e.g., via one or more optical components and/or one or more filters). Thus, the image capture component 110 captures an image of the scene 130. The shutter component 120 may be said to be in an open position or simply open. When the shutter component 120 is inserted into the optical path to block the aperture 125, electromagnetic radiation from the scene 130 is blocked from reaching the image capture component 110. Thus, the image capture component 110 captures an image of the shutter component 120. The shutter component 120 may be said to be in a closed position or simply closed.
In some aspects, the shutter component 120 may block the aperture 125 during a calibration process, in which the shutter component 120 may act as a uniform blackbody (e.g., a substantially uniform blackbody). For example, the shutter component 120 may function as a single temperature source or substantially a single temperature source. In some cases, the shutter component 120 may be temperature controlled to provide a temperature-controlled uniform blackbody (e.g., to present a uniform radiation field to the image capture component 110). For example, in some cases, the surface of the shutter component 120 imaged by the image capture component 110 may be implemented with a uniform blackbody coating. In some cases, such as for imaging devices without a shutter component or with a broken shutter component, or as an alternative to the shutter component 120, a housing or holster, lens cap, cover, wall of a room, or other suitable object/surface of the imaging device 100 may be used to provide a uniform blackbody (e.g., a substantially uniform blackbody) and/or a single temperature source (e.g., a substantially single temperature source).
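One way such a closed shutter could be used is sketched below: a frame of the uniform shutter yields a per-pixel offset map that is subtracted from later scene frames. The array size, signal levels, and function names are illustrative assumptions, not details from this disclosure.

```python
import numpy as np

def offset_map_from_shutter(shutter_frame: np.ndarray) -> np.ndarray:
    """Per-pixel offset derived from an image of the closed shutter.

    With the shutter acting as a uniform blackbody, every detector views
    the same radiance, so deviations from the frame mean are treated as
    fixed-pattern offset."""
    return shutter_frame - shutter_frame.mean()

def apply_offset_correction(scene_frame: np.ndarray, offset: np.ndarray) -> np.ndarray:
    return scene_frame - offset

# Usage with synthetic frames standing in for real captures:
rng = np.random.default_rng(0)
fixed_pattern = rng.normal(0.0, 50.0, size=(120, 160))  # per-pixel nonuniformity
shutter_frame = 8000.0 + fixed_pattern                  # uniform source + pattern
scene_frame = 9000.0 + fixed_pattern                    # scene flux + same pattern
corrected = apply_offset_correction(scene_frame, offset_map_from_shutter(shutter_frame))
```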
Although in fig. 1 the shutter component 120 is located in front of all of the optical components 105 (e.g., closer to the scene 130 than all of the optical components), the shutter component 120 may instead be located between optical components. For example, the optical component 105 may include a first set of one or more lens elements and a second set of one or more lens elements, with the shutter component 120 selectively interposed between a last lens of the first set of lens elements and a first lens of the second set of lens elements. Further, alternatively or additionally, although the shutter component 120 is shown located on or near an outer surface of the housing of the imaging device 100, the shutter component 120 may be located within the housing of the imaging device 100. In some aspects, the imaging device 100 may include no shutter component or more than one shutter component.
Imaging device 100 may represent any type of camera system that, for example, detects electromagnetic radiation (e.g., thermal radiation) and provides representative data (e.g., one or more still image frames or video image frames). For example, the imaging device 100 may be configured to detect visible and/or infrared radiation and provide associated image data. In some cases, imaging device 100 may include other components, such as a heater, a temperature sensor (e.g., for measuring an absolute temperature of a component of imaging device 100), a filter, a polarizer, and/or other components. For example, an integrated heater may be coupled to a lens barrel of the imaging device 100.
Fig. 2 illustrates a perspective view of an imaging device 200 in accordance with one or more embodiments of the present disclosure. As one example, imaging device 200 may be an LWIR thermal camera (e.g., for capturing electromagnetic radiation having a wavelength of 7-14 μm). In other cases, imaging device 200 may be used to capture electromagnetic radiation in other wavelength ranges.
The imaging device 200 may include a lens barrel 205 configured to house at least one lens element 210. The lens barrel 205 may include structure that holds/secures (e.g., fixedly secures, movably secures) the lens element 210. The imaging device 200 may also include an image capturing portion 215 including an image capturing component configured to capture an image viewed through the lens barrel 205. The image capturing portion 215 may include a microbolometer array configured to detect EM radiation. As one example, the microbolometer array may be configured to detect long-wave infrared light having a wavelength between 7.5 μm and 13.5 μm. In an embodiment, the lens barrel 205 may be a lens barrel of the imaging device 100 of fig. 1. In an embodiment, the imaging device 200 may be the imaging device 100 of fig. 1. In this embodiment, the optical component 105 of fig. 1 may include at least the lens element 210, and the image capturing component 110 of fig. 1 may include the image capturing portion 215.
In some cases, the lens barrel 205 may be configured to accommodate a window in front of the lens element 210 (e.g., closer to the scene than the lens element). The window may selectively pass electromagnetic radiation of the scene. In some cases, the window may be a protective window placed in front of lens element 210 to protect lens element 210 and/or other components of imaging device 200 from environmental, mechanical, and/or other damage. The physical properties of the window (e.g., material composition, thickness, and/or other dimensions, etc.) may be determined according to the wavelength band(s) desired to be transmitted through the window. The lens barrel 205 may include structure that retains/secures (e.g., fixedly secures, movably secures) the window and/or the lens element 210.
Fig. 3 illustrates a cross-sectional view of an optical system 300 in accordance with one or more embodiments of the present disclosure. The optical system 300 is oriented in three orthogonal directions denoted X, Y and Z. The X-direction and the Y-direction may be referred to as a horizontal direction and a vertical direction, respectively. In particular, fig. 3 shows a cross-sectional view of an optical system 300 in the YZ plane. The optical system 300 includes a front lens group 305, a rear lens group 310, a window 315, a detector array 320, and a shutter member 325. In an embodiment, the optical component 105 of fig. 1 may include a front lens group 305, a rear lens group 310, and a window 315, and the image capturing component 110 of fig. 1 may include a detector array 320.
Front lens group 305 includes lens element 335. Front lens group 305 may provide a wide FOV, such as a UWFOV. In some aspects, the lens element 335 may be a spherical lens element. The spherical lens element may be formed by a grinding/polishing process. In some cases, both surfaces of the lens element 335 may be spherical. Rear lens group 310 includes lens elements 340 and 345. In some aspects, lens elements 340 and 345 may be aspheric lens elements. Lens elements 340 and 345 may be formed by a WLO process. Where lens elements 340 and 345 are different, lens elements 340 and 345 may be formed as part of one wafer level process (e.g., a wafer level process that may be used to obtain lens elements of different shapes and/or sizes) or in two separate wafer level processes. Lens elements 340 and 345 form a doublet. Each of the lens elements 335, 340, and 345 (e.g., as well as other optical components not labeled or shown in fig. 3) may have particular optical characteristics, such as a particular Effective Focal Length (EFL) and transmitted wavefront. In general, each additional lens element provided may allow more degrees of freedom with respect to the characteristics (e.g., shape, size such as curvature) defined for each lens element to achieve a desired performance. Examples of materials for lens elements 335, 340, and/or 345 may include As40Se60, Ge22As20Se58, Ge33As12Se55, germanium, zinc selenide, silicon, polyethylene, and TPX. In some cases, one or more coatings may be disposed on lens elements 335, 340, and/or 345. As non-limiting examples, the coating may be an anti-reflective (AR) coating, a polarizing coating, an impact-resistant coating, and/or other coatings.
Lens elements 335, 340, and 345 may cooperate to direct and focus infrared light onto detector array 320. Lens element 335 receives electromagnetic radiation and directs the received electromagnetic radiation to lens element 340 of rear lens group 310. Lens element 340 receives electromagnetic radiation from lens element 335 and directs the electromagnetic radiation received from lens element 335 to lens element 345. Lens element 345 receives electromagnetic radiation from lens element 340 and directs the electromagnetic radiation received from lens element 340 to detector array 320. Thus, front lens group 305 and rear lens group 310 together project a scene onto detector array 320. In this regard, fig. 3 illustrates at least a portion of a scene ray traced through front lens group 305 and rear lens group 310 to detector array 320. As shown in fig. 3, the lens element 335 may be a refractive lens element. Lens elements 340 and 345 may be plano-convex lens elements. The lens element 335 has a surface A and a surface B opposite the surface A. The surface A of the lens element 335 faces the scene. Lens element 340 has a surface D and a surface E opposite surface D. Surface D of lens element 340 faces surface B of lens element 335. Lens element 345 has a surface I and a surface J opposite surface I. Surface I of lens element 345 faces surface E of lens element 340. The surface J of the lens element 345 faces the window 315.
As a non-limiting example, the distance between surface B of lens element 335 and surface D of lens element 340 may be between about 4 mm and about 5 mm. As a non-limiting example, the thickness of each of lens elements 340 and 345 may be between about 0.5 mm and about 1.5 mm. The thicknesses of lens elements 340 and 345 are typically selected for lower mass (e.g., associated with lower cost) while providing sufficient mechanical stability. As a non-limiting example, dimension L (e.g., extending from about the bottom surface to the top surface of lens element 335) may be from about 7 mm to about 500 mm. As a non-limiting example, dimension H (e.g., extending from around surface A of lens element 335 to surface J of lens element 345) may be from about 5 mm to 300 mm. The dimensions H and L generally depend on the image diagonal of the detector array 320. For a given pixel size, a larger number of pixels is typically associated with a larger lens. As one example, L may be referred to as the length of an imaging device (e.g., camera) and H may be referred to as the height of the imaging device or, vice versa, L may be referred to as the height and H as the length. As a non-limiting example, the thickness of window 315 may be from about 0.4 mm to about 1 mm. As a non-limiting example, the gap between window 315 and detector array 320 may be about 0.1 mm.
A window 315 is provided in front of detector array 320 to selectively pass electromagnetic radiation to detector array 320. The physical properties of window 315 (e.g., material composition, thickness, and/or other dimensions, etc.) may be determined based on the wavelength band(s) desired to be transmitted through window 315. Window 315 may be provided as a cover for detector array 320. The window 315 may be configured to protect the detector array 320 and create a vacuum between a sensor (e.g., a microbolometer) of the detector array 320 and the window 315. In some cases, window 315 may be used to provide filtering, polarization, and/or other optical effects in addition to protection. In some cases, one or more coatings (e.g., polarizing coatings, AR coatings, impact resistant coatings) may be provided over window 315 to provide filtering, polarization, protection, and/or other effects.
The detector array 320 receives the electromagnetic radiation and generates an image based on the electromagnetic radiation. In one aspect, processing circuitry downstream of the detector array 320 may be used to process the image. As non-limiting examples, detector array 320 may have dimensions of 160 x 120 sensors (e.g., 160 x 120 microbolometer array), 320 x 256 sensors, and 1280 x 1024 sensors.
Although in optical system 300 the front lens group 305 has a single lens element and the rear lens group 310 has two lens elements, in some embodiments the front lens group 305 has more than one lens element and/or the rear lens group 310 has more or fewer than two lens elements. As one example, providing more lens elements (e.g., one or more additional spherical lens elements) in the front lens group 305 may facilitate a widening of the FOV associated with the front lens group 305. In this regard, each additional lens element may facilitate a widening of the FOV associated with the front lens group 305. As one example, instead of or in addition to providing more lens elements in the front lens group 305, providing more lens elements (e.g., one or more additional aspheric lens elements) in the rear lens group 310 may allow electromagnetic radiation to be projected onto a larger detector array (e.g., having more rows and/or columns of sensors).
Shutter member 325 may be operated to be selectively inserted into the optical path between the scene and rear lens group 310 to expose or block the scene from detector array 320. In some cases, the shutter member 325 may move (e.g., slide, rotate, etc.) manually (e.g., by a user) and/or via an actuator (e.g., controllable by the logic device in response to user input or autonomously, e.g., autonomously decided by the logic device to perform calibration of the imaging device). In some aspects, the shutter member 325 may block the detector array 320 from the scene during a calibration process, wherein the shutter member 325 may function as a uniform blackbody (e.g., a substantially uniform blackbody), as further described herein.
An aperture stop 350 is positioned/defined in front of the rear lens group 310. The aperture stop 350 defines the amount of light transmitted to the detector array 320. The aperture stop 350 may have a spatial size comparable to that of the rear lens group 310. The aperture stop 350 may be defined by physical properties of the lens element 340 (e.g., size, shape, and material of the front surface of the lens element 340) as well as physical properties of the structure that holds the lens element 340. For example, the structure may be part of a lens barrel (e.g., lens barrel 205). In one case, the structure may be a metallic structure at least partially in front of the lens element 340. As one example, the structure may be a metal structure having a shape conforming to the front surface of the lens element 340.
In an embodiment, lens elements 335, 340, and/or 345 may be movable relative to detector array 320 in order to facilitate horizontal field of view alignment with the horizontal direction of detector array 320 and vertical field of view alignment with the vertical direction of detector array 320. In some aspects, the lens elements 335, 340, and/or 345 may be moved via a sliding motion (e.g., translational motion) to facilitate focusing, for example, through the use of one or more actuators coupled to the lens elements 335, 340, and/or 345. In one case, the sliding motion may be along the Z-axis (e.g., a direction perpendicular to the focal plane) while maintaining a fixed angular orientation. In these aspects, the focusing mechanism of lens elements 335, 340, and/or 345 may include a means (e.g., an actuator) for moving lens elements 335, 340, and/or 345. In some aspects, the lens element(s) may be focused by rotating the lens element(s) inside a threaded housing. In some aspects, the housing is unthreaded. The housing may allow for a linear sliding-fit arrangement rather than a screw-in arrangement, wherein the lens elements 335, 340, and/or 345 may be pushed into the housing and held in place at least by friction. Alternatively, some clearance may be provided between the barrel and the housing to allow active alignment of the optics with the detector array 320, with the barrel held in place by epoxy or another suitable adhesive.
In some embodiments, lens elements 335, 340, and 345 are each associated with a lens prescription. In some aspects, each prescription may be represented by a surface sag of the form:

z = cs / (1 + √(1 − (1 + K)c²s)) + A1s + A2s² + A3s³ + ... + A12s¹²

where s = x² + y²; c = 1/r; r is the radius of curvature; A1, A2, A3, A4, ..., A12 are aspheric deformation constants; and K is the conic constant.
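As an illustration of how this prescription can be evaluated, the Python sketch below computes the sag z at a point (x, y); the function name and sample values are assumptions for illustration, and the sanity check uses the fact that with K = 0 and all aspheric constants zero the formula reduces to the sag of a sphere.

```python
import math

def asphere_sag(x: float, y: float, r: float, K: float, A: list) -> float:
    """Surface sag z per the prescription above.

    s = x^2 + y^2 and c = 1/r; A holds the aspheric deformation
    constants A1..A12; K is the conic constant."""
    s = x * x + y * y
    c = 1.0 / r
    z = c * s / (1.0 + math.sqrt(1.0 - (1.0 + K) * c * c * s))
    for i, a in enumerate(A, start=1):  # polynomial terms A1*s .. A12*s^12
        z += a * s ** i
    return z

# Sanity check: with K = 0 and no aspheric terms, the formula reduces to
# the sag of a sphere, R - sqrt(R^2 - (x^2 + y^2)).
R = 6.0  # hypothetical radius of curvature, mm
z = asphere_sag(1.0, 0.0, R, 0.0, [0.0] * 12)
assert abs(z - (R - math.sqrt(R * R - 1.0))) < 1e-12
```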
Table 1 shows example values of various parameters of the optical system 300. For example, as shown in table 1 below, surface E of lens element 340 and surface J of lens element 345 are flat surfaces and thus have zero coefficients.
Fig. 4 illustrates a FOV, denoted as α, associated with (e.g., provided by) the optical system 300 of fig. 3 in accordance with one or more embodiments of the present disclosure. The FOV α depicted in fig. 4 is about 160°. More generally, in some embodiments, the FOV α may be between about 110° and about 220°. In this regard, in some embodiments, the optical system 300 may be designed to provide a FOV α of more than 180° to allow an imaging device including the optical system 300 to capture scene data (e.g., image data in the form of electromagnetic radiation) behind the imaging device. In some aspects, the FOV α may be between about 120° and about 160°.
Fig. 5 shows a field of view, denoted β, associated with the rear lens group 310 of the optical system 300 of fig. 3. The FOV β depicted in fig. 5 is about 60 °. In some embodiments, FOV β may be between about 50 ° to about 70 °.
In some embodiments, a gain calibration process may be performed on the optical system 300. In some aspects, the gain calibration process may involve capturing an image of a flat, uniform blackbody using the imaging device to create a gain map stored in the processing pipeline. To flatten the signal across the detector array, the signal drop due to relative illumination is compensated for with gain. RI refers to the illumination falloff that a lens element produces from the center of the field to the edge of the angular field. When the lens element 335 (e.g., and any other lens elements of the front lens group 305) is a spherical lens element, a calibration process, such as a gain calibration process, may be performed based solely on the rear lens group 310 (e.g., instead of the rear lens group 310 and the front lens group 305). In such an embodiment, due to the spherical shape of lens element 335, the RI curve associated with the optical system 300 including the lens element 335 and the rear lens group 310 is substantially the same as the RI curve associated with the rear lens group 310 alone. As an example, fig. 6 shows a graph 600 having an RI curve 605 associated with a lens system including the lens element 335 and the rear lens group 310 and an RI curve 610 associated with only the rear lens group 310. RI curves 605 and 610 are substantially identical (e.g., substantially overlap each other).
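A minimal sketch of this kind of gain calibration is shown below; the array size, the cos⁴-style falloff used as a stand-in for a real blackbody capture, the normalization to the frame mean, and the function names are all illustrative assumptions rather than details from this disclosure.

```python
import numpy as np

def gain_map_from_flat_field(flat: np.ndarray) -> np.ndarray:
    """Per-pixel gain that flattens an image of a flat, uniform blackbody.

    Pixels dimmed by relative-illumination falloff get gain > 1, so that
    gain * flat is constant across the detector array."""
    return flat.mean() / flat

def apply_gain(frame: np.ndarray, gain: np.ndarray) -> np.ndarray:
    return frame * gain

# Usage with a synthetic cos^4-style RI falloff standing in for a real
# capture of a uniform blackbody through the rear lens group:
yy, xx = np.indices((120, 160))
r = np.hypot(yy - 59.5, xx - 79.5) / 100.0    # normalized field radius
flat = 8000.0 * np.cos(np.arctan(r)) ** 4     # center-to-edge falloff
gain = gain_map_from_flat_field(flat)
assert np.allclose(apply_gain(flat, gain), flat.mean())  # flattened signal
```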
The gain map may be determined based at least in part on the RI curve determined using only the rear lens group 310. In this regard, after the rear lens group 310 is calibrated, the front lens group 305 may be positioned in front of the rear lens group 310 without changing the gain map determined from the calibration. Accordingly, since the lens element 335 does not change the RI curve, gain calibration may be performed using only the rear lens group 310 instead of the rear lens group 310 and the lens element 335. Using only the rear lens group 310 allows for a calibration setup that is generally easier to implement, because the calibration setup for a wide-FOV lens element such as lens element 335 involves capturing images of a correspondingly large blackbody (e.g., a large flat uniform blackbody) that subtends the entire FOV produced by the wide-FOV lens element. The calibration setup associated with using only the rear lens group 310 (e.g., as shown in fig. 5) involves capturing images of a smaller blackbody that subtends the smaller FOV produced by the rear lens group 310 (e.g., as compared to the FOV produced by the rear lens group 310 and the lens element 335). In some cases, the use of a smaller blackbody allows for cost-effective batch-level calibration.
The calibration setup may include a reference object (e.g., also referred to as a reference source) located in the field of view of the detector array 320. The reference object may be at a known temperature (e.g., a precisely measured and/or controllable temperature) and provide a uniform blackbody. In this regard, the reference object may serve as a single temperature source or substantially a single temperature source. In some cases, the reference object may be the shutter member 325 (e.g., an integrated shutter) that selectively closes to block the detector array 320. A logic device may control an actuator to close the shutter member 325, or a user may close the shutter member 325 manually (e.g., by manually controlling the actuator or manually closing the shutter member 325). In some cases, the reference source may be an external reference object provided in the scene. Such external objects may be referred to as external shutters or as providing external shutters.
Although it is described above that the gain calibration process is performed using only the rear lens group 310 and the front lens group 305 is then installed, in other embodiments, appropriate equipment, environments, and/or imaging device designs may be readily available such that the gain calibration process may be performed on the rear lens group 310 together with the front lens group 305.
In some embodiments, using a spherical prescription for the front lens element(s) of the front lens group 305 may provide low sensitivity of lens performance to figure errors in the prescription(s) of the front lens element(s) and to their positions relative to the rear lens elements of the rear lens group 310. As an example, figs. 7A-7F each show graphs illustrating the low sensitivity of a performance metric (e.g., modulation transfer function, FOV, or RI) to figure errors and to the position of lens element 335 (e.g., relative to the rear lens group 310). For purposes of explanation, lens element 335 has the prescription provided in table 1. Fig. 7A shows a graph 705 illustrating the low sensitivity of the on-axis Modulation Transfer Function (MTF) to the radius associated with surface A (e.g., in mm), denoted as the A radius, and to the radius associated with surface B (e.g., in mm), denoted as the B radius. Fig. 7B shows a graph 710 illustrating the low sensitivity of the on-axis MTF to the thickness of lens element 335, expressed as the AB thickness (e.g., the distance between surface A and surface B), and to the distance between surface B of lens element 335 and surface E of lens element 340 (shown as the B-E air gap). Fig. 7C shows a graph 715 illustrating the low sensitivity of RI to the AB thickness and the B-E air gap. Fig. 7D shows a graph 720 illustrating the low sensitivity of the FOV to the A radius and the B radius. Fig. 7E shows a graph 725 illustrating the low sensitivity of the FOV to the AB thickness and the B-E air gap. Fig. 7F shows a graph 730 illustrating the low sensitivity of RI to the A radius and the B radius. In some embodiments, such low sensitivity may be utilized to allow focusing of a lens system including the front lens group 305 and the rear lens group 310 using an autofocus device (e.g., a cost-effective autofocus device) as part of the manufacture of an imaging apparatus including the lens system.
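A tolerance study like the one behind figs. 7A-7F can be organized as a simple parameter sweep. The sketch below shows the pattern; the metric here is a stand-in edge-sag calculation so the example runs, whereas in practice the metric (on-axis MTF, FOV, or RI) would come from a ray-trace of the lens system, and all names and values are assumptions for illustration.

```python
import numpy as np

def sensitivity_sweep(metric, nominal, rel_errors):
    """Tabulate a performance metric as one design parameter (e.g., the
    A radius or the B-E air gap) is perturbed about its nominal value."""
    return [(e, metric(nominal * (1.0 + e))) for e in rel_errors]

# Stand-in metric so the sketch is runnable: edge sag of a spherical
# surface with a fixed semi-aperture. A real study would replace this
# with a ray-traced MTF, FOV, or RI evaluation.
def edge_sag_mm(radius_mm, semi_aperture_mm=1.5):
    return radius_mm - np.sqrt(radius_mm**2 - semi_aperture_mm**2)

for err, val in sensitivity_sweep(edge_sag_mm, 6.0, np.linspace(-0.02, 0.02, 5)):
    print(f"A-radius error {err:+.1%}: metric = {val:.4f} mm")
```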
Fig. 8 illustrates a cross-sectional view of an imaging device 800 in accordance with one or more embodiments of the present disclosure. However, not all depicted components may be required, and one or more embodiments may include additional components not shown in the figures. Variations in the arrangement and type of the components may be made without departing from the spirit or scope of the claims as set forth herein. Additional components, different components, and/or fewer components may be provided.
Imaging device 800 includes a lens barrel 805, a window 315, and a detector array 320. The window 315 and the detector array 320 are disposed in a housing 810 (e.g., a camera housing) of the imaging device 800. The lens barrel 805 includes body portions 815 and 820. The lens barrel 805 (e.g., body portions 815 and/or 820) includes structure(s) that hold/secure (e.g., fixedly, movably secured) optical element(s) such as lens elements. The body portion 815 may be referred to as a front body portion or a top body portion, and the body portion 820 may be referred to as a rear body portion or a bottom body portion. The body portion 815 and the body portion 820 may be formed as separate pieces that are then coupled together (e.g., using an adhesive, engagement features, etc.). In fig. 8, body portion 815 includes lens element 335 of front lens group 305 and body portion 820 includes lens elements 340 and 345 of rear lens group 310. The lens barrel 805 may allow the optical components disposed therein to maintain an axial position and/or an air gap therebetween. In some cases, a portion of the lens barrel 805 (e.g., a portion of the body portion 820) may be threaded to mate with a threaded portion of the housing 810. Such threads may facilitate focusing of the optical element relative to the focal plane array.
Fig. 9 illustrates a flowchart of an exemplary process 900 for manufacturing the imaging device 800 of fig. 8 in accordance with one or more embodiments of the present disclosure. For purposes of explanation, the example process 900 is described herein with reference to the components of fig. 8, 10A, 10B, 10C, and 10D. Fig. 10A, 10B, 10C, and 10D illustrate perspective views associated with manufacturing an imaging device 800. However, the example process 900 is not limited to the components of fig. 8, 10A, 10B, 10C, and 10D.
At block 905, the detector array 320 is provided. At block 910, optical components are formed. The optical components may include one or more windows (e.g., window 315) and/or one or more lens elements (e.g., lens elements 335, 340, and 345). In some cases, the lens element 335 may be a spherical lens element (e.g., with a spherical surface on both sides) formed using a grinding/polishing process. In some cases, lens elements 340 and 345 may be aspheric lens elements formed using a WLO process. For LWIR imaging applications, window 315 and lens elements 335, 340, and 345 may be formed of a material that transmits in the 7-14 μm band.
At block 915, the detector array 320 is disposed within a housing 810 (e.g., a camera housing) of the imaging device 800. Window 315 may be provided as a cover for detector array 320. The window 315 may be configured to protect the detector array 320 and create a vacuum between a sensor (e.g., a microbolometer) of the detector array 320 and the window 315. In designing the optical system 300, a specific distance between the lens element 335 and the detector array 320 is allocated to support a specific thickness of the window 315. At block 920, referring to fig. 10A, the rear lens group 310 is disposed at least partially within the lens barrel 805 (e.g., the body portion 820 of the lens barrel 805). In some aspects, both lens elements 340 and 345 may have mating features to couple to corresponding mating features of the lens barrel 805. At block 925, the body portion 820 of the lens barrel 805 is coupled to the housing 810.
At block 930, referring to fig. 10B, collet 1005, in which front lens element 1010 is positioned, is coupled/engaged to body portion 820 of lens barrel 805 to allow for precise focusing of the three-lens system formed by lens elements 340 and 345 and front lens element 1010. As a result of the focusing process, the three-lens system is focused with respect to the detector array 320. During focusing, the collet 1005 may engage with the body portion 820 of the lens barrel 805 using a torque locking feature (not shown) such that the three-lens system is focused relative to the detector array 320. The collet 1005 may provide a standard reference to facilitate focusing of the rear lens group relative to the front lens element 1010. It should be noted that the placement of lens elements 340 and 345 within body portion 820 of lens barrel 805 at block 920 need not be precise. The position of lens elements 340 and 345 may be determined/estimated based on an approximate number of thread turns. A more precise positioning of lens elements 340 and 345 (e.g., relative to front lens element 1010 and detector array 320) may be performed at block 930.
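By way of a non-limiting illustration, the coarse placement at block 920 and the finer focusing at block 930 may be sketched in Python as below. The 0.5 mm thread pitch and the gradient-based sharpness metric are illustrative assumptions, not values or methods disclosed herein.

```python
import numpy as np

def axial_position_mm(turns: float, thread_pitch_mm: float = 0.5) -> float:
    """Estimate axial travel of a threaded barrel from the number of turns.

    The 0.5 mm thread pitch is an illustrative assumption, not a value
    disclosed for lens barrel 805.
    """
    return turns * thread_pitch_mm

def sharpness(image: np.ndarray) -> float:
    """Simple focus metric: variance of the gradient magnitude."""
    gy, gx = np.gradient(image.astype(np.float64))
    return float(np.var(np.hypot(gx, gy)))

def best_focus_turns(frames_by_turns: dict) -> float:
    """Pick the barrel rotation (in turns) whose captured frame is sharpest."""
    return max(frames_by_turns, key=lambda t: sharpness(frames_by_turns[t]))
```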
At block 935, referring to fig. 10C, after the focusing process, the collet 1005 with the front lens element 1010 is removed (e.g., decoupled/disengaged from the body portion 820 of the lens barrel 805). As a result of the focusing, lens elements 340 and 345 of rear lens group 310 are properly positioned relative to detector array 320 (e.g., at a post-focus working distance). At block 940, gain correction/calibration is performed using the rear lens group 310 according to the focus positioning performed at block 930 to obtain a gain map. To perform the calibration, a reference object (e.g., an internal shutter, an external shutter, or other object) may be positioned over the FOV of the rear lens group 310 and image data captured by directing electromagnetic radiation to the detector array 320 using the rear lens group 310. The gain map may be determined based on the image data (e.g., using logic of the imaging device 800 and/or coupled to the imaging device 800). Because front lens element 1010 is used to facilitate focusing and collimating, front lens element 1010 may be referred to as a reference lens element or a collimating lens element.
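As a minimal sketch of how a per-pixel gain map of the kind determined at block 940 could be computed, assuming a stack of frames captured of a uniform reference object through the rear lens group (the normalization scheme below is an illustrative assumption, not the specific method of imaging device 800):

```python
import numpy as np

def gain_map_from_reference(ref_frames: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Derive a per-pixel gain map from frames of a uniform reference object
    (e.g., a shutter) imaged through the rear lens group only.

    ref_frames: stack of shape (num_frames, rows, cols).
    """
    mean_frame = ref_frames.mean(axis=0)         # average out temporal noise
    target = mean_frame.mean()                   # a uniform scene should read as one level
    return target / np.maximum(mean_frame, eps)  # pixels reading low receive gain > 1
```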
At block 945, the front lens group 305 is disposed at least partially within the body portion 815 of the lens barrel 805. At block 950, referring to fig. 10D, the body portion 815 of the lens barrel 805 with the front lens group 305 disposed therein is coupled to the body portion 820 of the lens barrel 805. The imaging device 800 is thus formed and may be used to capture images. Gain correction may be performed on these images using the gain map determined at block 940 based on the rear lens group 310 (e.g., without the front lens group 305 installed). In some cases, the front lens group 305 may undergo slight focus adjustment after installation and prior to use of the imaging device 800. In some aspects, the impact of geometric error variation across the population of front lens elements is typically small due to the design's low sensitivity to form errors of the front lens element and its position (e.g., assuming that the front lens elements are manufactured within proper tolerances). In some aspects, due to such low sensitivity, the lens element 335 of the front lens group 305 may be selected (e.g., randomly selected) from a population of lens elements manufactured according to a prescription associated with a desired front group lens element. In this regard, the rear lens group 310 may form a lens system with any lens element manufactured according to the prescription associated with the desired front group lens element, with minimal or no further adjustment required prior to use of the lens system including the front group lens element and the rear lens group 310. Each lens element in the population may be produced by the vendor(s) within a certain margin of error. It should be noted that the calibration at block 940 may be performed at the factory and/or in the field. In some cases, lens element 335 may be used and in-situ calibration performed without collet 1005.
Fig. 11 illustrates a flowchart of an exemplary process 1100 for using the imaging device 800 of fig. 8 in accordance with one or more embodiments of the present disclosure. For purposes of explanation, the example process 1100 is described herein primarily with reference to the imaging device 800. However, the exemplary process 1100 is not limited to the imaging device 800 of fig. 8. At block 1105, a lens system including front lens group 305 and rear lens group 310 receives electromagnetic radiation associated with a scene and directs the electromagnetic radiation to detector array 320. At block 1110, the detector array 320 receives the electromagnetic radiation from the lens system. In this regard, each detector of detector array 320 may receive a portion of the electromagnetic radiation from the lens system. At block 1115, the detector array 320 generates an image based on the electromagnetic radiation and a gain map. In some aspects, the gain map may be determined (e.g., at block 940) based on the calibration of the rear lens group 310 (e.g., without the front lens group 305 installed). In some aspects, the lens system may be adapted to transmit thermal infrared radiation, and the image generated by the detector array 320 may be a thermal infrared image. In some cases, the image generated by detector array 320 may be provided for processing, storage, and/or display. For example, the image may be provided to a processor for processing to remove distortion in the image, and the processed image may then be provided for storage, display, and/or further processing.
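A minimal sketch of the gain correction applied at block 1115, assuming the gain map from block 940 and optional per-pixel offsets are available (the correction order below is an illustrative assumption):

```python
from typing import Optional

import numpy as np

def apply_gain_correction(raw: np.ndarray, gain: np.ndarray,
                          offset: Optional[np.ndarray] = None) -> np.ndarray:
    """Apply a stored gain map (and optional per-pixel offsets) to a raw frame."""
    frame = raw.astype(np.float64)
    if offset is not None:
        frame = frame - offset   # remove fixed-pattern offset first, if available
    return frame * gain          # then equalize per-pixel responsivity
```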
Although in the optical system 300 referenced in fig. 3 and various other figures the front lens group 305 has a single lens element and the rear lens group 310 has two lens elements, in some embodiments the front lens group 305 has more than one lens element and/or the rear lens group 310 has more or fewer than two lens elements.
In one aspect, providing more lens elements (e.g., one or more additional spherical lens elements) in the front lens group 305 may facilitate a widening of the FOV associated with the front lens group 305. In this regard, each additional lens element may facilitate a widening of the FOV associated with the front lens group 305. As an example, fig. 12 shows a cross-sectional view of an optical system 1200 in accordance with one or more embodiments of the present disclosure. The description of fig. 3 applies generally to fig. 12, with examples of differences and other descriptions provided herein. The optical system 1200 includes a front lens group 1205, a rear lens group 1210, a window 1215, a detector array 1220, and a shutter member 1225. In embodiments, front lens group 1205, rear lens group 1210, window 1215, detector array 1220, and shutter member 1225 may be, may provide the same or similar functionality as, and/or may otherwise correspond to front lens group 305, rear lens group 310, window 315, detector array 320, and shutter member 325, respectively.
Front lens group 1205 includes lens elements 1235 and 1240. Front lens group 1205 may provide a wide FOV, such as a UWFOV. In some aspects, lens elements 1235 and 1240 can be spherical lens elements. The spherical lens elements may be formed by a grinding/polishing process. In some cases, both surfaces of lens elements 1235 and 1240 may be spherical. The additional lens element 1240 may facilitate the widening of the FOV relative to the front lens group 305 of fig. 3, which includes a single front lens element. Rear lens group 1210 includes lens elements 1245 and 1250. In some aspects, lens elements 1245 and 1250 may be aspheric lens elements. Lens elements 1245 and 1250 may be formed by a WLO process. In an embodiment, lens element 1235 may have the same or similar prescription/properties (e.g., material properties, applied coatings, etc.) as lens element 335, lens element 1245 may have the same or similar prescription/properties as lens element 340, and/or lens element 1250 may have the same or similar prescription/properties as lens element 345.
Lens elements 1235, 1240, 1245, and 1250 may coordinate to direct and focus infrared light onto detector array 1220. Lens element 1235 receives electromagnetic radiation and directs the received electromagnetic radiation to lens element 1240. Lens element 1240 receives electromagnetic radiation from lens element 1235 and directs the received electromagnetic radiation to lens element 1245. Lens element 1245 receives electromagnetic radiation from lens element 1240 and directs electromagnetic radiation received from lens element 1240 to lens element 1250. Lens element 1250 receives electromagnetic radiation from lens element 1245 and directs the electromagnetic radiation received from lens element 1245 to detector array 1220. Thus, front lens group 1205 and rear lens group 1210 together project a scene onto detector array 1220. In this regard, fig. 12 illustrates at least a portion of a scene ray traced through front lens group 1205 and rear lens group 1210 to detector array 1220. An aperture stop 1255 is positioned/defined in front of the rear lens group 1210. Aperture stop 1255 defines the amount of light transmitted to detector array 1220. The aperture stop 1255 may have a spatial size comparable to that of the rear lens group 1210.
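The sequential hand-off of radiation from element to element can be modeled, to first order, with paraxial ray-transfer (ABCD) matrices, as in the following sketch. The thin-lens focal lengths and air gaps are purely illustrative placeholders; the actual prescriptions of lens elements 1235, 1240, 1245, and 1250 are not stated here.

```python
import numpy as np

def thin_lens(f_mm: float) -> np.ndarray:
    """Paraxial (ABCD) matrix of a thin lens with focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f_mm, 1.0]])

def gap(d_mm: float) -> np.ndarray:
    """Paraxial matrix for propagation across an air gap of thickness d."""
    return np.array([[1.0, d_mm], [0.0, 1.0]])

# Rightmost factor acts first on the ray; focal lengths and gaps are
# illustrative placeholders, not the prescription of elements 1235-1250.
system = (gap(2.0) @ thin_lens(4.0)      # element 1250 -> detector plane
          @ gap(1.0) @ thin_lens(6.0)    # element 1245
          @ gap(3.0) @ thin_lens(-15.0)  # element 1240
          @ gap(2.5) @ thin_lens(-8.0))  # element 1235 (entered first)

ray_in = np.array([1.0, 0.2])  # [height (mm), angle (rad)] at the front element
ray_out = system @ ray_in      # height/angle arriving at the detector plane
print(ray_out)
```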
In one aspect, instead of or in addition to providing more lens elements in the front lens group 305, more lens elements may be provided in the rear lens group 310. Providing more lens elements (e.g., one or more additional aspheric lens elements) in the rear lens group 310 may allow electromagnetic radiation to be projected onto a larger detector array (e.g., a sensor having more rows and/or columns). As an example, fig. 13 shows a cross-sectional view of an optical system 1300 in accordance with one or more embodiments of the present disclosure. The description of fig. 3 and 12 applies generally to fig. 13, with examples of differences and other descriptions provided herein. The optical system 1300 includes a front lens group 1305, a rear lens group 1310, a window 1315, a detector array 1320, and a shutter member 1325. In an embodiment, front lens group 1305, rear lens group 1310, window 1315, detector array 1320, and shutter member 1325 may be, may provide the same or similar functionality as, and/or may otherwise correspond to front lens group 305, rear lens group 310, window 315, detector array 320, and shutter member 325, respectively.
Front lens group 1305 includes lens element 1335. Front lens group 1305 may provide a wide FOV, such as a UWFOV. In some aspects, the lens element 1335 may be a spherical lens element. The spherical lens element may be formed by a grinding/polishing process. In some cases, both surfaces of lens element 1335 may be spherical. Rear lens group 1310 includes lens elements 1340, 1345, and 1350. In some aspects, lens elements 1340, 1345, and 1350 may be aspheric lens elements. Lens elements 1340, 1345, and 1350 may be formed by a WLO process. The additional lens element 1345 may facilitate the projection of electromagnetic radiation onto a larger detector array relative to the rear lens group 310 of fig. 3, which includes two rear lens elements. In embodiments, lens element 1335 may have the same or similar prescription/properties (e.g., material properties, applied coatings, etc.) as lens element 335, lens element 1340 may have the same or similar prescription/properties as lens element 340, and/or lens element 1350 may have the same or similar prescription/properties as lens element 345.
Lens elements 1335, 1340, 1345, and 1350 may coordinate to direct and focus infrared light onto detector array 1320. Lens element 1335 receives the electromagnetic radiation and directs the received electromagnetic radiation to lens element 1340. Lens element 1340 receives electromagnetic radiation from lens element 1335 and directs the received electromagnetic radiation to lens element 1345. Lens element 1345 receives electromagnetic radiation from lens element 1340 and directs the electromagnetic radiation received from lens element 1340 to lens element 1350. Lens element 1350 receives electromagnetic radiation from lens element 1345 and directs electromagnetic radiation received from lens element 1345 to detector array 1320. Thus, front lens group 1305 and rear lens group 1310 together project a scene onto detector array 1320. In this regard, fig. 13 shows at least a portion of a scene ray traced through front lens group 1305 and rear lens group 1310 to detector array 1320. An aperture stop 1355 is positioned/defined in front of the rear lens group 1310. The aperture stop 1355 defines the amount of light transmitted to the detector array 1320. The aperture stop 1355 may have a spatial size comparable to that of the rear lens group 1310.
Fig. 14 illustrates a block diagram of an exemplary imaging system 1400 in accordance with one or more embodiments of the present disclosure. However, not all depicted components may be required, and one or more embodiments may include additional components not shown in the figures. Variations in the arrangement and type of the components may be made without departing from the spirit or scope of the claims as set forth herein. Additional components, different components, and/or fewer components may be provided.
Imaging system 1400 may be used to capture and process images in accordance with embodiments of the present disclosure. Imaging system 1400 may represent any type of imaging system that detects one or more ranges (e.g., bands) of EM radiation and provides representative data (e.g., one or more still image frames or video image frames). The imaging system 1400 may include an imaging device 1405. As non-limiting examples, the imaging device 1405 may be, may include, or may be part of an infrared camera, a visible light camera, a tablet, a laptop, a personal digital assistant (PDA), a mobile device, a desktop computer, or other electronic device. The imaging device 1405 may include a housing (e.g., a camera body) that at least partially encloses the components of the imaging device 1405, thereby facilitating compactness and protection of the imaging device 1405. For example, the solid-line box labeled 1405 in fig. 14 may represent the housing of the imaging device 1405. The housing may contain more, fewer, and/or different components of the imaging device 1405 than those depicted within the solid-line box in fig. 14. In an embodiment, the imaging system 1400 may include a portable device and may be incorporated into, for example, a vehicle or a non-mobile installation requiring images to be stored and/or displayed. The vehicle may be a land-based vehicle (e.g., an automobile, truck), a sea-based vehicle, an aircraft (e.g., an unmanned aerial vehicle (UAV)), a spacecraft, or generally any type of vehicle that may include the imaging system 1400 (e.g., mounted therein, mounted thereon, etc.). In another example, the imaging system 1400 may be coupled to various types of fixed locations (e.g., a home security mount, a campsite or outdoor mount, or other location) via one or more types of mounts.
According to one implementation, the imaging device 1405 includes a logic device 1410, a memory component 1415, an image capturing component 1420 (e.g., an imager, an image sensor device), an image interface 1425, a control component 1430, a display component 1435, a sensing component 1440, and/or a network interface 1445. In embodiments, imaging device 1405 may be, may include, or may be part of imaging device 100 of fig. 1 and/or imaging device 800 of fig. 8. According to various embodiments, logic device 1410 includes one or more of a processor, a microprocessor, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a single-core processor, a multi-core processor, a microcontroller, a Programmable Logic Device (PLD) (e.g., a Field Programmable Gate Array (FPGA)), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processing (DSP) device or other logic device, one or more memories for storing executable instructions (e.g., software, firmware, or other instructions), and/or any other suitable combination of processing devices and/or memories for executing instructions to perform any of the various operations described herein. Logic device 1410 may be configured by hardwired, executing software instructions, or a combination of both to perform the various operations discussed herein with respect to embodiments of the present disclosure. Logic device 1410 may be configured to interface and communicate with various other components of imaging system 1400 (e.g., 1415, 1420, 1425, 1430, 1435, 1440, 1445, etc.) to perform such operations. In one aspect, logic device 1410 may be configured to perform various system control operations (e.g., controlling communications and operations of various components of imaging system 1400) and other image processing operations (e.g., bayer decoding, sharpening, color correction, offset correction, bad pixel replacement, data conversion, data transformation, data compression, video analysis, etc.).
In one embodiment, memory component 1415 includes one or more memory devices configured to store data and information, including infrared image data and information. The memory component 1415 can include one or more of various types of memory devices, including volatile and non-volatile memory devices, such as random access memory (RAM), dynamic RAM (DRAM), static RAM (SRAM), non-volatile random access memory (NVRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, a hard disk drive, and/or other types of memory. As described above, logic device 1410 may be configured to execute software instructions stored in memory component 1415 in order to perform method and process steps and/or operations. The logic device 1410 and/or the image interface 1425 may be configured to store images or digital image data captured by the image capturing component 1420 in the memory component 1415.
In some embodiments, a separate machine-readable medium 1450 (e.g., a memory, such as a hard drive, optical disk, digital video disk, or flash memory) may store software instructions and/or configuration data that may be executed or accessed by a computer (e.g., a logic device or processor-based system) to perform various methods and operations, such as those associated with processing image data. In one aspect, the machine-readable medium 1450 may be portable and/or located separately from the imaging device 1405, wherein stored software instructions and/or data are provided to the imaging device 1405 by coupling the machine-readable medium 1450 to the imaging device 1405 and/or by downloading from the machine-readable medium 1450 by the imaging device 1405 (e.g., via a wired and/or wireless link). It should be appreciated that the various modules may be integrated in software and/or hardware as part of the logic device 1410, wherein code (e.g., software or configuration data) for the modules is stored in, for example, the memory component 1415.
The imaging device 1405 may be a video and/or still camera that captures and processes images and/or video of the scene 1475. In this regard, the image capturing component 1420 of the imaging device 1405 may be configured to capture images (e.g., still and/or video images) of the scene 1475 in a particular spectrum or modality. Image capturing component 1420 includes image detector circuit 1465 (e.g., visible-light detector circuitry, thermal infrared detector circuitry) and readout circuit 1470 (e.g., an ROIC). For example, the image capturing component 1420 may include an IR imaging sensor (e.g., an IR imaging sensor array) configured to detect IR radiation in the near, mid, and/or far IR spectra and provide an IR image (e.g., IR image data or signals) representative of the IR radiation from the scene 1475. For example, the image detector circuit 1465 may capture (e.g., detect, sense) IR radiation having a wavelength in the range of about 700 nm to about 2 mm, or a portion thereof. For example, in some aspects, the image detector circuit 1465 may be sensitive to (e.g., better able to detect) SWIR radiation, MWIR radiation (e.g., EM radiation having a wavelength of 2 μm to 5 μm), and/or LWIR radiation (e.g., EM radiation having a wavelength of 7 μm to 14 μm), or any desired IR wavelength (e.g., typically in the range of 0.7 μm to 14 μm). In other aspects, the image detector circuit 1465 may capture radiation from one or more other bands of the EM spectrum, such as visible light, ultraviolet light, and the like.
Image detector circuit 1465 may capture image data (e.g., infrared image data) associated with scene 1475. To capture the detector output image, the image detector circuit 1465 may detect image data (e.g., in the form of EM radiation) of the scene 1475 received through the aperture 1480 of the imaging device 1405 and generate pixel values of the image based on the scene 1475. An image may be referred to as a frame or an image frame. In some cases, image detector circuit 1465 may include a detector array (e.g., also referred to as a pixel array) that may detect radiation in a certain wavelength band, convert the detected radiation into electrical signals (e.g., voltages, currents, etc.), and generate pixel values based on the electrical signals. Each detector in the array may capture a respective portion of the image data and generate pixel values based on the respective portion captured by the detector. The pixel value generated by the detector may be referred to as the output of the detector. As non-limiting examples, each detector may be a photodetector, such as an avalanche photodiode, an infrared photodetector, a quantum well infrared photodetector, a microbolometer, or other detector capable of converting EM radiation (e.g., EM radiation of a certain wavelength) into a pixel value. The array of detectors may be arranged in rows and columns.
The detector output image may be or may be considered a data structure comprising pixels and is a representation of image data associated with scene 1475, each pixel having pixel values representing EM radiation emitted or reflected from a portion of scene 1475 and received by a detector that generates pixel values. Based on context, a pixel may refer to a detector of image detector circuit 1465 that generates an associated pixel value or a pixel (e.g., pixel location, pixel coordinates) of a detector output image formed from the generated pixel value. In one example, the detector output image may be an infrared image (e.g., a thermal infrared image). For a thermal infrared image (e.g., also referred to as a thermal image), each pixel value of the thermal infrared image may represent a temperature of a corresponding portion of scene 1475. In another example, the detector output image may be a visible light image.
In one aspect, the pixel values generated by the image detector circuit 1465 may be represented by digital count values generated based on electrical signals obtained from converting detected radiation. For example, where the image detector circuit 1465 includes or is otherwise coupled to an ADC circuit, the ADC circuit may generate a digital count value based on the electrical signal. In some embodiments, the ADC circuit may be a multi-range ADC circuit, such as a dual slope ADC circuit. For ADC circuits that can use 14 bits to represent an electrical signal, the digital count value can be in the range of 0 to 16,383. In such a case, the pixel value of the detector may be a digital count value output from the ADC circuit. In other cases (e.g., without ADC circuitry), the pixel value may be analog in nature, having a value that is or is indicative of the value of the electrical signal. As an example, for infrared imaging, a greater amount of IR radiation incident on and detected by image detector circuitry 1465 (e.g., IR image detector circuitry) is associated with a higher digital count value and a higher temperature.
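As a simple illustration of the count range noted above, a 14-bit quantization of a detector signal can be sketched as follows; the rail voltages are illustrative assumptions, not values taken from this disclosure.

```python
def to_digital_counts(voltage: float, v_min: float = 0.0,
                      v_max: float = 3.3, bits: int = 14) -> int:
    """Quantize a detector signal voltage into an ADC count value.

    With 14 bits the counts span 0..16383; the rail voltages here are
    illustrative, not values taken from this disclosure.
    """
    full_scale = (1 << bits) - 1     # 16383 for 14 bits
    frac = (voltage - v_min) / (v_max - v_min)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the ADC's input range
    return round(frac * full_scale)
```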
The readout circuit 1470 may serve as an interface between the image detector circuit 1465, which detects image data, and the logic device 1410, which processes the detected image data read out by the readout circuit 1470, with data communication from the readout circuit 1470 to the logic device 1410 being facilitated by the image interface 1425. The image capture frame rate may refer to the rate at which images are sequentially detected/output by the image detector circuit 1465 and provided to the logic device 1410 by the readout circuit 1470 (e.g., the number of detector output images per second). The readout circuit 1470 may read out pixel values generated by the image detector circuit 1465 according to an integration time (e.g., also referred to as an integration period).
In various embodiments, the combination of image detector circuit 1465 and readout circuit 1470 may be, may include, or may together provide an FPA. In some aspects, the image detector circuit 1465 may be a thermal image detector circuit including an array of microbolometers, and the combination of the image detector circuit 1465 and readout circuit 1470 may be referred to as a microbolometer FPA. In some cases, the array of microbolometers may be arranged in rows and columns. Each microbolometer may detect IR radiation and generate pixel values based on the detected IR radiation. For example, in some cases, the microbolometer may be a thermal IR detector that detects IR radiation in the form of thermal energy and generates pixel values based on the amount of thermal energy detected. The microbolometer may absorb incident IR radiation and produce a corresponding temperature change in the microbolometer. The change in temperature is correlated to a corresponding change in the resistance of the microbolometer. With each microbolometer functioning as a pixel, a two-dimensional image or pictorial representation of the incident IR radiation may be generated by converting the resistance change of each microbolometer into a time-multiplexed electrical signal. The conversion may be performed by the ROIC. The microbolometer FPA may include IR detection materials such as amorphous silicon (a-Si), vanadium oxide (VOx), combinations thereof, and/or other detection material(s). In one aspect, for a microbolometer FPA, the integration time may be or may be indicative of the period of time that the microbolometer is biased. In this case, a longer integration time may be associated with a higher gain applied to the IR signal, rather than more IR radiation being collected. IR radiation may be collected by a microbolometer in the form of thermal energy.
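The resistance-to-signal relationship described above can be sketched as follows; the bias voltage and temperature coefficient of resistance (TCR) are illustrative textbook-scale values for a VOx microbolometer, not parameters disclosed for this FPA.

```python
def bolometer_response(r_nominal_ohm: float, delta_r_ohm: float,
                       bias_v: float = 2.5, tcr_per_k: float = -0.02):
    """Relate a microbolometer resistance change to temperature and signal.

    Bias voltage and TCR are illustrative textbook-scale values, not
    parameters disclosed for this FPA.
    """
    # Temperature change inferred from the fractional resistance change
    # via the temperature coefficient of resistance (TCR).
    delta_t_k = (delta_r_ohm / r_nominal_ohm) / tcr_per_k
    # Under a constant voltage bias, the change in current carries the signal.
    delta_i_a = bias_v / (r_nominal_ohm + delta_r_ohm) - bias_v / r_nominal_ohm
    return delta_t_k, delta_i_a

# Example: a 100 kOhm pixel whose resistance drops by 200 Ohm
print(bolometer_response(100_000.0, -200.0))
```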
In some cases, image capturing component 1420 may include one or more optical components and/or one or more filters. The optical component(s) may include one or more windows, lenses, mirrors, beam splitters, beam couplers, and/or other components that direct and/or focus radiation to the image detector circuit 1465. The optical component(s) may include components that are each formed of a material and appropriately arranged according to desired transmission characteristics (e.g., desired transmission wavelength and/or light transmission matrix characteristics). The filter(s) may be adapted to pass radiation of some wavelengths while substantially blocking radiation of other wavelengths. For example, image capturing component 1420 may be an IR imaging device that includes one or more filters (e.g., MWIR filters, thermal IR filters, and narrowband filters) adapted to pass IR radiation of some wavelengths while substantially blocking IR radiation of other wavelengths. In this example, such filters may be used to tailor the image capturing component 1420 to increase sensitivity to a desired IR wavelength band. In one aspect, when the IR imaging device is tailored for capturing thermal IR images, the IR imaging device may be referred to as a thermal imaging device. Other imaging devices, including IR imaging devices tailored to capture IR images outside of the thermal range, may be referred to as non-thermal imaging devices.
In one particular non-limiting example, image capturing component 1420 may include an IR imaging sensor having an FPA of detectors responsive to IR radiation including near-infrared (NIR), SWIR, MWIR, LWIR, and/or very-long-wave IR (VLWIR) radiation. In some other embodiments, alternatively or additionally, image capturing component 1420 may include a complementary metal-oxide-semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor, such as may be found in a consumer camera (e.g., a visible light camera).
In some embodiments, imaging system 1400 includes a shutter 1485. The shutter 1485 may be operated to selectively insert into the optical path between the scene 1475 and the image capturing component 1420 to expose or block the aperture 1480. In some cases, the shutter 1485 may be moved (e.g., slid, rotated, etc.) manually (e.g., by a user of the imaging system 1400) and/or via an actuator (e.g., controllable by the logic device 1410 in response to user input or autonomously, such as when the logic device 1410 autonomously decides to perform calibration of the imaging device 1405). When the shutter 1485 is out of the optical path to expose the aperture 1480, electromagnetic radiation from the scene 1475 may be received by the image detector circuit 1465 (e.g., via one or more optical components and/or one or more filters). Thus, the image detector circuit 1465 captures an image of the scene 1475. The shutter 1485 may be referred to as being in an open position or simply open. When the shutter 1485 is inserted into the optical path to block the aperture 1480, electromagnetic radiation from the scene 1475 is blocked from the image detector circuit 1465. Thus, the image detector circuit 1465 captures an image of the shutter 1485. The shutter 1485 may be referred to as being in a closed position or simply closed. In some cases, the shutter 1485 may block the aperture 1480 during a calibration process, in which the shutter 1485 may function as a uniform blackbody (e.g., a substantially uniform blackbody). For example, the shutter 1485 serves as a single temperature source or substantially a single temperature source. In some cases, the shutter 1485 may be temperature controlled to provide a temperature-controlled uniform blackbody (e.g., to present a uniform field of radiation to the image detector circuit 1465). For example, in some cases, the surface of the shutter 1485 imaged by the image detector circuit 1465 may be provided with a uniform blackbody coating. In some cases, such as for imaging devices without a shutter or with a broken shutter, or as an alternative to the shutter 1485, a housing or holster of the imaging device 1405, a lens cap, a cover, a wall of a room, or another suitable object/surface may be used to provide a uniform blackbody (e.g., a substantially uniform blackbody) and/or a single temperature source (e.g., a substantially single temperature source).
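As an illustrative sketch of a shutter-based calibration of this kind (often termed flat-field correction), per-pixel offsets may be derived from frames of the closed shutter; the zero-mean normalization below is an assumption for illustration, not the specific correction used by imaging system 1400.

```python
import numpy as np

def offsets_from_shutter(shutter_frames: np.ndarray) -> np.ndarray:
    """Compute per-pixel offsets from frames of the closed shutter, which
    presents a (substantially) uniform blackbody to the detector array.

    shutter_frames: stack of shape (num_frames, rows, cols).
    """
    mean_frame = shutter_frames.mean(axis=0)  # average out temporal noise
    return mean_frame - mean_frame.mean()     # per-pixel deviation from flat

def flat_field_correct(raw: np.ndarray, offsets: np.ndarray) -> np.ndarray:
    """Subtract stored offsets so a uniform scene reads uniformly."""
    return raw.astype(np.float64) - offsets
```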
Other imaging sensors that may be implemented in image capturing component 1420 include a photonic mixer device (PMD) imaging sensor or other time-of-flight (ToF) imaging sensor, a LIDAR imaging device, a RADAR imaging device, a millimeter wave imaging device, a positron emission tomography (PET) scanner, a single photon emission computed tomography (SPECT) scanner, an ultrasound imaging device, or another imaging device operating in a particular modality and/or spectrum. It should be noted that some of these imaging sensors, configured to capture images in a particular modality and/or spectrum (e.g., the infrared spectrum), are more prone to producing images with low-frequency shading when compared to typical CMOS-based or CCD-based imaging sensors, or to imaging sensors, imaging scanners, or imaging devices of different modalities.
The image provided by the image capturing component 1420, or digital image data corresponding to the image, may be associated with a corresponding image size (also referred to as a pixel size). Image size or pixel size generally refers to the number of pixels in an image, which may be expressed, for example, as the width multiplied by the height of a two-dimensional image, or otherwise by a relevant size or shape of the image. Thus, an image having a native resolution may be resized to a smaller size (e.g., having a smaller pixel size) in order to, for example, reduce the cost of processing and analyzing the image. A filter (e.g., a non-uniformity estimate) may be generated based on an analysis of the resized image. The filter may then be resized to the native resolution and size of the image before being applied to the image.
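A minimal sketch of this resize-then-filter approach is shown below, assuming OpenCV is available; the scale factor and Gaussian smoothing stand in for whatever non-uniformity estimator is actually used.

```python
import cv2
import numpy as np

def low_frequency_filter(image: np.ndarray, scale: float = 0.25) -> np.ndarray:
    """Estimate a low-frequency non-uniformity term on a downsized copy of
    the image, then resize the estimate back to the original resolution."""
    h, w = image.shape
    small = cv2.resize(image, (int(w * scale), int(h * scale)),
                       interpolation=cv2.INTER_AREA)
    smooth = cv2.GaussianBlur(small, (0, 0), sigmaX=9)  # cheap at low resolution
    return cv2.resize(smooth, (w, h), interpolation=cv2.INTER_LINEAR)

# Usage: subtract the estimated low-frequency term from the full-size image.
frame = np.random.rand(480, 640).astype(np.float32)  # stand-in image
corrected = frame - low_frequency_filter(frame)
```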
In some embodiments, the image interface 1425 may include suitable input ports, connectors, switches, and/or circuitry configured to interface with external devices (e.g., remote device 1455 and/or other devices) to receive images (e.g., digital image data) generated by or otherwise stored at the external devices. In one aspect, the image interface 1425 may include a serial interface and telemetry lines for providing metadata associated with the image data. The received image or image data may be provided to logic 1410. In this regard, the received image or image data may be converted into signals or data suitable for processing by the logic device 1410. For example, in one embodiment, the image interface 1425 may be configured to receive analog video data and convert it to suitable digital data for provision to the logic device 1410.
The image interface 1425 may include various standard video ports that may be connected to a video player, camera, or other device capable of generating standard video signals, and may convert received video signals into digital video/image data suitable for processing by the logic device 1410. In some embodiments, the image interface 1425 may also be configured to interface with and receive images (e.g., image data) from the image capture component 1420. In other embodiments, image capture component 1420 may interface directly with logic device 1410.
In one embodiment, the control component 1430 includes user input and/or interface devices adapted to generate user input control signals, such as rotatable knobs (e.g., potentiometers), buttons, sliders, keyboards, and/or other devices. The logic device 1410 may be configured to sense control input signals from a user via the control component 1430 and respond to any sensed control input signals received therefrom. As is generally understood by those skilled in the art, the logic device 1410 may be configured to interpret such control input signals as values. In one embodiment, the control component 1430 may include a control unit (e.g., a wired or wireless handheld control unit) having buttons adapted to interface with a user and receive user input control values. In one embodiment, the buttons and/or other input mechanisms of the control unit may be used to control various functions of the imaging device 1405, such as calibration initiation and/or related controls, shutter control, autofocus, menu enablement and selection, field of view, brightness, contrast, noise filtering, image enhancement, and/or various other features.
In one embodiment, the display component 1435 includes an image display device (e.g., a Liquid Crystal Display (LCD)) or various other types of commonly known video displays or monitors. Logic 1410 may be configured to display image data and information on display component 1435. Logic device 1410 may be configured to retrieve image data and information from memory component 1415 and display any retrieved image data and information on display component 1435. The display component 1435 may include display circuitry that may be used by the logic device 1410 to display image data and information. The display component 1435 may be adapted to receive image data and information directly from the image capture component 1420, the logic device 1410, and/or the image interface 1425, or the image data and information may be transferred from the memory component 1415 via the logic device 1410. In some aspects, the control component 1430 may be implemented as part of the display component 1435. For example, the touch screen of the imaging device 1405 may provide both a control component 1430 (e.g., for receiving user input via tap and/or other gestures) and a display component 1435 of the imaging device 1405.
In one embodiment, sensing component 1440 includes one or more sensors of various types, depending on application or implementation requirements, as will be appreciated by those skilled in the art. The sensors of sensing component 1440 provide data and/or information to at least logic device 1410. In one aspect, logic device 1410 may be configured to communicate with sensing component 1440. In various implementations, sensing component 1440 can provide information regarding environmental conditions, such as external temperature, lighting conditions (e.g., day, night, dusk, and/or dawn), humidity levels, specific weather conditions (e.g., sunny, rainy, and/or snowy), distance (e.g., laser rangefinder or time-of-flight camera), and/or whether a tunnel or other type of fence has been entered or exited. Sensing component 1440 may represent a conventional sensor for monitoring various conditions (e.g., environmental conditions) that may have an impact on image data provided by image capturing component 1420 (e.g., on image appearance), as generally known to those skilled in the art.
In some implementations, the sensing component 1440 (e.g., one or more sensors) can include devices that relay information to the logic device 1410 via wired and/or wireless communication. For example, sensing component 1440 can be adapted to receive information from satellites, via local broadcast (e.g., radio frequency (RF)) transmission, via a mobile or cellular network, and/or via information beacons in an infrastructure (e.g., a transportation or highway information beacon infrastructure), or various other wired and/or wireless technologies. In some embodiments, logic device 1410 may use information (e.g., sensed data) retrieved from sensing component 1440 to modify the configuration of image capturing component 1420 (e.g., adjust the light sensitivity level, adjust the orientation or angle of image capturing component 1420, adjust the aperture, etc.). Sensing component 1440 may include a temperature sensing component to provide temperature data (e.g., one or more measured temperature values) for various components of the imaging device 1405, such as the image detector circuit 1465 and/or the shutter 1485. As non-limiting examples, the temperature sensor may include a thermistor, thermocouple, thermopile, pyrometer, and/or other suitable sensor for providing temperature data.
In some embodiments, the various components of imaging system 1400 may be distributed over a network 1460 and in communication with each other. In this regard, imaging device 1405 may include a network interface 1445 configured to facilitate wired and/or wireless communication between various components of imaging system 1400 via a network 1460. In such an embodiment, the components may also be replicated if desired for a particular application of the imaging system 1400. That is, components configured for the same or similar operations may be distributed over a network. Further, all or portions of any of the various components may be implemented using appropriate components of a remote device 1455 (e.g., a conventional Digital Video Recorder (DVR), a computer configured for image processing, and/or other devices) that communicates with the various components of imaging system 1400 via a network interface 1445 over a network 1460, if desired. Thus, for example, all or portions of logic device 1410, all or portions of memory component 1415, and/or all or portions of display component 1435 may be implemented or replicated at remote device 1455. In some embodiments, the imaging system 1400 may not include an imaging sensor (e.g., image capture component 1420), but rather receive images or image data from an imaging sensor located separate and apart from the logic device 1410 and/or other components of the imaging system 1400. It should be appreciated that many other combinations of distributed implementations of imaging system 1400 are possible without departing from the scope and spirit of the present disclosure.
Further, in various embodiments, the various components of imaging system 1400 may or may not be combined and/or implemented, as desired or depending upon the application or requirements. In one example, logic device 1410 may be combined with memory component 1415, image capturing component 1420, image interface 1425, display component 1435, sensing component 1440, and/or network interface 1445. In another example, logic device 1410 may be combined with image capturing component 1420 such that certain functions of logic device 1410 are performed by circuitry (e.g., a processor, a microprocessor, a logic device, a microcontroller, etc.) within image capturing component 1420.
Fig. 15 illustrates a block diagram of an exemplary image sensor assembly 1500 in accordance with one or more embodiments of the present disclosure. However, not all depicted components may be required, and one or more embodiments may include additional components not shown in the figures. Variations in the arrangement and type of the components may be made without departing from the spirit or scope of the claims as set forth herein. Additional components, different components, and/or fewer components may be provided. In an embodiment, image sensor assembly 1500 may be a FPA, for example, implemented as image capturing component 1420 of fig. 14.
The image sensor assembly 1500 includes a unit cell array 1505, column multiplexers 1510 and 1515, column amplifiers 1520 and 1525, a row multiplexer 1530, a control bias and timing circuit 1535, a digital-to-analog converter (DAC) 1540, and a data output buffer 1545. In some aspects, the operation of and/or operations related to the unit cell array 1505 and other components may be performed according to a system clock and/or synchronization signals (e.g., a line synchronization (LSYNC) signal). The unit cell array 1505 includes an array of unit cells. In one aspect, each unit cell may include a detector (e.g., a pixel) and an interface circuit. The interface circuit of each unit cell may provide an output signal, such as an output voltage or an output current, in response to a detection signal (e.g., detection current, detection voltage) provided by the detector of the unit cell. The output signal may be indicative of the magnitude of the EM radiation received by the detector and may be referred to as image pixel data or simply image data. The column multiplexer 1515, column amplifier 1525, row multiplexer 1530, and data output buffer 1545 may be used to provide the output signals from the unit cell array 1505 as a data output signal on a data output line 1550. The output signal on data output line 1550 may be provided to components downstream of the image sensor assembly 1500, such as processing circuitry (e.g., logic device 1410 of fig. 14), memory (e.g., memory component 1415 of fig. 14), a display device (e.g., display component 1435 of fig. 14), and/or other components, to facilitate processing, storage, and/or display of the output signal. The data output signal may be an image formed of the pixel values of the image sensor assembly 1500. In this regard, the column multiplexer 1515, column amplifier 1525, row multiplexer 1530, and data output buffer 1545 may collectively provide the ROIC (or a portion thereof) of the image sensor assembly 1500. In one aspect, the interface circuit may be considered part of the ROIC, or may be considered an interface between the detector and the ROIC. In some embodiments, the components of image sensor assembly 1500 may be implemented such that the unit cell array 1505 and the ROIC are part of a single die.
Column amplifier 1525 may generally represent any column processing circuit suitable for a given application (analog and/or digital), and is not limited to amplifier circuits for analog signals. In this regard, the column amplifier 1525 may be more generally referred to as a column processor. The signals received by the column amplifier 1525 may be processed according to the analog or digital nature of the signals, such as analog signals on an analog bus and/or digital signals on a digital bus. As an example, the column amplifier 1525 may include circuitry for processing digital signals. As another example, the column amplifier 1525 may be a path (e.g., with no processing) through which digital signals from the unit cell array 1505 traverse to reach the column multiplexer 1515. As another example, the column amplifier 1525 may include an ADC for converting analog signals to digital signals (e.g., to obtain digital count values). These digital signals may be provided to the column multiplexer 1515.
Each unit cell may receive a bias signal (e.g., bias voltage, bias current) to bias the detector of the unit cell to compensate for different response characteristics of the unit cell due to, for example, temperature variations, manufacturing variations, and/or other factors. For example, the control bias and timing circuit 1535 may generate bias signals and provide them to the unit cells. By providing an appropriate bias signal to each unit cell, the unit cell array 1505 can be effectively calibrated to provide accurate image data in response to light (e.g., visible light, IR light) incident on the detector of the unit cell. In one aspect, the control bias and timing circuit 1535 may be, include, or be part of a logic circuit.
The control bias and timing circuit 1535 may generate control signals for addressing the unit cell array 1505 to allow access and readout of image data from the addressed portion of the unit cell array 1505. The unit cell array 1505 may be addressed to access and read out image data from the unit cell array 1505 row by row, but in other implementations the unit cell array 1505 may be addressed column by column or by other means.
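Row-by-row readout can be sketched in Python as follows; the array here is a plain NumPy stand-in for the unit cell outputs, and the loop mirrors the row multiplexer selecting one row at a time.

```python
import numpy as np

def read_out_row_by_row(unit_cell_outputs: np.ndarray) -> np.ndarray:
    """Sketch of row-wise addressing: each row of unit cells is selected in
    turn and its outputs are passed through the column chain to the output."""
    rows, _cols = unit_cell_outputs.shape
    frame = np.empty_like(unit_cell_outputs)
    for r in range(rows):                     # address one row at a time
        column_bus = unit_cell_outputs[r, :]  # all columns of the row in parallel
        frame[r, :] = column_bus              # serialized out via the column mux
    return frame

# Example with the 8x8 array depicted in fig. 15
print(read_out_row_by_row(np.arange(64).reshape(8, 8)))
```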
The control bias and timing circuit 1535 may generate bias values and timing control voltages. In some cases, DAC 1540 may convert bias values received as data input signals on data input signal line 1555 or as part thereof into bias signals (e.g., analog signals on analog signal line(s) 1560) that may be provided to individual unit cells through operation of column multiplexer 1510, column amplifier 1520, and row multiplexer 1530. For example, DAC 1540 may drive a digital control signal (e.g., provided as a bit) to the appropriate analog signal level for the unit cell. In some techniques, a digital control signal of 0 or 1 may be driven to an appropriate logic low voltage level or an appropriate logic high voltage level, respectively. On the other hand, the control bias and timing circuit 1535 may generate a bias signal (e.g., an analog signal) and provide the bias signal to the unit cell without using the DAC 1540. In this regard, some implementations do not include DAC 1540, data input signal line 1555, and/or analog signal line(s) 1560. In an embodiment, the control bias and timing circuit 1535 may be the logic device 1410 and/or the image capture component 1420 of fig. 14, may include the logic device 1410 and/or the image capture component 1420 of fig. 14, may be part of the logic device 1410 and/or the image capture component 1420 of fig. 14, or may be otherwise coupled to the logic device 1410 and/or the image capture component 1420 of fig. 14.
In an embodiment, the image sensor assembly 1500 may be implemented as part of an imaging device (e.g., imaging device 1405). In addition to the various components of image sensor assembly 1500, the imaging device may include one or more processors, memories, logic devices, displays, interfaces, optics (e.g., lenses, mirrors, beam splitters), and/or other components as may be appropriate in various implementations. In one aspect, the data output signal on data output line 1550 may be provided to a processor (not shown) for further processing. For example, the data output signal may be an image formed of pixel values of a unit cell from the image sensor assembly 1500. The processor may perform operations such as non-uniformity correction (e.g., flat field correction or other calibration techniques), spatial and/or temporal filtering, and/or other operations. The image (e.g., processed image) may be stored in memory (e.g., external to the imaging system or local) and/or displayed on a display device (e.g., external to the imaging system and/or integrated with the imaging system). The various components of fig. 15 may be implemented on a single chip or multiple chips. Further, while the various components are shown as a set of separate blocks, the various blocks may be combined together or the various blocks shown in fig. 15 may be separated into separate blocks.
It should be noted that in fig. 15, the unit cell array 1505 is depicted as 8×8 (e.g., 8 rows and 8 columns of unit cells). However, the unit cell array 1505 may have other array sizes. As non-limiting examples, the unit cell array 1505 may include 160×120 (e.g., 160 rows and 120 columns of unit cells), 512×512, 1024×1024, 2048×2048, 4096×4096, 8192×8192, and/or other array sizes. In some cases, the array size may have a row size (e.g., number of detectors in a row) different from the column size (e.g., number of detectors in a column). Examples of frame rates include 30 Hz, 60 Hz, and 120 Hz. In one aspect, each unit cell of the unit cell array 1505 may represent a pixel.
It should be noted that the dimensional aspects provided above are examples, and that other values of dimensions may be utilized in accordance with one or more implementations. Furthermore, the dimensional aspects provided above are typically nominal values. As will be appreciated by those skilled in the art, each dimensional aspect has a tolerance associated with the dimensional aspect. Similarly, aspects related to the distance between features also have associated tolerances.
Fig. 16 illustrates a perspective view of an additional imaging device 1600 in accordance with one or more embodiments of the present disclosure. The imaging device 1600 may be similar to the imaging device 100, the imaging device 200, the imaging device 800, and/or the imaging device 1405 described above, except as described further below. In addition, the imaging device 1600 may be manufactured using the various steps of the process 900 described above. The imaging device 1600 may also be operated using the various steps of the process 1100 described above.
For example, imaging device 1600 uses a triplet configuration with first lens element 1610, second lens element 1612, and third lens element 1614. First lens element 1610 may be similar to any of lens element 335, lens element 1235, lens element 1240, or lens element 1335 described above. Second lens element 1612 may be similar to any of lens element 340, lens element 1245, lens element 1340, or lens element 1345 described above. Third lens element 1614 may be similar to any of lens element 345, lens element 1250, lens element 1345, or lens element 1350 described above. The first lens element 1610 may be an AB lens and the second and third lens elements 1612, 1614 may be EF and IJ lenses, although other configurations are contemplated. First lens element 1610 may be referred to as an upper lens of imaging device 1600. The lower two lenses (i.e., the second lens element 1612 and the third lens element 1614) may be collectively referred to as a "doublet" capable of imaging independently of the upper lens.
As shown, the imaging device 1600 includes a lens barrel 1618. The lens barrel 1618 includes a first body portion 1620 and a second body portion 1622. The first body portion 1620 may be referred to as a barrel, a first lens barrel component, an upper lens assembly, or a top portion of the lens barrel 1618. First lens element 1610 may be at least partially disposed in first body portion 1620. In an embodiment, first lens element 1610 may form a first lens group of the first body portion 1620 and/or the lens barrel 1618, and the first body portion 1620 may have a conical shape with a groove or rim 1624 to receive first lens element 1610, although other configurations are contemplated. The first body portion 1620 may be similar to the body portion 815 described above.
The second body portion 1622 may be referred to as a doublet, a second lens barrel component, a lower lens assembly, or a bottom portion of the lens barrel 1618. The second lens element 1612 and the third lens element 1614 may be at least partially disposed in the second body portion 1622. In an embodiment, the second and third lens elements 1612, 1614 may form a second lens group of the second body portion 1622 and/or the lens barrel 1618. As shown, a spacing element 1630 may be positioned or otherwise defined in the second body portion 1622 to position the second and third lens elements 1612, 1614 in the second body portion 1622 (e.g., to define an appropriate spacing between the second and third lens elements 1612, 1614, to secure the second and third lens elements 1612, 1614 in the second body portion 1622, etc.). In an embodiment, the second body portion 1622 may be attached to a housing of the imaging device 1600, such as the housing 810 described above. For example, the second body portion 1622 may include external threads 1632 for threadably connecting the second body portion 1622 to the housing 810. The second body portion 1622 may be similar to the body portion 820 described above.
The first body portion 1620 may be detached from the second body portion 1622, for example, via a quick attach mechanism, as described in detail below. In this manner, the first body portion 1620 may be removed from the second body portion 1622 and easily reattached, thereby facilitating replacement of the first body portion 1620 and/or the second body portion 1622, testing of the imaging device 1600, manufacturing of the imaging device 1600, and the like. For example, it may be convenient to test the entire imaging device 1600 (including the first body portion 1620) after assembly, but to use only the bottom two lenses during the camera housing attachment and camera focusing process (the "doublet configuration"). The first body portion 1620 may then be reattached for further steps, such as calibration.
Fig. 17 illustrates a cross-sectional view of a first body portion 1620 in accordance with one or more embodiments of the present disclosure. Fig. 18 illustrates a cross-sectional view of a second body portion 1622 in accordance with one or more embodiments of the present disclosure. Referring to fig. 16-18, imaging device 1600 includes a snap-fit mechanism 1650 that releasably secures first body portion 1620 to second body portion 1622. In this manner, the first body portion 1620 and the second body portion 1622 are designed to snap together using a snap fit. Although a snap fit is shown and described, the first body portion 1620 and the second body portion 1622 may be releasably secured together using other quick attachment mechanisms and devices. Suitable mechanisms/devices include those that allow the first body portion 1620 to be connected to the second body portion 1622 and removed from the second body portion 1622 without damaging or losing the functionality of the quick attach mechanism/device.
As shown, the first body portion 1620 includes one or more finger members 1652 (e.g., a plurality of finger members 1652, such as two finger members 1652) that fit into complementary notches 1654 in the second body portion 1622. In an embodiment, the snap-fit mechanism 1650 may be designed to be reversible, meaning that the snap-fit mechanism 1650 may be disengaged without damage. For example, the tips of the finger members 1652 may have beveled surfaces on the leading and trailing edges to allow reversible, bi-directional behavior. Notches 1654 may include corresponding structures that facilitate reversible, bi-directional behavior. In this manner, the finger members 1652 and/or the notches 1654 may include bevels that facilitate removal of the finger members 1652 from the notches 1654 without damage.
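Whether such a finger can deflect over its notch and return without damage is commonly checked with the classic cantilever snap-fit strain formula; the following sketch applies that textbook formula with illustrative dimensions, and is not a design rule taken from this disclosure.

```python
def snap_finger_strain(length_mm: float, thickness_mm: float,
                       deflection_mm: float) -> float:
    """Maximum root strain of a constant-cross-section cantilever snap
    finger deflected by y: eps = 1.5 * t * y / L**2 (standard snap-fit
    design formula, not a rule taken from this disclosure)."""
    return 1.5 * thickness_mm * deflection_mm / length_mm ** 2

# Example with illustrative dimensions: a 10 mm long, 1 mm thick finger
# deflected 0.5 mm sees about 0.75% strain, within the allowable strain
# of many engineering thermoplastics, so the joint can remain reversible.
print(snap_finger_strain(10.0, 1.0, 0.5))  # -> 0.0075
```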
In an embodiment, the finger members 1652 and the notches 1654 may be positioned to facilitate proper alignment of the first body portion 1620 and the second body portion 1622. For example, the finger members 1652 and the notches 1654 may coaxially, rotationally, or otherwise align the first body portion 1620 and the second body portion 1622. For example, the finger members 1652 and the notches 1654 may be arranged to fix a rotational position of the first body portion 1620 relative to the second body portion 1622, e.g., to limit rotation of the first body portion 1620 relative to the second body portion 1622 about their shared central axis. In an embodiment, the second body portion 1622 may include one or more flanges 1660 positioned adjacent the finger members 1652 to limit rotation of the first body portion 1620 relative to the second body portion 1622.
Fig. 19 shows a flowchart of an example process 1900 for manufacturing the imaging device 1600 in accordance with one or more embodiments of the present disclosure. For purposes of explanation, the example process 1900 is described herein with reference to the components of Figs. 16-18, which illustrate views associated with manufacturing the imaging device 1600. However, the example process 1900 is not limited to the components of Figs. 16-18.
In block 1910, the process 1900 includes providing a lens barrel 1618 including a first body portion 1620, a second body portion 1622, and a snap-fit mechanism 1650. The first body portion 1620 includes a first lens element 1610 disposed at least partially therein. The second body portion 1622 includes a second lens element 1612 and a third lens element 1614 disposed at least partially therein. The first, second, and third lens elements 1610, 1612, 1614 comprise a lens system configured to transfer electromagnetic radiation from a scene to an image capturing component (e.g., image capturing component 1420). The snap-fit mechanism 1650 may include a plurality of finger members 1652 extending from the first body portion 1620 and a plurality of complementary notches 1654 in the second body portion 1622. Block 1910 may include disposing the first lens element 1610 within the first body portion 1620. Block 1910 may also include disposing the second and third lens elements 1612, 1614 within the second body portion 1622.
In block 1920, the process 1900 includes securing the first body portion 1620 to the second body portion 1622 using the snap-fit mechanism 1650. The finger members 1652 may be configured to engage with the notches 1654 to releasably secure the first body portion 1620 to the second body portion 1622. Block 1920 may include arranging the finger members 1652 and the notches 1654 to fix a rotational position of the first body portion 1620 relative to the second body portion 1622, e.g., to limit rotation of the first body portion 1620 relative to the second body portion 1622 about their shared central axis.
In block 1930, the process 1900 may include disconnecting the snap-fit mechanism 1650 to separate the first body portion 1620 from the second body portion 1622.
In block 1940, the process 1900 may include performing a first manufacturing operation while the first body portion 1620 and the second body portion 1622 are separated. For example, the first manufacturing operation may include one or more calibration tests of the second and third lens elements 1612, 1614, such as a focusing process or any other process/test described herein. In an embodiment, the first manufacturing operation may include connecting the second body portion 1622 to a housing (e.g., the housing 810).
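The disclosure leaves the focusing process itself unspecified. As a purely illustrative assumption, camera focusing steps of this kind often score sharpness with a contrast metric while the lens-to-sensor spacing is swept; the focus_score function and the stand-in frames below are hypothetical and are not part of the disclosed process.

```python
import numpy as np

def focus_score(image: np.ndarray) -> float:
    """Variance of a 4-neighbor Laplacian; sharper images score higher."""
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

# Sweep a hypothetical lens-to-sensor spacing and keep the sharpest position.
# Random arrays stand in for frames captured at each candidate spacing.
rng = np.random.default_rng(0)
frames = {spacing: rng.random((64, 64)) for spacing in (0.0, 0.05, 0.10)}
best_spacing = max(frames, key=lambda s: focus_score(frames[s]))
print(f"best focus at spacing {best_spacing}")
```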
In block 1950, the process 1900 may include reconnecting the first body portion 1620 to the second body portion 1622 using the snap-fit mechanism 1650. For example, the first body portion 1620 may be positioned onto the second body portion 1622 and the finger members 1652 snapped into the notches 1654.
In block 1960, the process 1900 may include performing a second manufacturing operation while the first body portion 1620 and the second body portion 1622 are connected. For example, the second manufacturing operation may include one or more calibration tests of the lens system, the entire lens barrel 1618, and the like.
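For clarity, the ordering of blocks 1910-1960 may be summarized in the minimal Python sketch below. The LensBarrel type, the function names, and the printed operations are hypothetical stand-ins; only the sequence of assembly, separation, doublet-configuration operations, reconnection, and full-system calibration reflects the process described above.

```python
from dataclasses import dataclass

@dataclass
class LensBarrel:
    triplet_attached: bool = False  # is the first body portion snapped on?

def provide_barrel() -> LensBarrel:
    """Block 1910: barrel with lens elements seated in both body portions."""
    return LensBarrel()

def snap_fit(barrel: LensBarrel) -> None:
    """Blocks 1920/1950: engage the finger members with the notches."""
    barrel.triplet_attached = True

def disconnect(barrel: LensBarrel) -> None:
    """Block 1930: back the finger members out over their bevels."""
    barrel.triplet_attached = False

def first_manufacturing_operation(barrel: LensBarrel) -> None:
    """Block 1940: housing attachment and focusing in the doublet
    configuration, i.e., with the first body portion removed."""
    assert not barrel.triplet_attached
    print("attach housing; focus using the bottom two lenses")

def second_manufacturing_operation(barrel: LensBarrel) -> None:
    """Block 1960: calibration of the complete three-element lens system."""
    assert barrel.triplet_attached
    print("calibrate the full lens system")

barrel = provide_barrel()               # block 1910
snap_fit(barrel)                        # block 1920: initial assembly/test
disconnect(barrel)                      # block 1930
first_manufacturing_operation(barrel)   # block 1940
snap_fit(barrel)                        # block 1950: reconnect
second_manufacturing_operation(barrel)  # block 1960
```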
Where applicable, the various embodiments provided by the present disclosure may be implemented using hardware, software, or a combination of hardware and software. Also where applicable, the various hardware components and/or software components set forth herein can be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein can be separated into sub-components comprising software, hardware, or both without departing from the spirit of the present disclosure. Further, it is contemplated that software components may be implemented as hardware components, or vice versa, where applicable.
Software in accordance with the present disclosure, such as non-transitory instructions, program code, and/or data, may be stored on one or more non-transitory machine-readable media. It is also contemplated that the software identified herein may be implemented using one or more general purpose or special purpose computers and/or computer systems, networked and/or otherwise. The order of the various steps described herein may be changed, combined into composite steps, and/or divided into sub-steps where applicable to provide the features described herein.
The foregoing description is not intended to limit the disclosure to the precise form or particular field of use disclosed. The above examples illustrate but do not limit the invention. It is contemplated that various alternative embodiments and/or modifications of the invention are possible in light of the present disclosure, whether explicitly described or implied herein. Accordingly, the scope of the invention is limited only by the following claims.

Claims (20)

1. An imaging device, the imaging device comprising:
a lens barrel, the lens barrel comprising:
a first body portion comprising a first lens element disposed at least partially in the first body portion;
a second body portion comprising a second lens element and a third lens element disposed at least partially in the second body portion, wherein the first lens element, the second lens element, and the third lens element comprise a lens system configured to transfer electromagnetic radiation from a scene to an image capturing component; and
a snap-fit mechanism comprising a plurality of finger members extending from the first body portion and a plurality of complementary notches in the second body portion, wherein the finger members are configured to engage with the notches to releasably secure the first body portion to the second body portion.
2. The imaging device of claim 1, wherein one or more of the finger members and/or the notches include a bevel that facilitates removal of the one or more finger members from the notches.
3. The imaging device of claim 1, wherein the finger members and the notches are arranged to fix a rotational position of the first body portion relative to the second body portion.
4. The imaging device of claim 3, wherein the second body portion includes one or more flanges positioned adjacent the finger members to limit rotation of the first body portion relative to the second body portion.
5. The imaging device of claim 1, further comprising a spacing element positioned or otherwise defined in the second body portion to position the second lens element and the third lens element in the second body portion.
6. The imaging device of claim 1, further comprising a housing, wherein the second body portion includes external threads for threadably connecting the second body portion to the housing.
7. The imaging device of claim 1, wherein:
the first lens element includes a first lens group; and
the first lens element is a spherical lens element configured to transmit electromagnetic radiation associated with the scene.
8. The imaging device of claim 7, wherein:
the second lens element and the third lens element comprise a second lens group; and
the second lens element and the third lens element are wafer level optic (WLO) aspheric lens elements configured to receive the electromagnetic radiation from the first lens group and transmit the electromagnetic radiation.
9. The imaging device of claim 8, wherein:
the first lens group is associated with a first field of view; and
the second lens group is associated with a second field of view that is narrower than the first field of view.
10. The imaging device of claim 1, wherein the image capturing component comprises a detector array comprising a plurality of detectors, wherein each of the plurality of detectors is configured to receive a portion of the electromagnetic radiation and generate a thermal image based on the electromagnetic radiation.
11. A method of manufacturing the imaging device of claim 1, the method comprising:
disposing the first lens element at least partially within the first body portion;
disposing the second lens element and the third lens element at least partially within the second body portion;
coupling the second body portion to a housing;
performing a calibration of the second lens element and the third lens element; and
coupling, after the calibration, the first body portion to the second body portion.
12. A method of manufacturing the imaging device of claim 1, the method comprising:
performing a first manufacturing operation while the first body portion and the second body portion are connected;
disconnecting the snap-fit mechanism to separate the first body portion from the second body portion; and
performing a second manufacturing operation while the first body portion is separated from the second body portion.
13. The method of claim 12, wherein:
the first manufacturing operation includes one or more first calibration tests of the lens system; and
the second manufacturing operation includes one or more second calibration tests of the second lens element and the third lens element.
14. The method of claim 13, wherein the one or more second calibration tests comprise a focusing procedure.
15. A method, the method comprising:
providing a lens barrel, the lens barrel comprising:
a first body portion comprising a first lens element disposed at least partially therein,
a second body portion comprising a second lens element and a third lens element disposed at least partially in the second body portion, wherein the first lens element, the second lens element, and the third lens element comprise a lens system configured to transfer electromagnetic radiation from a scene to an image capturing component, and
a snap-fit mechanism comprising a plurality of finger members extending from the first body portion and a plurality of complementary notches in the second body portion; and
securing the first body portion to the second body portion using the snap-fit mechanism, wherein the finger members are configured to engage with the notches to releasably secure the first body portion to the second body portion.
16. The method of claim 15, wherein the securing includes arranging the finger members and the notches to fix a rotational position of the first body portion relative to the second body portion.
17. The method of claim 15, further comprising:
disconnecting the snap-fit mechanism to separate the first body portion from the second body portion; and
reattaching the first body portion to the second body portion using the snap-fit mechanism.
18. The method of claim 17, further comprising performing a first manufacturing operation when the first body portion is separated from the second body portion.
19. The method of claim 18, further comprising performing a second manufacturing operation when the first body portion and the second body portion are reconnected.
20. The method of claim 19, wherein each of the first and second manufacturing operations includes one or more calibration tests.
CN202310229747.3A 2022-03-11 2023-03-10 Snap fit lens barrel system and method Pending CN116736463A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/269,197 2022-03-11
US18/176,755 2023-03-01
US18/176,755 US20230288784A1 (en) 2022-03-11 2023-03-01 Snap-fit lens barrel systems and methods

Publications (1)

Publication Number Publication Date
CN116736463A (en) 2023-09-12

Family

ID=87915792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310229747.3A Pending CN116736463A (en) 2022-03-11 2023-03-10 Snap fit lens barrel system and method

Country Status (1)

Country Link
CN (1) CN116736463A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination