WO2012170946A2 - Low power and small form factor infrared imaging - Google Patents
- Publication number
- WO2012170946A2 (PCT/US2012/041744)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- approximately
- fpa
- image frames
- infrared sensors
- array
- Prior art date
Links
- 238000003331 infrared imaging Methods 0.000 title claims abstract description 97
- 238000000034 method Methods 0.000 claims abstract description 125
- 230000008569 process Effects 0.000 claims description 54
- 238000012545 processing Methods 0.000 claims description 49
- 238000012937 correction Methods 0.000 claims description 28
- 230000001105 regulatory effect Effects 0.000 claims description 23
- 230000004044 response Effects 0.000 claims description 21
- 238000003384 imaging method Methods 0.000 claims description 8
- 230000001419 dependent effect Effects 0.000 claims description 3
- 230000033001 locomotion Effects 0.000 description 48
- 230000002596 correlated effect Effects 0.000 description 20
- 230000000875 corresponding effect Effects 0.000 description 13
- 238000010586 diagram Methods 0.000 description 13
- 238000013459 approach Methods 0.000 description 11
- 230000008901 benefit Effects 0.000 description 9
- 238000010438 heat treatment Methods 0.000 description 9
- 239000000758 substrate Substances 0.000 description 9
- 238000001914 filtration Methods 0.000 description 8
- 230000002123 temporal effect Effects 0.000 description 8
- 238000003491 array Methods 0.000 description 7
- 230000000694 effects Effects 0.000 description 7
- 239000000463 material Substances 0.000 description 7
- 239000000872 buffer Substances 0.000 description 6
- 238000012935 Averaging Methods 0.000 description 5
- 238000003705 background correction Methods 0.000 description 5
- 230000008859 change Effects 0.000 description 5
- 238000001514 detection method Methods 0.000 description 5
- 230000001747 exhibiting effect Effects 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 5
- 238000004806 packaging method and process Methods 0.000 description 5
- 230000009467 reduction Effects 0.000 description 5
- 238000012360 testing method Methods 0.000 description 5
- 238000001931 thermography Methods 0.000 description 4
- RYGMFSIKBFXOCR-UHFFFAOYSA-N Copper Chemical compound [Cu] RYGMFSIKBFXOCR-UHFFFAOYSA-N 0.000 description 3
- 229910052782 aluminium Inorganic materials 0.000 description 3
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 3
- 238000000576 coating method Methods 0.000 description 3
- 229910052802 copper Inorganic materials 0.000 description 3
- 239000010949 copper Substances 0.000 description 3
- 238000013016 damping Methods 0.000 description 3
- 230000000977 initiatory effect Effects 0.000 description 3
- HCHKCACWOHOZIP-UHFFFAOYSA-N Zinc Chemical compound [Zn] HCHKCACWOHOZIP-UHFFFAOYSA-N 0.000 description 2
- 230000004075 alteration Effects 0.000 description 2
- 230000000712 assembly Effects 0.000 description 2
- 238000000429 assembly Methods 0.000 description 2
- 238000005266 casting Methods 0.000 description 2
- 239000011248 coating agent Substances 0.000 description 2
- 230000000295 complement effect Effects 0.000 description 2
- 239000002131 composite material Substances 0.000 description 2
- 239000004020 conductor Substances 0.000 description 2
- 230000003247 decreasing effect Effects 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 229920000642 polymer Polymers 0.000 description 2
- 230000005855 radiation Effects 0.000 description 2
- 229910001935 vanadium oxide Inorganic materials 0.000 description 2
- 229910052725 zinc Inorganic materials 0.000 description 2
- 239000011701 zinc Substances 0.000 description 2
- PFNQVRZLDWYSCW-UHFFFAOYSA-N (fluoren-9-ylideneamino) n-naphthalen-1-ylcarbamate Chemical compound C12=CC=CC=C2C2=CC=CC=C2C1=NOC(=O)NC1=CC=CC2=CC=CC=C12 PFNQVRZLDWYSCW-UHFFFAOYSA-N 0.000 description 1
- FYYHWMGAXLPEAU-UHFFFAOYSA-N Magnesium Chemical compound [Mg] FYYHWMGAXLPEAU-UHFFFAOYSA-N 0.000 description 1
- XUIMIQQOPSSXEZ-UHFFFAOYSA-N Silicon Chemical compound [Si] XUIMIQQOPSSXEZ-UHFFFAOYSA-N 0.000 description 1
- XHCLAFWTIXFWPH-UHFFFAOYSA-N [O-2].[O-2].[O-2].[O-2].[O-2].[V+5].[V+5] Chemical compound [O-2].[O-2].[O-2].[O-2].[O-2].[V+5].[V+5] XHCLAFWTIXFWPH-UHFFFAOYSA-N 0.000 description 1
- 238000009825 accumulation Methods 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 230000033228 biological regulation Effects 0.000 description 1
- 239000000919 ceramic Substances 0.000 description 1
- 229910010293 ceramic material Inorganic materials 0.000 description 1
- 239000005387 chalcogenide glass Substances 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000001143 conditioned effect Effects 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 238000001816 cooling Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 229910003460 diamond Inorganic materials 0.000 description 1
- 239000010432 diamond Substances 0.000 description 1
- 238000007599 discharging Methods 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 230000005670 electromagnetic radiation Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 229920002457 flexible plastic Polymers 0.000 description 1
- 229910052732 germanium Inorganic materials 0.000 description 1
- GNPVGFCGXDBREM-UHFFFAOYSA-N germanium atom Chemical compound [Ge] GNPVGFCGXDBREM-UHFFFAOYSA-N 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- PCHJSUWPFVWCPO-UHFFFAOYSA-N gold Chemical compound [Au] PCHJSUWPFVWCPO-UHFFFAOYSA-N 0.000 description 1
- 239000010931 gold Substances 0.000 description 1
- 229910052737 gold Inorganic materials 0.000 description 1
- 230000001939 inductive effect Effects 0.000 description 1
- 238000001746 injection moulding Methods 0.000 description 1
- 229910052749 magnesium Inorganic materials 0.000 description 1
- 239000011777 magnesium Substances 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 229910052751 metal Inorganic materials 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- NJPPVKZQTLUDBO-UHFFFAOYSA-N novaluron Chemical compound C1=C(Cl)C(OC(F)(F)C(OC(F)(F)F)F)=CC=C1NC(=O)NC(=O)C1=C(F)C=CC=C1F NJPPVKZQTLUDBO-UHFFFAOYSA-N 0.000 description 1
- 229910052704 radon Inorganic materials 0.000 description 1
- SYUHGPGVQRZVTB-UHFFFAOYSA-N radon atom Chemical compound [Rn] SYUHGPGVQRZVTB-UHFFFAOYSA-N 0.000 description 1
- 238000011946 reduction process Methods 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 229910052710 silicon Inorganic materials 0.000 description 1
- 239000010703 silicon Substances 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/51—Housings
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/20—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming only infrared radiation into image signals
- H04N25/21—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming only infrared radiation into image signals for transforming thermal infrared radiation into image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/67—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response
- H04N25/671—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response for non-uniformity detection or correction
- H04N25/673—Noise processing, e.g. detecting, correcting, reducing or removing noise applied to fixed-pattern noise, e.g. non-uniformity of response for non-uniformity detection or correction by using reference sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/71—Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
- H04N25/75—Circuitry for providing, modifying or processing image signals from the pixel array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/78—Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/725—Cordless telephones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Definitions
- One or more embodiments of the invention relate generally to thermal imaging devices and more particularly, for example, to the implementation and operation of such devices in a manner appropriate for low power and small form factor applications.
- Infrared imaging devices, such as infrared cameras or other devices, are typically implemented with an array of infrared sensors.
- Existing infrared sensors and related circuitry are typically sensitive to noise and other phenomena. Because of such sensitivity, infrared sensors and related circuitry are often implemented with power supply arrangements wherein multiple voltage supply paths provide different voltages to various circuit components. However, such implementations are typically complex and may be relatively inefficient.
- Infrared imaging devices may also experience significant self-heating, which may cause various undesirable thermal effects. Conventional techniques to reduce such effects are less than ideal and may rely on active cooling or other measures that increase the cost and complexity of infrared imaging devices.
- In one embodiment, a system includes a focal plane array (FPA) comprising: an array of infrared sensors adapted to image a scene; a bias circuit adapted to provide a bias voltage to the infrared sensors, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts; and a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames.
- In another embodiment, a system includes a focal plane array (FPA) comprising: an array of infrared sensors adapted to image a scene, wherein a size of the array of infrared sensors is less than approximately 160 by 120; and a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames, wherein the ROIC is adapted to provide the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
- In another embodiment, a method includes providing a bias voltage from a bias circuit of a focal plane array (FPA) to an array of infrared sensors of the FPA, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts; imaging a scene using the infrared sensors; and providing signals from the infrared sensors corresponding to captured image frames of the scene using a read out integrated circuit (ROIC) of the FPA.
- In another embodiment, a method includes imaging a scene using an array of infrared sensors of a focal plane array (FPA), wherein a size of the array of infrared sensors is less than approximately 160 by 120; and providing signals from the infrared sensors corresponding to captured image frames using a read out integrated circuit (ROIC) of the FPA, wherein the ROIC provides the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
- In another embodiment, a system includes a focal plane array (FPA) comprising: a low-dropout regulator (LDO) integrated with the FPA and adapted to provide a regulated voltage in response to an external supply voltage; an array of infrared sensors adapted to image a scene; a bias circuit adapted to provide a bias voltage to the infrared sensors in response to the regulated voltage; and a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames.
- In another embodiment, a method includes receiving an external supply voltage at a focal plane array (FPA); providing a regulated voltage in response to the external supply voltage using a low-dropout regulator (LDO) integrated with the FPA; providing a bias voltage to an array of infrared sensors of the FPA in response to the regulated voltage using a bias circuit of the FPA; imaging a scene using the infrared sensors; and providing signals from the infrared sensors corresponding to captured image frames using a read out integrated circuit (ROIC) of the FPA.
- Fig. 1 illustrates an infrared imaging module configured to be implemented in a host device in accordance with an embodiment of the disclosure.
- Fig. 2 illustrates an assembled infrared imaging module in accordance with an embodiment of the disclosure.
- Fig. 3 illustrates an exploded view of an infrared imaging module juxtaposed over a socket in accordance with an embodiment of the disclosure.
- Fig. 4 illustrates a block diagram of an infrared sensor assembly including an array of infrared sensors in accordance with an embodiment of the disclosure.
- Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure.
- Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure.
- Fig. 7 illustrates a flat field correction technique in accordance with an embodiment of the disclosure.
- Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline in accordance with an embodiment of the disclosure.
- Fig. 9 illustrates a temporal noise reduction process in accordance with an embodiment of the disclosure.
- Fig. 10 illustrates particular implementation details of several processes of the image processing pipeline of Fig. 8 in accordance with an embodiment of the disclosure.
- Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure.
- Fig. 12 illustrates a block diagram of another implementation of an infrared sensor assembly including an array of infrared sensors and a low-dropout regulator in accordance with an embodiment of the disclosure.
- Fig. 13 illustrates a circuit diagram of a portion of the infrared sensor assembly of Fig. 12 in accordance with an embodiment of the disclosure.
- Fig. 1 illustrates an infrared imaging module 100 (e.g., an infrared camera or an infrared imaging device) configured to be implemented in a host device 102 in accordance with an embodiment of the disclosure.
- Infrared imaging module 100 may be implemented, for one or more embodiments, with a small form factor and in accordance with wafer level packaging techniques or other packaging techniques.
- infrared imaging module 100 may be configured to be implemented in a small portable host device 102, such as a mobile telephone, a tablet computing device, a laptop computing device, a personal digital assistant, a visible light camera, a music player, or any other appropriate mobile device.
- infrared imaging module 100 may be used to provide infrared imaging features to host device 102.
- infrared imaging module 100 may be configured to capture, process, and/or otherwise manage infrared images and provide such infrared images to host device 102 for use in any desired fashion (e.g., for further processing, to store in memory, to display, to use by various applications running on host device 102, to export to other devices, or other uses).
- infrared imaging module 100 may be configured to operate at low voltage levels and over a wide temperature range.
- infrared imaging module 100 may operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or lower voltages, and operate over a temperature range of approximately -20 degrees C to approximately +60 degrees C (e.g., providing a suitable dynamic range and performance over an environmental temperature range of approximately 80 degrees C).
- infrared imaging module 100 may experience reduced amounts of self-heating in comparison with other types of infrared imaging devices. As a result, infrared imaging module 100 may be operated with reduced measures to compensate for such self-heating.
- host device 102 may include a socket 104, a shutter 105, motion sensors 194, a processor 195, a memory 196, a display 197, and/or other components 198.
- Socket 104 may be configured to receive infrared imaging module 100 as identified by arrow 101.
- Fig. 2 illustrates infrared imaging module 100 assembled in socket 104 in accordance with an embodiment of the disclosure.
- Motion sensors 194 may be implemented by one or more accelerometers, gyroscopes, or other appropriate devices that may be used to detect movement of host device 102. Motion sensors 194 may be monitored by and provide information to processing module 160 or processor 195 to detect motion. In various embodiments, motion sensors 194 may be implemented as part of host device 102 (as shown in Fig. 1), infrared imaging module 100, or other devices attached to or otherwise interfaced with host device 102.
- Processor 195 may be implemented as any appropriate processing device (e.g., logic device, microcontroller, processor, application specific integrated circuit (ASIC), or other device) that may be used by host device 102 to execute appropriate instructions, such as software instructions provided in memory 196.
- Display 197 may be used to display captured and/or processed infrared images and/or other images, data, and information.
- Other components 198 may be used to implement any features of host device 102 as may be desired for various applications (e.g., clocks, temperature sensors, a visible light camera, or other components).
- a machine readable medium 193 may be provided for storing non-transitory instructions for loading into memory 196 and execution by processor 195.
- infrared imaging module 100 and socket 104 may be implemented for mass production to facilitate high volume applications, such as for implementation in mobile telephones or other devices (e.g., requiring small form factors).
- the combination of infrared imaging module 100 and socket 104 may exhibit overall dimensions of approximately 8.5 mm by 8.5 mm by 5.9 mm while infrared imaging module 100 is installed in socket 104.
- FIG. 3 illustrates an exploded view of infrared imaging module 100 juxtaposed over socket 104 in accordance with an embodiment of the disclosure.
- Infrared imaging module 100 may include a lens barrel 110, a housing 120, an infrared sensor assembly 128, a circuit board 170, a base 150, and a processing module 160.
- Lens barrel 110 may at least partially enclose an optical element 180 (e.g., a lens) which is partially visible in Fig. 3 through an aperture 112 in lens barrel 110.
- Lens barrel 110 may include a substantially cylindrical extension 114 which may be used to interface lens barrel 110 with an aperture 122 in housing 120.
- Infrared sensor assembly 128 may be implemented, for example, with a cap 130 mounted on a substrate 140.
- Infrared sensor assembly 128 may include a plurality of infrared sensors 132 (e.g., infrared detectors) implemented in an array or other fashion on substrate 140 and covered by cap 130.
- infrared sensor assembly 128 may be implemented as a focal plane array (FPA).
- Such a focal plane array may be implemented, for example, as a vacuum package assembly (e.g., sealed by cap 130 and substrate 140).
- infrared sensor assembly 128 may be implemented as a wafer level package (e.g., infrared sensor assembly 128 may be singulated from a set of vacuum package assemblies provided on a wafer).
- infrared sensor assembly 128 may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
- Infrared sensors 132 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene including, for example, mid wave infrared wave bands (MWIR), long wave infrared wave bands (LWIR), and/or other thermal imaging bands as may be desired in particular implementations.
- infrared sensor assembly 128 may be provided in accordance with wafer level packaging techniques.
- Infrared sensors 132 may be implemented, for example, as microbolometers or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels. In one embodiment, infrared sensors 132 may be vanadium oxide (VOx) detectors with a 17 μm pixel pitch.
- arrays of approximately 32 by 32 infrared sensors 132, approximately 64 by 64 infrared sensors 132, approximately 80 by 64 infrared sensors 132, or other array sizes may be used.
- Substrate 140 may include various circuitry including, for example, a read out integrated circuit (ROIC) with dimensions less than approximately 5.5 mm by 5.5 mm in one embodiment.
- Substrate 140 may also include bond pads 142 that may be used to contact complementary connections positioned on inside surfaces of housing 120 when infrared imaging module 100 is assembled as shown in Figs. 5A, 5B, and 5C.
- the ROIC may be implemented with low-dropout regulators (LDO) to perform voltage regulation to reduce power supply noise introduced to infrared sensor assembly 128 and thus provide an improved power supply rejection ratio (PSRR).
- By implementing the LDO with the ROIC (e.g., within a wafer level package), less die area may be consumed and fewer discrete die (or chips) are needed.
- Fig. 4 illustrates a block diagram of infrared sensor assembly 128 including an array of infrared sensors 132 in accordance with an embodiment of the disclosure.
- infrared sensors 132 are provided as part of a unit cell array of a ROIC 402.
- ROIC 402 includes bias generation and timing control circuitry 404, column amplifiers 405, a column multiplexer 406, a row multiplexer 408, and an output amplifier 410.
- Image frames (e.g., thermal images) captured by infrared sensors 132 may be provided by output amplifier 410 to processing module 160, processor 195, and/or other appropriate components to perform various processing techniques described herein.
- any desired array configuration may be used in other embodiments. Further descriptions of ROICs and infrared sensors (e.g., microbolometer circuits) may be found in U.S. Patent No. 6,028,309 issued February 22, 2000, which is incorporated herein by reference in its entirety.
- Infrared sensor assembly 128 may capture images (e.g., image frames) and provide such images from its ROIC at various rates.
- Processing module 160 may be used to perform appropriate processing of captured infrared images and may be implemented in accordance with any appropriate architecture.
- processing module 160 may be implemented as an ASIC.
- ASIC may be configured to perform image processing with high performance and/or high efficiency.
- processing module 160 may be implemented with a general purpose central processing unit (CPU) which may be configured to execute appropriate software instructions to perform image processing, coordinate and perform image processing with various image processing blocks, coordinate interfacing between processing module 160 and host device 102, and/or other operations.
- processing module 160 may be implemented with a field programmable gate array (FPGA).
- Processing module 160 may be implemented with other types of processing and/or logic circuits in other embodiments as would be understood by one skilled in the art.
- processing module 160 may also be implemented with other components where appropriate, such as, volatile memory, non-volatile memory, and/or one or more interfaces (e.g., infrared detector interfaces, inter-integrated circuit (I2C) interfaces, mobile industry processor interfaces (MIPI), joint test action group (JTAG) interfaces (e.g., IEEE 1149.1 standard test access port and boundary-scan architecture), and/or other interfaces).
- infrared imaging module 100 may further include one or more actuators 199 which may be used to adjust the focus of infrared image frames captured by infrared sensor assembly 128.
- actuators 199 may be used to move optical element 180, infrared sensors 132, and/or other components relative to each other to selectively focus and defocus infrared image frames in accordance with techniques described herein.
- Actuators 199 may be implemented in accordance with any type of motion-inducing apparatus or mechanism, and may be positioned at any location within or external to infrared imaging module 100 as appropriate for different applications.
- housing 120 When infrared imaging module 100 is assembled, housing 120 may substantially enclose infrared sensor assembly 128, base 150, and processing module 160. Housing 120 may facilitate connection of various components of infrared imaging module 100. For example, in one embodiment, housing 120 may provide electrical connections 126 to connect various components as further described.
- Electrical connections 126 may be electrically connected with bond pads 142 when infrared imaging module 100 is assembled.
- electrical connections 126 may be embedded in housing 120, provided on inside surfaces of housing 120, and/or otherwise provided by housing 120. Electrical connections 126 may terminate in connections 124 protruding from the bottom surface of housing 120 as shown in Fig. 3. Connections 124 may connect with circuit board 170 when infrared imaging module 100 is assembled (e.g., housing 120 may rest atop circuit board 170 in various embodiments).
- Processing module 160 may be electrically connected with circuit board 170 through appropriate electrical connections.
- infrared sensor assembly 128 may be electrically connected with processing module 160 through, for example, conductive electrical paths provided by: bond pads 142, complementary connections on inside surfaces of housing 120, electrical connections 126 of housing 120, connections 124, and circuit board 170.
- such an arrangement may be implemented without requiring wire bonds to be provided between infrared sensor assembly 128 and processing module 160.
- electrical connections 126 in housing 120 may be made from any desired material (e.g., copper or any other appropriate conductive material). In one embodiment, electrical connections 126 may aid in dissipating heat from infrared imaging module 100.
- sensor assembly 128 may be attached to processing module 160 through a ceramic board that connects to sensor assembly 128 by wire bonds and to processing module 160 by a ball grid array (BGA).
- sensor assembly 128 may be mounted directly on a rigid flexible board and electrically connected with wire bonds, and processing module 160 may be mounted and connected to the rigid flexible board with wire bonds or a BGA.
- infrared imaging module 100 and host device 102 set forth herein are provided for purposes of example, rather than limitation.
- any of the various techniques described herein may be applied to any infrared camera system, infrared imager, or other device for performing infrared/thermal imaging.
- Substrate 140 of infrared sensor assembly 128 may be mounted on base 150.
- Base 150 (e.g., a pedestal) may be made, for example, of copper formed by metal injection molding (MIM) and provided with a black oxide or nickel-coated finish.
- base 150 may be made of any desired material, such as for example zinc, aluminum, or magnesium, as desired for a given application, and may be formed by any desired applicable process, such as for example aluminum casting, MIM, or zinc rapid casting, as may be desired for particular applications.
- base 150 may be implemented to provide structural support, various circuit paths, thermal heat sink properties, and other features where appropriate.
- base 150 may be a multi-layer structure implemented at least in part using ceramic material.
- circuit board 170 may receive housing 120 and thus may physically support the various components of infrared imaging module 100.
- circuit board 170 may be implemented as a printed circuit board (e.g., an FR4 circuit board or other types of circuit boards), a rigid or flexible interconnect (e.g., tape or other type of interconnects), a flexible circuit substrate, a flexible plastic substrate, or other appropriate structures.
- base 150 may be implemented with the various features and attributes described for circuit board 170, and vice versa.
- Socket 104 may include a cavity 106 configured to receive infrared imaging module 100 (e.g., as shown in Fig. 2).
- Infrared imaging module 100 and/or socket 104 may include appropriate tabs, arms, pins, fasteners, or any other appropriate engagement members which may be used to secure infrared imaging module 100 to or within socket 104 using friction, tension, adhesion, and/or any other appropriate manner.
- Socket 104 may include engagement members 107 that may engage surfaces 109 of housing 120 when infrared imaging module 100 is inserted into a cavity 106 of socket 104. Other types of engagement members may be used in other embodiments.
- Infrared imaging module 100 may be electrically connected with socket 104 through appropriate electrical connections (e.g., contacts, pins, wires, or any other appropriate connections).
- socket 104 may include electrical connections 108 which may contact corresponding electrical connections of infrared imaging module 100 (e.g., interconnect pads, contacts, or other electrical connections on side or bottom surfaces of circuit board 170, bond pads 142 or other electrical connections on base 150, or other connections). Electrical connections 108 may be made from any desired material (e.g., copper or any other appropriate conductive material). In one embodiment, electrical connections 108 may be mechanically biased to press against electrical connections of infrared imaging module 100 when infrared imaging module 100 is inserted into cavity 106 of socket 104. In one embodiment, electrical connections 108 may at least partially secure infrared imaging module 100 in socket 104. Other types of electrical connections may be used in other embodiments.
- Socket 104 may be electrically connected with host device 102 through similar types of electrical connections.
- host device 102 may include electrical connections (e.g., soldered connections, snap-in connections, or other connections) that connect with electrical connections 108 passing through apertures 190.
- electrical connections may be made to the sides and/or bottom of socket 104.
- infrared imaging module 100 may be implemented with flip chip technology which may be used to mount components directly to circuit boards without the additional clearances typically needed for wire bond connections.
- Flip chip connections may be used, as an example, to reduce the overall size of infrared imaging module 100 for use in compact small form factor applications.
- processing module 160 may be mounted to circuit board 170 using flip chip connections.
- infrared imaging module 100 may be implemented with such flip chip configurations.
- infrared imaging module 100 and/or associated components may be implemented, calibrated, tested, and/or used in accordance with various techniques, such as for example as set forth in U.S. Patent No. 7,470,902 issued December 30, 2008, U.S. Patent No. 6,028,309 issued February 22, 2000, and U.S. Patent No. 6,812,465 issued November 2, 2004.
- host device 102 may include shutter 105.
- shutter 105 may be selectively positioned over socket 104 (e.g., as identified by arrows 103) while infrared imaging module 100 is installed therein.
- shutter 105 may be used, for example, to protect infrared imaging module 100 when not in use.
- Shutter 105 may also be used as a temperature reference as part of a calibration process (e.g., a NUC process or other calibration processes) for infrared imaging module 100 as would be understood by one skilled in the art.
- shutter 105 may be made from various materials such as, for example, polymers, glass, aluminum (e.g., painted or anodized) or other materials.
- shutter 105 may include one or more coatings to selectively filter electromagnetic radiation and/or adjust various optical properties of shutter 105 (e.g., a uniform blackbody coating or a reflective gold coating).
- shutter 105 may be fixed in place to protect infrared imaging module 100 at all times.
- shutter 105 or a portion of shutter 105 may be made from appropriate materials (e.g., polymers or infrared transmitting materials such as silicon, germanium, zinc selenide, or chalcogenide glasses) that do not substantially filter desired infrared wavelengths.
- a shutter may be implemented as part of infrared imaging module 100 (e.g., within or as part of a lens barrel or other components of infrared imaging module 100), as would be understood by one skilled in the art.
- In other embodiments, a shutter (e.g., shutter 105 or other type of external or internal shutter) need not be provided, and a NUC process or other type of calibration may be performed using shutterless techniques.
- a NUC process or other type of calibration using shutterless techniques may be performed in combination with shutter-based techniques.
- Infrared imaging module 100 and host device 102 may be implemented in accordance with any of the various techniques set forth in U.S. Provisional Patent
- the components of host device 102 and/or infrared imaging module 100 may be implemented as a local or distributed system with components in communication with each other over wired and/or wireless networks. Accordingly, the various operations identified in this disclosure may be performed by local and/or remote components as may be desired in particular implementations.
- Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure.
- the operations of Fig. 5 may be performed by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
- In block 505, infrared sensors 132 begin capturing image frames of a scene.
- the scene will be the real world environment in which host device 102 is currently located.
- shutter 105, if optionally provided, may be opened to permit infrared imaging module 100 to receive infrared radiation from the scene.
- Infrared sensors 132 may continue capturing image frames during all operations shown in Fig. 5.
- the continuously captured image frames may be used for various operations as further discussed.
- the captured image frames may be temporally filtered (e.g., in accordance with the process of block 826 further described herein with regard to Fig. 8).
- a NUC process initiating event is detected.
- the NUC process may be initiated in response to physical movement of host device 102.
- Such movement may be detected, for example, by motion sensors 194 which may be polled by a processor.
- a user may move host device 102 in a particular manner, such as by intentionally waving host device 102 back and forth in an "erase" or "swipe" movement.
- the user may move host device 102 in accordance with a predetermined speed and direction (velocity), such as in an up and down, side to side, or other pattern to initiate the NUC process.
- the use of such movements may permit the user to intuitively operate host device 102 to simulate the "erasing" of noise in captured image frames.
- In one embodiment, a NUC process may be initiated by host device 102 if motion exceeding a threshold value is detected (e.g., motion greater than expected for ordinary use). It is contemplated that any desired type of spatial translation of host device 102 may be used to initiate the NUC process.
- a NUC process may be initiated by host device 102 if a minimum time has elapsed since a previously performed NUC process.
- a NUC process may be initiated by host device 102 if infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process.
- a NUC process may be continuously initiated and repeated.
- the NUC process may be selectively initiated based on whether one or more additional conditions are met. For example, in one embodiment, the NUC process may not be performed unless a minimum time has elapsed since a previously performed NUC process. In another embodiment, the NUC process may not be performed unless infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process. Other criteria or conditions may be used in other embodiments. If appropriate criteria or conditions have been met, then the flow diagram continues to block 520.
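- The initiation logic above lends itself to a short sketch. The Python fragment below is illustrative only: the function name, inputs, and all numeric thresholds are hypothetical (the text does not specify values), and whether the time and temperature conditions gate the process individually or together varies by embodiment.

```python
def should_initiate_nuc(motion_magnitude, now_s, last_nuc_s,
                        fpa_temp_c, last_nuc_temp_c,
                        motion_threshold=2.0,
                        min_elapsed_s=300.0,
                        min_temp_delta_c=1.0):
    """Decide whether to initiate a NUC process.

    An initiating event (here, motion exceeding a threshold) starts
    the decision; additional conditions may gate the process, such as
    a minimum elapsed time and/or a minimum temperature change since
    the previously performed NUC process.  All defaults are assumed
    placeholders, not values from the patent.
    """
    triggered = motion_magnitude > motion_threshold
    enough_time = (now_s - last_nuc_s) >= min_elapsed_s
    enough_drift = abs(fpa_temp_c - last_nuc_temp_c) >= min_temp_delta_c
    # Depending on the embodiment, either or both gating conditions
    # may be required; here both must hold.
    return triggered and enough_time and enough_drift
```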
- blurred image frames may be used to determine NUC terms which may be applied to captured image frames to correct for FPN.
- the blurred image frames may be obtained by accumulating multiple image frames of a moving scene (e.g., captured while the scene and/or the thermal imager is in motion).
- the blurred image frames may be obtained by defocusing an optical element or other component of the thermal imager. Accordingly, in block 520 a choice of either approach is provided. If the motion- based approach is used, then the flow diagram continues to block 525. If the defocus-based approach is used, then the flow diagram continues to block 530.
- In block 525, motion is detected.
- motion may be detected based on the image frames captured by infrared sensors 132.
- For example, an appropriate motion detection process (e.g., an image registration process, a frame-to-frame difference calculation, or other appropriate process) may be applied to the captured image frames to determine whether motion is present.
- it can be determined whether pixels or regions around the pixels of consecutive image frames have changed more than a user defined amount (e.g., a percentage and/or threshold value). If at least a given percentage of pixels have changed by at least the user defined amount, then motion will be detected with sufficient certainty to proceed to block 535.
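- As a concrete reading of the frame-to-frame difference test just described, the sketch below declares motion when at least a given fraction of pixels change by more than a user defined amount. Both parameter defaults are hypothetical placeholders.

```python
import numpy as np

def motion_detected(prev_frame, curr_frame,
                    pixel_delta=50, changed_fraction=0.10):
    """Frame-to-frame difference motion test: motion is declared when
    at least `changed_fraction` of the pixels change by more than
    `pixel_delta` counts between consecutive image frames."""
    delta = np.abs(curr_frame.astype(np.int32) -
                   prev_frame.astype(np.int32))
    changed = np.count_nonzero(delta > pixel_delta)
    return changed >= changed_fraction * delta.size
```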
- motion may be determined on a per pixel basis, wherein only pixels that exhibit significant changes are accumulated to provide the blurred image frame.
- counters may be provided for each pixel and used to ensure that the same number of pixel values are accumulated for each pixel, or used to average the pixel values based on the number of pixel values actually accumulated for each pixel.
- Other types of image-based motion detection may be performed such as performing a Radon transform.
- motion may be detected based on data provided by motion sensors 194.
- motion detection may include detecting whether host device 102 is moving along a relatively straight trajectory through space. For example, if host device 102 is moving along a relatively straight trajectory, then it is possible that certain objects appearing in the imaged scene may not be sufficiently blurred (e.g., objects in the scene that may be aligned with or moving substantially parallel to the straight trajectory).
- the motion detected by motion sensors 194 may be conditioned on host device 102 exhibiting, or not exhibiting, particular trajectories.
- both a motion detection process and motion sensors 194 may be used.
- a determination can be made as to whether or not each image frame was captured while at least a portion of the scene and host device 102 were in motion relative to each other (e.g., which may be caused by host device 102 moving relative to the scene, at least a portion of the scene moving relative to host device 102, or both).
- the image frames for which motion was detected may exhibit some secondary blurring of the captured scene (e.g., blurred thermal image data associated with the scene) due to the thermal time constants of infrared sensors 132 (e.g., microbolometer thermal time constants) interacting with the scene movement.
- In block 535, image frames for which motion was detected are accumulated. For example, if motion is detected for a continuous series of image frames, then the image frames of the series may be accumulated. As another example, if motion is detected for only some image frames, then the non-moving image frames may be skipped and not included in the accumulation. Thus, a continuous or discontinuous set of image frames may be selected to be accumulated based on the detected motion.
- the accumulated image frames are averaged to provide a blurred image frame. Because the accumulated image frames were captured during motion, it is expected that actual scene information will vary between the image frames and thus cause the scene information to be further blurred in the resulting blurred image frame (block 545).
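- A minimal sketch of the accumulation (block 535) and averaging (block 545) steps, using the per-pixel counters mentioned earlier so that pixels accumulated a different number of times are still averaged correctly; names and structure are illustrative, not taken from the patent.

```python
import numpy as np

def blurred_frame_from_motion(frames, motion_flags):
    """Accumulate frames captured during motion and average them to
    provide the blurred image frame.  `frames` is an iterable of 2-D
    arrays; `motion_flags` marks the frames for which motion was
    detected."""
    acc = None
    count = None
    for frame, moving in zip(frames, motion_flags):
        if not moving:
            continue  # skip non-moving frames (discontinuous set)
        f = frame.astype(np.float64)
        if acc is None:
            acc = np.zeros_like(f)
            count = np.zeros_like(f)
        acc += f
        count += 1.0  # per-pixel counter (could be selective per pixel)
    if acc is None:
        raise ValueError("no moving frames were accumulated")
    # Average by the number of values actually accumulated per pixel.
    return acc / np.maximum(count, 1.0)
```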
- However, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain fixed over at least short periods of time and over at least limited changes in scene irradiance during motion.
- image frames captured in close proximity in time and space during motion will suffer from identical or at least very similar FPN.
- scene information may change in consecutive image frames, the FPN will stay essentially constant.
- multiple image frames captured during motion will blur the scene information, but will not blur the FPN.
- FPN will remain more clearly defined in the blurred image frame provided in block 545 than the scene information.
- In block 530, a defocus operation may be performed to intentionally defocus the image frames captured by infrared sensors 132.
- one or more actuators 199 may be used to adjust, move, or otherwise translate optical element 180, infrared sensor assembly 128, and/or other components of infrared imaging module 100 to cause infrared sensors 132 to capture a blurred (e.g., unfocused) image frame of the scene.
- Other non-actuator based techniques are also contemplated for intentionally defocusing infrared image frames such as, for example, manual (e.g., user-initiated) defocusing.
- However, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain unaffected by the defocusing operation.
- a blurred image frame of the scene will be provided (block 545) with FPN remaining more clearly defined in the blurred image than the scene information.
- the defocus-based approach has been described with regard to a single captured image frame.
- the defocus-based approach may include accumulating multiple image frames while the infrared imaging module 100 has been defocused and averaging the defocused image frames to remove the effects of temporal noise and provide a blurred image frame in block 545.
- a blurred image frame may be provided in block 545 by either the motion-based approach or the defocus-based approach. Because much of the scene information will be blurred by either motion, defocusing, or both, the blurred image frame may be effectively considered a low pass filtered version of the original captured image frames with respect to scene information.
- the blurred image frame is processed to determine updated row and column FPN terms (e.g., if row and column FPN terms have not been previously determined then the updated row and column FPN terms may be new row and column FPN terms in the first iteration of block 550).
- the terms row and column may be used interchangeably depending on the orientation of infrared sensors 132 and/or other components of infrared imaging module 100.
- block 550 includes determining a spatial FPN correction term for each row of the blurred image frame (e.g., each row may have its own spatial FPN correction term), and also determining a spatial FPN correction term for each column of the blurred image frame (e.g., each column may have its own spatial FPN correction term).
- Such processing may be used to reduce the spatial and slowly varying (1/f) row and column FPN inherent in thermal imagers caused by, for example, 1/f noise characteristics of amplifiers in ROIC 402 which may manifest as vertical and horizontal stripes in image frames.
- row and column FPN terms may be determined by considering differences between neighboring pixels of the blurred image frame.
- Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure. Specifically, in Fig. 6 a pixel 610 is compared to its 8 nearest horizontal neighbors: d0-d3 on one side and d4-d7 on the other side. Differences between the neighbor pixels can be averaged to obtain an estimate of the offset error of the illustrated group of pixels. An offset error may be calculated for each pixel in a row or column and the average result may be used to correct the entire row or column.
- threshold values may be used (thPix and -thPix). Pixel values falling outside these threshold values (pixels d1 and d4 in this example) are not used to obtain the offset error.
- the maximum amount of row and column FPN correction may be limited by these threshold values.
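- To make the Fig. 6 procedure concrete, the sketch below estimates one offset term per column from clipped horizontal neighbor differences (row terms would be computed analogously with vertical neighbors). The reach of four neighbors per side follows d0-d7 above; the clipping style and the default thPix value are assumptions.

```python
import numpy as np

def column_fpn_terms(blurred, th_pix=20.0, reach=4):
    """Estimate one offset error (column FPN term) per column of the
    blurred image frame from differences to horizontal neighbors
    (d0-d3 and d4-d7 in Fig. 6), discarding differences whose
    magnitude exceeds th_pix."""
    h, w = blurred.shape
    b = blurred.astype(np.float64)
    sums = np.zeros(w)
    counts = np.zeros(w)
    for k in range(1, reach + 1):
        # Difference of each pixel to its neighbor k columns away,
        # taken on both sides.
        for cols, nbrs in ((slice(k, w), slice(0, w - k)),
                           (slice(0, w - k), slice(k, w))):
            d = b[:, cols] - b[:, nbrs]
            valid = np.abs(d) <= th_pix   # discard values beyond thPix
            sums[cols] += np.where(valid, d, 0.0).sum(axis=0)
            counts[cols] += valid.sum(axis=0)
    # Average offset error per column; the correction subtracts this.
    return sums / np.maximum(counts, 1.0)
```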
- the updated row and column FPN terms determined in block 550 are stored (block 552) and applied (block 555) to the blurred image frame provided in block 545. After these terms are applied, some of the spatial row and column FPN in the blurred image frame may be reduced. However, because such terms are applied generally to rows and columns, additional FPN may remain, such as spatially uncorrelated FPN associated with pixel to pixel drift or other causes. Neighborhoods of spatially correlated FPN may also remain which may not be directly associated with individual rows and columns. Accordingly, further processing may be performed as discussed below to determine NUC terms.
- In block 560, local contrast values (e.g., edges or absolute values of gradients between adjacent pixels or small groups of pixels) in the blurred image frame are determined. If scene information in the blurred image frame includes contrasting areas that have not been significantly blurred (e.g., high contrast edges in the original scene data), then such features may be identified by a contrast determination process.
- For example, local contrast values in the blurred image frame may be calculated, or any other desired type of edge detection process may be applied to identify certain pixels in the blurred image as being part of an area of local contrast. Pixels that are marked in this manner may be considered as containing excessive high spatial frequency scene information that would be interpreted as FPN (e.g., such regions may correspond to portions of the scene that have not been sufficiently blurred). As such, these pixels may be excluded from being used in the further determination of NUC terms.
- contrast detection processing may rely on a threshold that is higher than the expected contrast value associated with FPN (e.g., pixels exhibiting a contrast value higher than the threshold may be considered to be scene information, and those lower than the threshold may be considered to be exhibiting FPN).
- the contrast determination of block 560 may be performed on the blurred image frame after row and column FPN terms have been applied to the blurred image frame (e.g., as shown in Fig. 5). In another embodiment, block 560 may be performed prior to block 550 to determine contrast before row and column FPN terms are determined (e.g., to prevent scene based contrast from contributing to the determination of such terms).
- any high spatial frequency content remaining in the blurred image frame may be generally attributed to spatially uncorrelated FPN.
- much of the other noise or actual desired scene based information has been removed or excluded from the blurred image frame due to: intentional blurring of the image frame (e.g., by motion or defocusing in blocks 520 through 545), application of row and column FPN terms (block 555), and contrast determination (block 560).
- Thus, any remaining high spatial frequency content (e.g., exhibited as areas of contrast or differences in the blurred image frame) may be attributed to spatially uncorrelated FPN.
- In block 565, the blurred image frame is high pass filtered.
- this may include applying a high pass filter to extract the high spatial frequency content from the blurred image frame.
- this may include applying a low pass filter to the blurred image frame and taking a difference between the low pass filtered image frame and the unfiltered blurred image frame to obtain the high spatial frequency content.
- a high pass filter may be implemented by calculating a mean difference between a sensor signal (e.g., a pixel value) and its neighbors.
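- Either formulation of the high pass filter is compact in code; the sketch below takes the low pass plus subtraction form with a simple box filter, whose kernel size is an assumed example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def high_pass(blurred, kernel=3):
    """Low pass filter the blurred frame and subtract the result from
    the unfiltered frame, leaving the high spatial frequency content
    (equivalent in spirit to taking the mean difference between each
    pixel and its neighbors)."""
    f = blurred.astype(np.float64)
    low = uniform_filter(f, size=kernel)
    return f - low
```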
- a flat field correction process is performed on the high pass filtered blurred image frame to determine updated NUC terms (e.g., if a NUC process has not previously been performed then the updated NUC terms may be new NUC terms in the first iteration of block 570).
- Fig. 7 illustrates a flat field correction technique 700 in accordance with an embodiment of the disclosure.
- a NUC term may be determined for each pixel 710 of the blurred image frame using the values of its neighboring pixels 712 to 726.
- several gradients may be determined based on the absolute difference between the values of various adjacent pixels. For example, absolute value differences may be determined between: pixels 712 and 714 (a left to right diagonal gradient), pixels 716 and 718 (a top to bottom vertical gradient), pixels 720 and 722 (a right to left diagonal gradient), and pixels 724 and 726 (a left to right horizontal gradient).
- a weight value may be determined for pixel 710 that is inversely proportional to the summed gradient. This process may be performed for all pixels 710 of the blurred image frame until a weight value is provided for each pixel 710. For areas with low gradients (e.g., areas that are blurry or have low contrast), the weight value will be close to one. Conversely, for areas with high gradients, the weight value will be zero or close to zero.
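- One possible realization of this weight computation: sum the four absolute gradients around each pixel and map low gradients to weights near one and high gradients to weights near zero. The unit-distance neighbor offsets and the soft normalization constant are assumptions for illustration.

```python
import numpy as np

def ffc_weights(hp, grad_scale=50.0):
    """Per-pixel weight inversely related to the summed local gradient
    (Fig. 7); `hp` is the high pass filtered blurred image frame."""
    g = np.zeros_like(hp, dtype=np.float64)
    # Four gradients around each interior pixel: two diagonal, one
    # vertical, one horizontal (cf. pixels 712-726 in Fig. 7).
    g[1:-1, 1:-1] = (
        np.abs(hp[:-2, :-2] - hp[2:, 2:]) +     # left-to-right diagonal
        np.abs(hp[:-2, 1:-1] - hp[2:, 1:-1]) +  # top-to-bottom vertical
        np.abs(hp[:-2, 2:] - hp[2:, :-2]) +     # right-to-left diagonal
        np.abs(hp[1:-1, :-2] - hp[1:-1, 2:])    # horizontal
    )
    # Weight near 1 in blurry/low contrast areas, near 0 where
    # gradients are high, i.e. inversely related to the summed gradient.
    return 1.0 / (1.0 + g / grad_scale)
```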
- the update to the NUC term as estimated by the high pass filter is multiplied by the weight value.
- the risk of introducing scene information into the NUC terms can be further reduced by applying some amount of temporal damping to the NUC term determination process.
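- Combining the weight with the damping, one common formulation (an assumption here; this section does not spell out the formula) stores a damped blend of the old NUC term and the weight-scaled high pass estimate:

```python
def update_nuc_terms(nuc_old, hp_estimate, weights, damping=0.9):
    """Multiply the high pass update by the per-pixel weight, then
    temporally damp the result.  With `damping` near 1, the stored NUC
    terms change only slowly, reducing the risk of scene information
    leaking into the terms.  Inputs are NumPy arrays of equal shape."""
    update = weights * hp_estimate
    return damping * nuc_old + (1.0 - damping) * (nuc_old + update)
```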
- Although NUC terms have been described with regard to gradients, local contrast values may be used instead where appropriate. Other techniques may also be used such as, for example, standard deviation calculations. Other types of flat field correction processes may be performed to determine NUC terms including, for example, various processes identified in U.S. Patent No. 6,028,309 issued February 22, 2000, U.S. Patent No. 6,812,465 issued November 2, 2004, and U.S. Patent Application No. 12/114,865 filed May 5, 2008, which are incorporated herein by reference in their entirety.
- block 570 may include additional processing of the NUC terms.
- the sum of all NUC terms may be normalized to zero by subtracting the NUC term mean from each NUC term.
- the mean value of each row and column may be subtracted from the NUC terms for each row and column.
- row and column FPN filters using the row and column FPN terms determined in block 550 may be better able to filter out row and column noise in further iterations (e.g., as further shown in Fig. 8) after the NUC terms are applied to captured images (e.g., in block 580 further discussed herein).
- the row and column FPN filters may in general use more data to calculate the per row and per column offset coefficients (e.g., row and column FPN terms) and may thus provide a more robust alternative for reducing spatially correlated FPN than the NUC terms which are based on high pass filtering to capture spatially uncorrelated noise.
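- The normalizations just described reduce to a few mean subtractions; as a sketch:

```python
def normalize_nuc_terms(nuc):
    """Optional post-processing of block 570: normalize the sum of all
    NUC terms to zero, then remove per-row and per-column means so
    that row/column structure is left to the dedicated row and column
    FPN filters.  `nuc` is a 2-D NumPy array of NUC terms."""
    nuc = nuc - nuc.mean()                        # overall mean -> 0
    nuc = nuc - nuc.mean(axis=1, keepdims=True)   # per-row means
    nuc = nuc - nuc.mean(axis=0, keepdims=True)   # per-column means
    return nuc
```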
- additional high pass filtering and further determinations of updated NUC terms may be optionally performed to remove spatially correlated FPN with lower spatial frequency than previously removed by row and column FPN terms.
- some variability in infrared sensors 132 or other components of infrared imaging module 100 may result in spatially correlated FPN noise that cannot be easily modeled as row or column noise.
- Such spatially correlated FPN may include, for example, window defects on a sensor package or a cluster of infrared sensors 132 that respond differently to irradiance than neighboring infrared sensors 132.
- such spatially correlated FPN may be mitigated with an offset correction.
- the noise may also be detectable in the blurred image frame. Since this type of noise may affect a neighborhood of pixels, a high pass filter with a small kernel may not detect the FPN in the neighborhood (e.g., all values used in high pass filter may be taken from the neighborhood of affected pixels and thus may be affected by the same offset error). For example, if the high pass filtering of block 565 is performed with a small kernel (e.g., considering only immediately adjacent pixels that fall within a neighborhood of pixels affected by spatially correlated FPN), then broadly distributed spatially correlated FPN may not be detected.
- Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure.
- a neighborhood of pixels 1110 may exhibit spatially correlated FPN that is not precisely correlated to individual rows and columns and is distributed over a neighborhood of several pixels (e.g., a neighborhood of approximately 4 by 4 pixels in this example).
- Sample image frame 1100 also includes a set of pixels 1120 exhibiting substantially uniform response that are not used in filtering calculations, and a set of pixels 1130 that are used to estimate a low pass value for the neighborhood of pixels 1110.
- pixels 1130 may be a number of pixels divisible by two in order to facilitate efficient hardware or software calculations.
- additional high pass filtering and further determinations of updated NUC terms may be optionally performed to remove spatially correlated FPN such as exhibited by pixels 1110.
- the updated NUC terms determined in block 570 are applied to the blurred image frame.
- the blurred image frame will have been initially corrected for spatially correlated FPN (e.g., by application of the updated row and column FPN terms in block 555), and also initially corrected for spatially uncorrelated FPN (e.g., by application of the updated NUC terms applied in block 571).
- a further high pass filter is applied with a larger kernel than was used in block 565, and further updated NUC terms may be determined in block 573.
- the high pass filter applied in block 572 may include data from a sufficiently large neighborhood of pixels such that differences can be determined between unaffected pixels (e.g., pixels 1120) and affected pixels (e.g., pixels 1110).
- a low pass filter with a large kernel can be used (e.g., an N by N kernel that is much greater than 3 by 3 pixels) and the results may be subtracted to perform appropriate high pass filtering.
- a sparse kernel may be used such that only a small number of neighboring pixels inside an N by N neighborhood are used.
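One possible sketch of both variants, using a box low pass for the full kernel and a handful of rim samples for the sparse kernel; the kernel size N = 15, the sample positions, and the use of scipy.ndimage are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def large_kernel_highpass(blurred, n=15):
    """High pass with a large kernel (block 572 sketch): subtract an
    N by N box low pass from the image. A large N spans beyond a
    spatially correlated FPN neighborhood (e.g., pixels 1110), so the
    offset of affected pixels shows up against unaffected neighbors."""
    return blurred - uniform_filter(blurred.astype(np.float64), size=n)

def sparse_kernel_highpass(blurred, n=15):
    """Sparse-kernel variant: estimate the low pass from only eight
    samples on the rim of the N by N neighborhood, trading accuracy
    for computation. The sample positions are an assumed choice."""
    h = n // 2
    b = np.pad(blurred.astype(np.float64), h, mode="edge")
    rows, cols = blurred.shape
    offsets = [(-h, -h), (-h, 0), (-h, h), (0, -h),
               (0, h), (h, -h), (h, 0), (h, h)]
    low = sum(b[h + dr:h + dr + rows, h + dc:h + dc + cols]
              for dr, dc in offsets) / len(offsets)
    return blurred - low
```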
- the temporal damping factor λ may be set close to 1 for updated NUC terms determined in block 573.
- blocks 571-573 may be repeated (e.g., cascaded) to iteratively perform high pass filtering with increasing kernel sizes to provide further updated NUC terms that further correct for spatially correlated FPN of desired neighborhood sizes.
- the decision to perform such iterations may be determined by whether spatially correlated FPN has actually been removed by the updated NUC terms of the previous performance of blocks 571-573.
- thresholding criteria may be applied to individual pixels to determine which pixels receive updated NUC terms.
- the threshold values may correspond to differences between the newly calculated NUC terms and previously calculated NUC terms.
- the threshold values may be independent of previously calculated NUC terms. Other tests may be applied (e.g., spatial correlation tests) to determine whether the NUC terms should be applied. If the NUC terms are deemed spurious or unlikely to provide meaningful correction, then the flow diagram returns to block 505. Otherwise, the newly determined NUC terms are stored (block 575) to replace previous NUC terms (e.g., determined by a previously performed iteration of Fig. 5) and applied (block 580) to captured image frames.
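A sketch of such per-pixel thresholding and a simple spurious-update test; the threshold values and the acceptance heuristic are illustrative assumptions, not values taken from the disclosure:

```python
import numpy as np

def accept_nuc_terms(nuc_new, nuc_prev, max_delta=0.05, abs_limit=0.5):
    """Per-pixel thresholding of newly calculated NUC terms (sketch).

    `max_delta` bounds the change from previously calculated terms and
    `abs_limit` bounds the new terms independently of previous values.
    Returns a mask of pixels that receive updated terms and an overall
    flag; a False flag corresponds to returning to block 505.
    """
    mask = (np.abs(nuc_new - nuc_prev) < max_delta) & (np.abs(nuc_new) < abs_limit)
    meaningful = mask.mean() > 0.5  # assumed spurious-update heuristic
    return mask, meaningful
```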
- Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline 800 in accordance with an embodiment of the disclosure.
- pipeline 800 identifies various operations of Fig. 5 in the context of an overall iterative image processing scheme for correcting image frames provided by infrared imaging module 100.
- pipeline 800 may be provided by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
- Image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 with an improved signal to noise ratio.
- Frame averager 804 may be effectively provided by infrared sensors 132, ROIC 402, and other components of infrared sensor assembly 128 that are implemented to support high image capture rates. For example, in one embodiment, infrared sensor assembly 128 may capture infrared image frames at a frame rate of 240 Hz (e.g., 240 images per second).
- a high frame rate may be implemented, for example, by operating infrared sensor assembly 128 at relatively low voltages (e.g., compatible with mobile telephone voltages) and by using a relatively small array of infrared sensors 132 (e.g., an array of 64 by 64 infrared sensors in one embodiment).
- such infrared image frames may be provided from infrared sensor assembly 128 to processing module 160 at a high frame rate (e.g., 240 Hz or other frame rates).
- infrared sensor assembly 128 may integrate over longer time periods, or multiple time periods, to provide integrated (e.g., averaged) infrared image frames to processing module 160 at a lower frame rate (e.g., 30 Hz, 9 Hz, or other frame rates). Further information regarding implementations that may be used to provide high image capture rates may be found in U.S. Provisional Patent Application No.
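The frame averaging just described can be sketched as follows, assuming a fixed integer integration ratio (e.g., eight 240 Hz frames per 30 Hz output frame); the function name and the ratio are illustrative:

```python
import numpy as np

def frame_average(high_rate_frames, ratio=8):
    """Frame averager 804 sketch: integrate groups of `ratio` high-rate
    frames into one output frame (e.g., 240 Hz / 8 = 30 Hz). Averaging k
    frames of uncorrelated noise improves SNR by roughly sqrt(k)."""
    frames = np.asarray(high_rate_frames, dtype=np.float64)
    n = (len(frames) // ratio) * ratio       # drop any incomplete group
    groups = frames[:n].reshape(-1, ratio, *frames.shape[1:])
    return groups.mean(axis=1)

# Example: one second of 240 Hz frames from a 64 by 64 array -> 30 frames.
out = frame_average(np.zeros((240, 64, 64)))
assert out.shape == (30, 64, 64)
```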
- Image frames 802 proceed through pipeline 800 where they are adjusted by various terms, temporally filtered, used to determine the various adjustment terms, and gain compensated.
- factory gain terms 812 and factory offset terms 816 are applied to image frames 802 to compensate for gain and offset differences, respectively, between the various infrared sensors 132 and/or other components of infrared imaging module 100 determined during manufacturing and testing.
- NUC terms 817 are applied to image frames 802 to correct for FPN as discussed.
- if NUC terms 817 have not yet been determined (e.g., before a NUC process has been initiated), then block 580 may not be performed or initialization values may be used for NUC terms 817 that result in no alteration to the image data (e.g., offsets for every pixel would be equal to zero).
- column FPN terms 820 and row FPN terms 824 are applied to image frames 802.
- Column FPN terms 820 and row FPN terms 824 may be determined in accordance with block 550 as discussed. In one embodiment, if the column FPN terms 820 and row FPN terms 824 have not yet been determined (e.g., before a NUC process has been initiated), then blocks 818 and 822 may not be performed or initialization values may be used for the column FPN terms 820 and row FPN terms 824 that result in no alteration to the image data (e.g., offsets for every pixel would be equal to zero).
- temporal filtering is performed on image frames 802 in accordance with a temporal noise reduction (TNR) process.
- Fig. 9 illustrates a TNR process in accordance with an embodiment of the disclosure.
- Image frames 802a and 802b include local neighborhoods of pixels 803a and 803b centered around pixels 805a and 805b, respectively. Neighborhoods 803a and 803b correspond to the same locations within image frames 802a and 802b and are subsets of the total pixels in image frames 802a and 802b.
- neighborhoods 803a and 803b include areas of 5 by 5 pixels. Other neighborhood sizes may be used in other embodiments.
- differences between corresponding pixels of neighborhoods 803a and 803b may be determined and averaged to provide an averaged delta value 805c. Averaged delta value 805c may be used to determine weight values in block 807 to be applied to pixels 805a and 805b of image frames 802a and 802b.
- the weight values determined in block 807 may be inversely proportional to averaged delta value 805c such that weight values drop rapidly towards zero when there are large differences between neighborhoods 803a and 803b.
- large differences between neighborhoods 803a and 803b may indicate that changes have occurred within the scene (e.g., due to motion), and pixels 805a and 805b may be appropriately weighted, in one embodiment, to avoid introducing blur across frame-to-frame scene changes.
- Other associations between weight values and averaged delta value 805c may be used in various embodiments.
- the weight values determined in block 807 may be applied to pixels 805a and 805b to determine a value for corresponding pixel 805e of image frame 802e (block 811).
- pixel 805e may have a value that is a weighted average (or other combination) of pixels 805a and 805b, depending on averaged delta value 805c and the weight values determined in block 807.
- pixel 805e of temporally filtered image frame 802e may be a weighted sum of pixels 805a and 805b of image frames 802a and 802b. If the average difference between pixels 805a and 805b is due to noise, then it may be expected that the average change between neighborhoods 803a and 803b will be close to zero (e.g., corresponding to the average of uncorrelated changes). Under such circumstances, it may be expected that the sum of the differences between neighborhoods 803a and 803b will be close to zero. In this case, pixels 805a and 805b may both be appropriately weighted so as to contribute to the value of pixel 805e.
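A compact sketch of this TNR combination; the uniform 5 by 5 low pass used to form averaged delta value 805c, the 1/(1+delta) weight mapping, and the `scale` parameter are assumptions chosen to match the qualitative behavior described above:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def tnr(frame_a, frame_b, neighborhood=5, scale=0.05):
    """Temporal noise reduction sketch (Fig. 9 / block 826).

    frame_a: presently received image frame 802a.
    frame_b: previously temporally filtered image frame 802b.
    """
    a = frame_a.astype(np.float64)
    b = frame_b.astype(np.float64)
    # Averaged delta value 805c: per-pixel differences averaged over
    # local neighborhoods (803a/803b).
    delta = uniform_filter(np.abs(a - b), size=neighborhood)
    # Weights inversely related to the delta: ~1 where neighborhoods
    # agree (noise-dominated), dropping toward 0 where they differ
    # (scene change / motion). `scale` is an assumed tuning constant.
    w = 1.0 / (1.0 + delta / scale)
    # Where w ~ 1, pixels 805a and 805b both contribute (averaging
    # suppresses temporal noise); where w ~ 0, the new frame dominates
    # to avoid blur across frame-to-frame scene changes.
    return w * 0.5 * (a + b) + (1.0 - w) * a
```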
- averaged delta value 805c has been described as being determined based on neighborhoods 803a and 803b; in other embodiments, averaged delta value 805c may be determined based on any desired criteria (e.g., based on individual pixels or other types of groups or sets of pixels).
- image frame 802a has been described as a presently received image frame and image frame 802b has been described as a previously temporally filtered image frame.
- image frames 802a and 802b may be first and second image frames captured by infrared imaging module 100 that have not been temporally filtered.
- Fig. 10 illustrates further implementation details in relation to the TNR process of block 826.
- image frames 802a and 802b may be read into line buffers 1010a and 1010b, respectively, and image frame 802b (e.g., the previous image frame) may be stored in a frame buffer 1020 before being read into line buffer 1010b.
- line buffers 1010a-b and frame buffer 1020 may be implemented by a block of random access memory (RAM) provided by any appropriate component of infrared imaging module 100 and/or host device 102.
- image frame 802e may be passed to an automatic gain compensation block 828 for further processing to provide a result image frame 830 that may be used by host device 102 as desired.
- Fig. 8 further illustrates various operations that may be performed to determine row and column FPN terms and NUC terms as discussed.
- these operations may use image frames 802e as shown in Fig. 8. Because image frames 802e have already been temporally filtered, at least some temporal noise may be removed and thus will not inadvertently affect the determination of row and column FPN terms 824 and 820 and NUC terms 817. In another embodiment, non-temporally filtered image frames 802 may be used.
- a NUC process may be selectively initiated and performed in response to various NUC process initiating events and based on various criteria or conditions.
- the NUC process may be performed in accordance with a motion-based approach (blocks 525, 535, and 540) or a defocus-based approach (block 530) to provide a blurred image frame (block 545).
- Fig. 8 further illustrates various additional blocks 550,
- row and column FPN terms 824 and 820 and NUC terms 817 may be determined and applied in an iterative fashion such that updated terms are determined using image frames 802 to which previous terms have already been applied. As a result, the overall process of Fig. 8 may repeatedly update and apply such terms to continuously reduce the noise in image frames 830 to be used by host device 102.
- blocks 525, 535, and 540 are shown as operating at the normal frame rate of image frames 802 received by pipeline 800.
- the determination made in block 525 is represented as a decision diamond used to determine whether a given image frame 802 has changed sufficiently that, if added to other image frames, it will enhance the blur and is therefore accumulated (block 535 is represented by an arrow in this embodiment) and averaged (block 540).
- the determination of column FPN terms 820 (block 550) is shown as operating at an update rate that in this example is 1/32 of the sensor frame rate (e.g., normal frame rate) due to the averaging performed in block 540. Other update rates may be used in other embodiments. Although only column FPN terms 820 are identified in Fig. 10, row FPN terms 824 may be implemented in a similar fashion at the reduced frame rate.
- Fig. 10 also illustrates further implementation details in relation to the NUC determination process of block 570.
- the blurred image frame may be read to a line buffer 1030 (e.g., implemented by a block of RAM provided by any appropriate component of infrared imaging module 100 and/or host device 102).
- the flat field correction technique 700 of Fig. 7 may be performed on the blurred image frame to determine updated NUC terms that correct for FPN such as spatially correlated row and column FPN and spatially uncorrelated FPN.
- the rate at which row and column FPN terms and/or NUC terms are updated can be inversely proportional to the estimated amount of blur in the blurred image frame and/or inversely proportional to the magnitude of local contrast values (e.g., determined in block 560).
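As a sketch, such rate modulation might look like the following; the combining formula and the constant k are assumptions, since the disclosure only specifies the inverse-proportional relationships:

```python
def term_update_rate(base_rate_hz, blur_estimate, local_contrast, k=1.0):
    """Sketch of modulating the FPN/NUC term update rate, inversely
    proportional to the estimated amount of blur and to the magnitude
    of local contrast values (block 560). `k` and the exact combination
    are assumed for illustration."""
    return base_rate_hz * k / (max(blur_estimate, 1e-6) * max(local_contrast, 1e-6))
```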
- the described techniques may provide advantages over conventional shutter-based noise correction techniques.
- for example, these techniques may eliminate the need for a shutter (e.g., such as shutter 105) for noise correction, thus permitting reductions in size, weight, cost, and mechanical complexity.
- Power and maximum voltage supplied to, or generated by, infrared imaging module 100 may also be reduced if a shutter does not need to be mechanically operated. Reliability will be improved by removing the shutter as a potential point of failure.
- a shutterless process also eliminates potential image interruption caused by the temporary blockage of the imaged scene by a shutter.
- noise correction may be performed on image frames that have irradiance levels similar to those of the actual scene desired to be imaged. This can improve the accuracy and effectiveness of noise correction terms determined in accordance with the various described techniques.
- infrared imaging module 100 may be configured to operate at low voltage levels.
- infrared imaging module 100 may be implemented with circuitry configured to operate at low power and/or in accordance with other parameters that permit infrared imaging module 100 to be conveniently and effectively implemented in various types of host devices 102, such as mobile devices and other devices.
- Fig. 12 illustrates a block diagram of another implementation of infrared sensor assembly 128 including infrared sensors 132 and an LDO 1220 in accordance with an embodiment of the disclosure.
- Fig. 12 also illustrates various components 1202, 1204, 1205, 1206, 1208, and 1210 which may be implemented in the same or similar manner as corresponding components previously described with regard to Fig. 4.
- Fig. 12 also illustrates bias correction circuitry 1212 which may be used to adjust one or more bias voltages provided to infrared sensors 132 (e.g., to compensate for temperature changes, self-heating, and/or other factors).
- LDO 1220 may be provided as part of infrared sensor assembly 128 (e.g., on the same chip and/or wafer level package as the ROIC).
- LDO 1220 may be provided as part of an FPA with infrared sensor assembly 128.
- such implementations may reduce power supply noise introduced to infrared sensor assembly 128 and thus provide an improved PSRR.
- by implementing the LDO with the ROIC, less die area may be consumed and fewer discrete die (or chips) are needed.
- LDO 1220 receives an input voltage provided by a power source 1230 over a supply line 1232.
- LDO 1220 provides an output voltage to various components of infrared sensor assembly 128 over supply lines 1222.
- LDO 1220 may provide substantially identical regulated output voltages to various components of infrared sensor assembly 128 in response to a single input voltage received from power source 1230.
- power source 1230 may provide an input voltage in a range of approximately 2.8 volts to approximately 11 volts.
- LDO 1220 may provide an output voltage in a range of approximately 1.5 volts to approximately 2.8 volts.
- LDO 1220 may be used to provide a consistent regulated output voltage, regardless of whether power source 1230 is implemented with a conventional voltage range of approximately 9 volts to approximately 11 volts, or a low voltage such as approximately 2.8 volts. As such, although various voltage ranges are provided for the input and output voltages, it is contemplated that the output voltage of LDO 1220 will remain fixed despite changes in the input voltage.
- the implementation of LDO 1220 as part of infrared sensor assembly 128 provides various advantages over conventional power implementations for FPAs.
- conventional FPAs typically rely on multiple power sources, each of which may be provided separately to the FPA, and separately distributed to the various components of the FPA.
- appropriate voltages may be separately provided (e.g., to reduce possible noise) to all components of infrared sensor assembly 128 with reduced complexity.
- LDO 1220 also allows infrared sensor assembly 128 to operate in a consistent manner, even if the input voltage from power source 1230 changes (e.g., if the input voltage increases or decreases as a result of charging or discharging a battery or other type of device used for power source 1230).
- infrared sensor assembly 128 shown in Fig. 12 may also be implemented to operate at lower voltages than conventional devices.
- LDO 1220 may be implemented to provide a low voltage (e.g., approximately 2.5 volts).
- LDO 1220 may reduce or eliminate the need for a separate negative reference voltage to be provided to infrared sensor assembly 128.
- Fig. 13 illustrates a circuit diagram of a portion of infrared sensor assembly 128 of Fig. 12 in accordance with an embodiment of the disclosure.
- Fig. 13 illustrates additional components of bias correction circuitry 1212 (e.g., components 1326, 1330, 1332, 1334, 1336, 1338, and 1341) connected to LDO 1220 and infrared sensors 132.
- bias correction circuitry 1212 may be used to compensate for temperature-dependent changes in bias voltages in accordance with an embodiment of the present disclosure.
- the operation of such additional components may be further understood with reference to similar components identified in U.S. Patent No.
- Infrared sensor assembly 128 may also be implemented in accordance with the various components identified in U.S. Patent No. 6,812,465 issued November 2, 2004 which is hereby incorporated by reference in its entirety.
- bias correction circuitry 1212 may be implemented on a global array basis as shown in Fig. 13 (e.g., used for all infrared sensors 132 collectively in an array). In other embodiments, some or all of the bias correction circuitry 1212 may be implemented on an individual sensor basis (e.g., entirely or partially duplicated for each infrared sensor 132). In some embodiments, bias correction circuitry 1212 and other components of Fig. 13 may be implemented as part of ROIC 1202.
- LDO 1220 provides a load voltage Vload to bias correction circuitry 1212 along one of supply lines 1222.
- Vload may be approximately 2.5 volts which contrasts with larger voltages of approximately 9 volts to approximately 11 volts that may be used as load voltages in conventional infrared imaging devices.
- Based on Vload, bias correction circuitry 1212 provides a sensor bias voltage Vbolo at a node 1360.
- Vbolo may be distributed to one or more infrared sensors 132 through appropriate switching circuitry 1370 (e.g., represented by broken lines in Fig. 13).
- switching circuitry 1370 may be implemented in accordance with appropriate components identified in U.S. Patent Nos. 6,812,465 and 7,679,048 previously referenced herein.
- Each infrared sensor 132 includes a node 1350 which receives Vbolo through switching circuitry 1370, and another node 1352 which may be connected to ground, a substrate, and/or a negative reference voltage.
- the voltage at node 1360 may be substantially the same as Vbolo provided at nodes 1350.
- the voltage at node 1360 may be adjusted to compensate for possible voltage drops associated with switching circuitry 1370 and/or other factors.
- Vbolo may be implemented with lower voltages than are typically used for conventional infrared sensor biasing. In one embodiment, Vbolo may be in a range of approximately 0.2 volts to approximately 0.7 volts. In another embodiment, Vbolo may be in a range of approximately 0.4 volts to approximately 0.6 volts. In another embodiment, Vbolo may be approximately 0.5 volts. In contrast, conventional infrared sensors typically use bias voltages of approximately 1 volt.
- such low bias voltages permit infrared sensor assembly 128 to exhibit significantly reduced power consumption in comparison with conventional infrared imaging devices.
- the power consumption of each infrared sensor 132 scales with the square of the bias voltage.
- a reduction from, for example, 1.0 volt to 0.5 volts provides a significant reduction in power, especially when applied to many infrared sensors 132 in an infrared sensor array. This reduction in power may also result in reduced self-heating of infrared sensor assembly 128.
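A one-line check of this square-law relationship, assuming a simple resistive model (P = V^2/R) for each biased sensor:

```python
def relative_sensor_power(v_bias, v_ref=1.0):
    """Relative per-sensor power under a resistive model: P = V^2 / R,
    so power scales with the square of the bias voltage."""
    return (v_bias / v_ref) ** 2

# 0.5 V bias vs. a conventional 1.0 V bias: 25% of the power per sensor,
# a 4x reduction repeated across every sensor in the array.
print(relative_sensor_power(0.5))  # 0.25
```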
- various techniques are provided for reducing the effects of noise in image frames provided by infrared imaging devices operating at low voltages.
- noise, self-heating, and/or other phenomena may, if uncorrected, become more pronounced in image frames provided by infrared sensor assembly 128.
- when LDO 1220 maintains Vload at a low voltage in the manner described herein, Vbolo will also be maintained at its corresponding low voltage and the relative size of its output signals may be reduced. As a result, noise, self-heating, and/or other phenomena may have a greater effect on the smaller output signals read out from infrared sensors 132, resulting in variations (e.g., errors) in the output signals. If uncorrected, these variations may be exhibited as noise in the image frames.
- although low voltage operation may reduce the overall amount of certain phenomena (e.g., self-heating), the smaller output signals may permit remaining error sources (e.g., residual self-heating) to have a disproportionate effect on the output signals during low voltage operation.
- infrared sensor assembly 128, infrared imaging module 100, and/or host device 102 may be implemented with various array sizes, frame rates, and/or frame averaging techniques.
- infrared sensors 132 may be implemented with array sizes ranging from 32 by 32 to 160 by 120 infrared sensors 132.
- Other example array sizes include 80 by 64, 80 by 60, 64 by 64, and 64 by 32. Any desired array size may be used.
- infrared sensor assembly 128 may provide image frames at relatively high frame rates without requiring significant changes to ROIC and related circuitry.
- frame rates may range from approximately 120 Hz to approximately 480 Hz.
- the array size and the frame rate may be scaled relative to each other (e.g., in an inversely proportional manner or otherwise) such that larger arrays are implemented with lower frame rates, and smaller arrays are implemented with higher frame rates.
- an array of 160 by 120 may provide a frame rate of approximately 120 Hz.
- an array of 80 by 60 may provide a correspondingly higher frame rate of approximately 240 Hz.
- Other frame rates are also contemplated.
- the particular readout timing of rows and/or columns of the FPA array may remain consistent, regardless of the actual FPA array size or frame rate.
- the readout timing may be approximately 63 microseconds per row or column.
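These figures can be sanity-checked with a short calculation, assuming sequential per-row readout at approximately 63 microseconds per row and no inter-frame overhead (an assumption; actual timing may include additional overhead):

```python
def approx_frame_rate_hz(rows, row_time_s=63e-6):
    """Approximate frame rate for sequential per-row readout at a fixed
    ~63 microseconds per row; ignores inter-frame overhead (assumed)."""
    return 1.0 / (rows * row_time_s)

print(round(approx_frame_rate_hz(120)))  # ~132 Hz for 160 by 120 (~120 Hz class)
print(round(approx_frame_rate_hz(60)))   # ~265 Hz for 80 by 60 (~240 Hz class)
```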
- the image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 (e.g., processed image frames) with a lower frame rate (e.g., approximately 30 Hz, approximately 60 Hz, or other frame rates) and with an improved signal to noise ratio.
- image noise attributable to low voltage operation may be effectively averaged out and/or substantially reduced in image frames 802.
- infrared sensor assembly 128 may be operated at relatively low voltages provided by LDO 1220 as discussed without experiencing additional noise and related side effects in the resulting image frames 802 after processing by frame averager 804.
- multiple arrays of infrared sensors 132 may be used together to provide higher resolution image frames (e.g., a scene may be imaged across multiple such arrays).
- Such arrays may be provided in multiple infrared sensor assemblies 128 and/or provided in the same infrared sensor assembly 128. Each such array may be operated at low voltages as described, and also may be provided with associated ROIC circuitry such that each array may still be operated at a relatively high frame rate.
- the high frame rate image frames provided by such arrays may be averaged by shared or dedicated frame averagers 804 to reduce and/or eliminate noise associated with low voltage operation. As a result, high resolution infrared images may be obtained while still operating at low voltages.
- infrared sensor assembly 128 may be implemented with appropriate dimensions to permit infrared imaging module 100 to be used with a small form factor socket 104, such as a socket used for mobile devices.
- infrared sensor assembly 128 may be implemented with a chip size in a range of approximately 4.0 mm by approximately 4.0 mm to approximately 5.5 mm by approximately 5.5 mm (e.g., approximately 4.0 mm by approximately 5.5 mm in one example).
- Infrared sensor assembly 128 may be implemented with such sizes or other appropriate sizes to permit use with socket 104 implemented with various sizes such as: 8.5 mm by 8.5 mm, 8.5 mm by 5.9 mm, 6.0 mm by 6.0 mm, 5.5 mm by 5.5 mm, 4.5 mm by 4.5 mm, and/or other socket sizes such as, for example, those identified in Table 1 of U.S. Provisional Patent Application No. 61/495,873 previously referenced herein.
- various embodiments provided by the present disclosure can be implemented using hardware, software, or combinations of hardware and software. Also where applicable, the various hardware components and/or software components set forth herein can be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein can be separated into sub-components comprising software, hardware, or both without departing from the spirit of the present disclosure. In addition, where applicable, it is contemplated that software components can be implemented as hardware components, and vice-versa.
- Non-transitory instructions, program code, and/or data can be stored on one or more non-transitory machine readable mediums. It is also contemplated that software identified herein can be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein can be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Studio Devices (AREA)
- Photometry And Measurement Of Optical Pulse Characteristics (AREA)
- Solid State Image Pick-Up Elements (AREA)
Abstract
Various techniques are provided for implementing an infrared imaging system. In one example, a system includes a focal plane array (FPA). The FPA includes an array of infrared sensors adapted to image a scene. The FPA also includes a bias circuit adapted to provide a bias voltage to the infrared sensors. The bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts. The FPA also includes a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames. Other implementations are also provided.
Description
LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application No. 61/656,889 filed June 7, 2012 and entitled "LOW POWER AND SMALL FORM
FACTOR INFRARED IMAGING" which is hereby incorporated by reference in its entirety.
This application also claims the benefit of U.S. Provisional Patent Application No. 61/545,056 filed October 7, 2011 and entitled "NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES" which is hereby incorporated by reference in its entirety.
This application also claims the benefit of U.S. Provisional Patent Application No. 61/495,873 filed June 10, 2011 and entitled "INFRARED CAMERA PACKAGING SYSTEMS AND METHODS" which is hereby incorporated by reference in its entirety.
This application also claims the benefit of U.S. Provisional Patent Application No.
61/495,879 filed June 10, 2011 and entitled "INFRARED CAMERA SYSTEM
ARCHITECTURES" which is hereby incorporated by reference in its entirety.
This application also claims the benefit of U.S. Provisional Patent Application No. 61/495,888 filed June 10, 2011 and entitled "INFRARED CAMERA CALIBRATION TECHNIQUES" which is hereby incorporated by reference in its entirety.
TECHNICAL FIELD
One or more embodiments of the invention relate generally to thermal imaging devices and more particularly, for example, to the implementation and operation of such devices in a manner appropriate for low power and small form factor applications.
BACKGROUND
Infrared imaging devices, such as infrared cameras or other devices, are typically implemented with an array of infrared sensors. However, existing infrared sensors and related circuitry are typically sensitive to noise and other phenomena. Because of such sensitivity, infrared sensors and related circuitry are often implemented with power supply arrangements wherein multiple voltage supply paths provide different voltages to various
circuit components. However, such implementations are typically complex and may be relatively inefficient.
In addition, during operation at conventional voltages and currents, infrared imaging devices may experience significant self-heating which may cause various undesirable thermal effects. Nevertheless, conventional techniques to reduce such effects are also less than ideal and may rely on active cooling or other measures that increase the cost and complexity of infrared imaging devices.
SUMMARY
Various techniques are provided for implementing and operating infrared imaging devices, especially for low power and small form factor applications. In one embodiment, a system includes a focal plane array (FPA) comprising: an array of infrared sensors adapted to image a scene; a bias circuit adapted to provide a bias voltage to the infrared sensors, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts; and a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames.
In another embodiment, a system includes a focal plane array (FPA) comprising: an array of infrared sensors adapted to image a scene, wherein a size of the array of infrared sensors is less than approximately 160 by 120; and a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames, wherein the ROIC is adapted to provide the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
In another embodiment, a method includes providing a bias voltage from a bias circuit of a focal plane array (FPA) to an array of infrared sensors of the FPA, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts; imaging the scene using the infrared sensors; and providing signals from the infrared sensors corresponding to captured image frames of the scene using a read out integrated circuit (ROIC) of the FPA.
In another embodiment, a method includes imaging a scene using an array of infrared sensors of a focal plane array (FPA), wherein a size of the array of infrared sensors is less than approximately 160 by 120; and providing signals from the infrared sensors corresponding to captured image frames using a read out integrated circuit (ROIC) of the
FPA, wherein the ROIC provides the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
In another embodiment, a system includes a focal plane array (FPA) comprising: a low-dropout regulator (LDO) integrated with the FPA and adapted to provide a regulated voltage in response to an external supply voltage; an array of infrared sensors adapted to image a scene; a bias circuit adapted to provide a bias voltage to the infrared sensors in response to the regulated voltage; and a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames.
In another embodiment, a method includes receiving an external supply voltage at a focal plane array (FPA); providing a regulated voltage in response to the external supply voltage using a low-dropout regulator (LDO) integrated with the FPA; providing a bias voltage to an array of infrared sensors of the FPA in response to the regulated voltage using a bias circuit of the FPA; imaging a scene using the infrared sensors; and providing signals from the infrared sensors corresponding to captured image frames using a read out integrated circuit (ROIC) of the FPA.
The scope of the invention is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more
embodiments. Reference will be made to the appended sheets of drawings that will first be described briefly.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates an infrared imaging module configured to be implemented in a host device in accordance with an embodiment of the disclosure.
Fig. 2 illustrates an assembled infrared imaging module in accordance with an embodiment of the disclosure.
Fig. 3 illustrates an exploded view of an infrared imaging module juxtaposed over a socket in accordance with an embodiment of the disclosure.
Fig. 4 illustrates a block diagram of an infrared sensor assembly including an array of infrared sensors in accordance with an embodiment of the disclosure.
Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure.
Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure.
Fig. 7 illustrates a flat field correction technique in accordance with an embodiment of the disclosure.
Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline in accordance with an embodiment of the disclosure.
Fig. 9 illustrates a temporal noise reduction process in accordance with an embodiment of the disclosure.
Fig. 10 illustrates particular implementation details of several processes of the image processing pipeline of Fig. 8 in accordance with an embodiment of the disclosure.
Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure.
Fig. 12 illustrates a block diagram of another implementation of an infrared sensor assembly including an array of infrared sensors and a low-dropout regulator in accordance with an embodiment of the disclosure.
Fig. 13 illustrates a circuit diagram of a portion of the infrared sensor assembly of Fig. 12 in accordance with an embodiment of the disclosure.
Embodiments of the invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
DETAILED DESCRIPTION
Fig. 1 illustrates an infrared imaging module 100 (e.g., an infrared camera or an infrared imaging device) configured to be implemented in a host device 102 in accordance with an embodiment of the disclosure. Infrared imaging module 100 may be implemented, for one or more embodiments, with a small form factor and in accordance with wafer level packaging techniques or other packaging techniques.
In one embodiment, infrared imaging module 100 may be configured to be implemented in a small portable host device 102, such as a mobile telephone, a tablet
computing device, a laptop computing device, a personal digital assistant, a visible light camera, a music player, or any other appropriate mobile device. In this regard, infrared imaging module 100 may be used to provide infrared imaging features to host device 102. For example, infrared imaging module 100 may be configured to capture, process, and/or otherwise manage infrared images and provide such infrared images to host device 102 for use in any desired fashion (e.g., for further processing, to store in memory, to display, to use by various applications running on host device 102, to export to other devices, or other uses).
In various embodiments, infrared imaging module 100 may be configured to operate at low voltage levels and over a wide temperature range. For example, in one embodiment, infrared imaging module 100 may operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or lower voltages, and operate over a temperature range of approximately -20 degrees C to approximately +60 degrees C (e.g., providing a suitable dynamic range and performance over an environmental temperature range of approximately 80 degrees C). In one embodiment, by operating infrared imaging module 100 at low voltage levels, infrared imaging module 100 may experience reduced amounts of self heating in comparison with other types of infrared imaging devices. As a result, infrared imaging module 100 may be operated with reduced measures to compensate for such self heating.
As shown in Fig. 1, host device 102 may include a socket 104, a shutter 105, motion sensors 194, a processor 195, a memory 196, a display 197, and/or other components 198. Socket 104 may be configured to receive infrared imaging module 100 as identified by arrow 101. In this regard, Fig. 2 illustrates infrared imaging module 100 assembled in socket 104 in accordance with an embodiment of the disclosure.
Motion sensors 194 may be implemented by one or more accelerometers, gyroscopes, or other appropriate devices that may be used to detect movement of host device 102. Motion sensors 194 may be monitored by and provide information to processing module 160 or processor 195 to detect motion. In various embodiments, motion sensors 194 may be implemented as part of host device 102 (as shown in Fig. 1), infrared imaging module 100, or other devices attached to or otherwise interfaced with host device
102.
Processor 195 may be implemented as any appropriate processing device (e.g., logic device, microcontroller, processor, application specific integrated circuit (ASIC), or other device) that may be used by host device 102 to execute appropriate instructions, such as software instructions provided in memory 196. Display 197 may be used to display captured and/or processed infrared images and/or other images, data, and information. Other components 198 may be used to implement any features of host device 102 as may be desired for various applications (e.g., clocks, temperature sensors, a visible light camera, or other components). In addition, a machine readable medium 193 may be provided for storing non-transitory instructions for loading into memory 196 and execution by processor 195.
In various embodiments, infrared imaging module 100 and socket 104 may be implemented for mass production to facilitate high volume applications, such as for implementation in mobile telephones or other devices (e.g., requiring small form factors). In one embodiment, the combination of infrared imaging module 100 and socket 104 may exhibit overall dimensions of approximately 8.5 mm by 8.5 mm by 5.9 mm while infrared imaging module 100 is installed in socket 104.
Fig. 3 illustrates an exploded view of infrared imaging module 100 juxtaposed over socket 104 in accordance with an embodiment of the disclosure. Infrared imaging module 100 may include a lens barrel 110, a housing 120, an infrared sensor assembly 128, a circuit board 170, a base 150, and a processing module 160.
Lens barrel 110 may at least partially enclose an optical element 180 (e.g., a lens) which is partially visible in Fig. 3 through an aperture 112 in lens barrel 110. Lens barrel 110 may include a substantially cylindrical extension 114 which may be used to interface lens barrel 110 with an aperture 122 in housing 120.
Infrared sensor assembly 128 may be implemented, for example, with a cap 130
(e.g., a lid) mounted on a substrate 140. Infrared sensor assembly 128 may include a plurality of infrared sensors 132 (e.g., infrared detectors) implemented in an array or other fashion on substrate 140 and covered by cap 130. For example, in one embodiment, infrared sensor assembly 128 may be implemented as a focal plane array (FPA). Such a focal plane array may be implemented, for example, as a vacuum package assembly (e.g., sealed by cap 130 and substrate 140). In one embodiment, infrared sensor assembly 128 may be implemented as a wafer level package (e.g., infrared sensor assembly 128 may be
singulated from a set of vacuum package assemblies provided on a wafer). In one embodiment, infrared sensor assembly 128 may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
Infrared sensors 132 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene including, for example, mid wave infrared wave bands
(MWIR), long wave infrared wave bands (LWIR), and/or other thermal imaging bands as may be desired in particular implementations. In one embodiment, infrared sensor assembly 128 may be provided in accordance with wafer level packaging techniques.
Infrared sensors 132 may be implemented, for example, as microbolometers or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels. In one embodiment, infrared sensors 132 may be
implemented as vanadium oxide (VOx) detectors with a 17 μm pixel pitch. In various embodiments, arrays of approximately 32 by 32 infrared sensors 132, approximately 64 by 64 infrared sensors 132, approximately 80 by 64 infrared sensors 132, or other array sizes may be used.
Substrate 140 may include various circuitry including, for example, a read out integrated circuit (ROIC) with dimensions less than approximately 5.5 mm by 5.5 mm in one embodiment. Substrate 140 may also include bond pads 142 that may be used to contact complementary connections positioned on inside surfaces of housing 120 when infrared imaging module 100 is assembled as shown in Figs. 5A, 5B, and 5C. In one embodiment, the ROIC may be implemented with low-dropout regulators (LDO) to perform voltage regulation to reduce power supply noise introduced to infrared sensor assembly 128 and thus provide an improved power supply rejection ratio (PSRR).
Moreover, by implementing the LDO with the ROIC (e.g., within a wafer level package), less die area may be consumed and fewer discrete die (or chips) are needed.
Fig. 4 illustrates a block diagram of infrared sensor assembly 128 including an array of infrared sensors 132 in accordance with an embodiment of the disclosure. In the illustrated embodiment, infrared sensors 132 are provided as part of a unit cell array of a ROIC 402. ROIC 402 includes bias generation and timing control circuitry 404, column amplifiers 405, a column multiplexer 406, a row multiplexer 408, and an output amplifier
410. Image frames (e.g., thermal images) captured by infrared sensors 132 may be provided by output amplifier 410 to processing module 160, processor 195, and/or any
other appropriate components to perform various processing techniques described herein. Although an 8 by 8 array is shown in Fig. 4, any desired array configuration may be used in other embodiments. Further descriptions of ROICs and infrared sensors (e.g.,
microbolometer circuits) may be found in U.S. Patent No. 6,028,309 issued February 22, 2000, which is incorporated herein by reference in its entirety.
Infrared sensor assembly 128 may capture images (e.g., image frames) and provide such images from its ROIC at various rates. Processing module 160 may be used to perform appropriate processing of captured infrared images and may be implemented in accordance with any appropriate architecture. In one embodiment, processing module 160 may be implemented as an ASIC. In this regard, such an ASIC may be configured to perform image processing with high performance and/or high efficiency. In another embodiment, processing module 160 may be implemented with a general purpose central processing unit (CPU) which may be configured to execute appropriate software instructions to perform image processing, coordinate and perform image processing with various image processing blocks, coordinate interfacing between processing module 160 and host device 102, and/or other operations. In yet another embodiment, processing module 160 may be implemented with a field programmable gate array (FPGA).
Processing module 160 may be implemented with other types of processing and/or logic circuits in other embodiments as would be understood by one skilled in the art.
In these and other embodiments, processing module 160 may also be implemented with other components where appropriate, such as, volatile memory, non-volatile memory, and/or one or more interfaces (e.g., infrared detector interfaces, inter-integrated circuit (I2C) interfaces, mobile industry processor interfaces (MIPI), joint test action group (JTAG) interfaces (e.g., IEEE 1149.1 standard test access port and boundary-scan architecture), and/or other interfaces).
In some embodiments, infrared imaging module 100 may further include one or more actuators 199 which may be used to adjust the focus of infrared image frames captured by infrared sensor assembly 128. For example, actuators 199 may be used to move optical element 180, infrared sensors 132, and/or other components relative to each other to selectively focus and defocus infrared image frames in accordance with techniques described herein. Actuators 199 may be implemented in accordance with any type of
motion-inducing apparatus or mechanism, and may be positioned at any location within or external to infrared imaging module 100 as appropriate for different applications.
When infrared imaging module 100 is assembled, housing 120 may substantially enclose infrared sensor assembly 128, base 150, and processing module 160. Housing 120 may facilitate connection of various components of infrared imaging module 100. For example, in one embodiment, housing 120 may provide electrical connections 126 to connect various components as further described.
Electrical connections 126 (e.g., conductive electrical paths, traces, or other types of connections) may be electrically connected with bond pads 142 when infrared imaging module 100 is assembled. In various embodiments, electrical connections 126 may be embedded in housing 120, provided on inside surfaces of housing 120, and/or otherwise provided by housing 120. Electrical connections 126 may terminate in connections 124 protruding from the bottom surface of housing 120 as shown in Fig. 3. Connections 124 may connect with circuit board 170 when infrared imaging module 100 is assembled (e.g., housing 120 may rest atop circuit board 170 in various embodiments). Processing module 160 may be electrically connected with circuit board 170 through appropriate electrical connections. As a result, infrared sensor assembly 128 may be electrically connected with processing module 160 through, for example, conductive electrical paths provided by: bond pads 142, complementary connections on inside surfaces of housing 120, electrical connections 126 of housing 120, connections 124, and circuit board 170. Advantageously, such an arrangement may be implemented without requiring wire bonds to be provided between infrared sensor assembly 128 and processing module 160.
In various embodiments, electrical connections 126 in housing 120 may be made from any desired material (e.g., copper or any other appropriate conductive material). In one embodiment, electrical connections 126 may aid in dissipating heat from infrared imaging module 100.
Other connections may be used in other embodiments. For example, in one embodiment, sensor assembly 128 may be attached to processing module 160 through a ceramic board that connects to sensor assembly 128 by wire bonds and to processing module 160 by a ball grid array (BGA). In another embodiment, sensor assembly 128 may be mounted directly on a rigid flexible board and electrically connected with wire bonds,
and processing module 160 may be mounted and connected to the rigid flexible board with wire bonds or a BGA.
The various implementations of infrared imaging module 100 and host device 102 set forth herein are provided for purposes of example, rather than limitation. In this regard, any of the various techniques described herein may be applied to any infrared camera system, infrared imager, or other device for performing infrared/thermal imaging.
Substrate 140 of infrared sensor assembly 128 may be mounted on base 150. In various embodiments, base 150 (e.g., a pedestal) may be made, for example, of copper formed by metal injection molding (MIM) and provided with a black oxide or nickel-coated finish. In various embodiments, base 150 may be made of any desired material, such as for example zinc, aluminum, or magnesium, as desired for a given application and may be formed by any desired applicable process, such as for example aluminum casting, MIM, or zinc rapid casting, as may be desired for particular applications. In various embodiments, base 150 may be implemented to provide structural support, various circuit paths, thermal heat sink properties, and other features where appropriate. In one embodiment, base 150 may be a multi-layer structure implemented at least in part using ceramic material.
In various embodiments, circuit board 170 may receive housing 120 and thus may physically support the various components of infrared imaging module 100. In various embodiments, circuit board 170 may be implemented as a printed circuit board (e.g., an FR4 circuit board or other types of circuit boards), a rigid or flexible interconnect (e.g., tape or other type of interconnects), a flexible circuit substrate, a flexible plastic substrate, or other appropriate structures. In various embodiments, base 150 may be implemented with the various features and attributes described for circuit board 170, and vice versa.
Socket 104 may include a cavity 106 configured to receive infrared imaging module
100 (e.g., as shown in the assembled view of Fig. 2). Infrared imaging module 100 and/or socket 104 may include appropriate tabs, arms, pins, fasteners, or any other appropriate engagement members which may be used to secure infrared imaging module 100 to or within socket 104 using friction, tension, adhesion, and/or any other appropriate manner. Socket 104 may include engagement members 107 that may engage surfaces 109 of housing 120 when infrared imaging module 100 is inserted into a cavity 106 of socket 104. Other types of engagement members may be used in other embodiments.
Infrared imaging module 100 may be electrically connected with socket 104 through appropriate electrical connections (e.g., contacts, pins, wires, or any other appropriate connections). For example, socket 104 may include electrical connections 108 which may contact corresponding electrical connections of infrared imaging module 100 (e.g., interconnect pads, contacts, or other electrical connections on side or bottom surfaces of circuit board 170, bond pads 142 or other electrical connections on base 150, or other connections). Electrical connections 108 may be made from any desired material (e.g., copper or any other appropriate conductive material). In one embodiment, electrical connections 108 may be mechanically biased to press against electrical connections of infrared imaging module 100 when infrared imaging module 100 is inserted into cavity 106 of socket 104. In one embodiment, electrical connections 108 may at least partially secure infrared imaging module 100 in socket 104. Other types of electrical connections may be used in other embodiments.
Socket 104 may be electrically connected with host device 102 through similar types of electrical connections. For example, in one embodiment, host device 102 may include electrical connections (e.g., soldered connections, snap-in connections, or other connections) that connect with electrical connections 108 passing through apertures 190. In various embodiments, such electrical connections may be made to the sides and/or bottom of socket 104.
Various components of infrared imaging module 100 may be implemented with flip chip technology which may be used to mount components directly to circuit boards without the additional clearances typically needed for wire bond connections. Flip chip
connections may be used, as an example, to reduce the overall size of infrared imaging module 100 for use in compact small form factor applications. For example, in one embodiment, processing module 160 may be mounted to circuit board 170 using flip chip connections. For example, infrared imaging module 100 may be implemented with such flip chip configurations.
In various embodiments, infrared imaging module 100 and/or associated
components may be implemented in accordance with various techniques (e.g., wafer level packaging techniques) as set forth in U.S. Patent Application No. 12/844,124 filed July 27,
2010, and U.S. Provisional Patent Application No. 61/469,651 filed March 30, 2011, which are incorporated herein by reference in their entirety. Furthermore, in accordance with one
or more embodiments, infrared imaging module 100 and/or associated components may be implemented, calibrated, tested, and/or used in accordance with various techniques, such as for example as set forth in U.S. Patent No. 7,470,902 issued December 30, 2008, U.S. Patent No. 6,028,309 issued February 22, 2000, U.S. Patent No. 6,812,465 issued
November 2, 2004, U.S. Patent No. 7,034,301 issued April 25, 2006, U.S. Patent No.
7,679,048 issued March 16, 2010, U.S. Patent No. 7,470,904 issued December 30, 2008, U.S. Patent Application No. 12/202,880 filed September 2, 2008, and U.S. Patent
Application No. 12/202,896 filed September 2, 2008, which are incorporated herein by reference in their entirety.
Referring again to Fig. 1, in various embodiments, host device 102 may include shutter 105. In this regard, shutter 105 may be selectively positioned over socket 104 (e.g., as identified by arrows 103) while infrared imaging module 100 is installed therein. In this regard, shutter 105 may be used, for example, to protect infrared imaging module 100 when not in use. Shutter 105 may also be used as a temperature reference as part of a calibration process (e.g., a NUC process or other calibration processes) for infrared imaging module 100 as would be understood by one skilled in the art.
In various embodiments, shutter 105 may be made from various materials such as, for example, polymers, glass, aluminum (e.g., painted or anodized) or other materials. In various embodiments, shutter 105 may include one or more coatings to selectively filter electromagnetic radiation and/or adjust various optical properties of shutter 105 (e.g., a uniform blackbody coating or a reflective gold coating).
In another embodiment, shutter 105 may be fixed in place to protect infrared imaging module 100 at all times. In this case, shutter 105 or a portion of shutter 105 may be made from appropriate materials (e.g., polymers or infrared transmitting materials such as silicon, germanium, zinc selenide, or chalcogenide glasses) that do not substantially filter desired infrared wavelengths. In another embodiment, a shutter may be implemented as part of infrared imaging module 100 (e.g., within or as part of a lens barrel or other components of infrared imaging module 100), as would be understood by one skilled in the art.
Alternatively, in another embodiment, a shutter (e.g., shutter 105 or other type of external or internal shutter) need not be provided, but rather a NUC process or other type of calibration may be performed using shutterless techniques. In another embodiment, a NUC
process or other type of calibration using shutterless techniques may be performed in combination with shutter-based techniques.
Infrared imaging module 100 and host device 102 may be implemented in accordance with any of the various techniques set forth in U.S. Provisional Patent
Application No. 61/495,873 filed June 10, 2011, U.S. Provisional Patent Application No. 61/495,879 filed June 10, 2011, and U.S. Provisional Patent Application No. 61/495,888 filed June 10, 2011, which are incorporated herein by reference in their entirety.
In various embodiments, the components of host device 102 and/or infrared imaging module 100 may be implemented as a local or distributed system with components in communication with each other over wired and/or wireless networks. Accordingly, the various operations identified in this disclosure may be performed by local and/or remote components as may be desired in particular implementations.
Fig. 5 illustrates a flow diagram of various operations to determine NUC terms in accordance with an embodiment of the disclosure. In some embodiments, the operations of Fig. 5 may be performed by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
In block 505, infrared sensors 132 begin capturing image frames of a scene.
Typically, the scene will be the real world environment in which host device 102 is currently located. In this regard, shutter 105 (if optionally provided) may be opened to permit infrared imaging module 100 to receive infrared radiation from the scene. Infrared sensors 132 may continue capturing image frames during all operations shown in Fig. 5. In this regard, the continuously captured image frames may be used for various operations as further discussed. In one embodiment, the captured image frames may be temporally filtered (e.g., in accordance with the process of block 826 further described herein with regard to Fig. 8) and adjusted by other terms (e.g., factory gain terms 812, factory offset terms 816, previously determined NUC terms 817, column FPN terms 820, and row FPN terms 824 as further described herein with regard to Fig. 8) before they are used in the operations shown in Fig. 5.
In block 510, a NUC process initiating event is detected. In one embodiment, the NUC process may be initiated in response to physical movement of host device 102. Such movement may be detected, for example, by motion sensors 194 which may be polled by a processor. In one example, a user may move host device 102 in a particular manner, such
as by intentionally waving host device 102 back and forth in an "erase" or "swipe" movement. In this regard, the user may move host device 102 in accordance with a predetermined speed and direction (velocity), such as in an up and down, side to side, or other pattern to initiate the NUC process. In this example, the use of such movements may permit the user to intuitively operate host device 102 to simulate the "erasing" of noise in captured image frames.
In another example, a NUC process may be initiated by host device 102 if motion exceeding a threshold value is detected (e.g., motion greater than expected for ordinary use). It is contemplated that any desired type of spatial translation of host device 102 may be used to initiate the NUC process.
In yet another example, a NUC process may be initiated by host device 102 if a minimum time has elapsed since a previously performed NUC process. In a further example, a NUC process may be initiated by host device 102 if infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process. In a still further example, a NUC process may be continuously initiated and repeated.
In block 515, after a NUC process initiating event is detected, it is determined whether the NUC process should actually be performed. In this regard, the NUC process may be selectively initiated based on whether one or more additional conditions are met. For example, in one embodiment, the NUC process may not be performed unless a minimum time has elapsed since a previously performed NUC process. In another embodiment, the NUC process may not be performed unless infrared imaging module 100 has experienced a minimum temperature change since a previously performed NUC process. Other criteria or conditions may be used in other embodiments. If appropriate criteria or conditions have been met, then the flow diagram continues to block 520.
Otherwise, the flow diagram returns to block 505.
In the NUC process, blurred image frames may be used to determine NUC terms which may be applied to captured image frames to correct for FPN. As discussed, in one embodiment, the blurred image frames may be obtained by accumulating multiple image frames of a moving scene (e.g., captured while the scene and/or the thermal imager is in motion). In another embodiment, the blurred image frames may be obtained by defocusing an optical element or other component of the thermal imager.
Accordingly, in block 520 a choice of either approach is provided. If the motion- based approach is used, then the flow diagram continues to block 525. If the defocus-based approach is used, then the flow diagram continues to block 530.
Referring now to the motion-based approach, in block 525 motion is detected. For example, in one embodiment, motion may be detected based on the image frames captured by infrared sensors 132. In this regard, an appropriate motion detection process (e.g., an image registration process, a frame-to-frame difference calculation, or other appropriate process) may be applied to captured image frames to determine whether motion is present (e.g., whether static or moving image frames have been captured). For example, in one embodiment, it can be determined whether pixels or regions around the pixels of consecutive image frames have changed more than a user defined amount (e.g., a percentage and/or threshold value). If at least a given percentage of pixels have changed by at least the user defined amount, then motion will be detected with sufficient certainty to proceed to block 535.
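By way of illustration, the frame-to-frame difference test described above might be sketched as follows in Python/NumPy. This is a minimal sketch under stated assumptions, not the patent's implementation; the function name and the threshold values are hypothetical.

```python
import numpy as np

def motion_detected(frame_a, frame_b, pixel_thresh=20, frac_thresh=0.05):
    """Frame-to-frame difference test for motion (block 525).

    pixel_thresh: per-pixel change (in counts) treated as significant.
    frac_thresh: fraction of pixels that must change to declare motion.
    Both values are illustrative, not taken from the patent.
    """
    diff = np.abs(frame_a.astype(np.int32) - frame_b.astype(np.int32))
    return (diff >= pixel_thresh).mean() >= frac_thresh
```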
In another embodiment, motion may be determined on a per pixel basis, wherein only pixels that exhibit significant changes are accumulated to provide the blurred image frame. For example, counters may be provided for each pixel and used to ensure that the same number of pixel values are accumulated for each pixel, or used to average the pixel values based on the number of pixel values actually accumulated for each pixel. Other types of image-based motion detection may be performed such as performing a Radon transform.
In another embodiment, motion may be detected based on data provided by motion sensors 194. In one embodiment, such motion detection may include detecting whether host device 102 is moving along a relatively straight trajectory through space. For example, if host device 102 is moving along a relatively straight trajectory, then it is possible that certain objects appearing in the imaged scene may not be sufficiently blurred (e.g., objects in the scene that may be aligned with or moving substantially parallel to the straight trajectory). Thus, in such an embodiment, the motion detected by motion sensors 194 may be conditioned on host device 102 exhibiting, or not exhibiting, particular trajectories.
In yet another embodiment, both a motion detection process and motion sensors 194 may be used. Thus, using any of these various embodiments, a determination can be
made as to whether or not each image frame was captured while at least a portion of the scene and host device 102 were in motion relative to each other (e.g., which may be caused by host device 102 moving relative to the scene, at least a portion of the scene moving relative to host device 102, or both).
It is expected that the image frames for which motion was detected may exhibit some secondary blurring of the captured scene (e.g., blurred thermal image data associated with the scene) due to the thermal time constants of infrared sensors 132 (e.g.,
microbolometer thermal time constants) interacting with the scene movement.
In block 535, image frames for which motion was detected are accumulated. For example, if motion is detected for a continuous series of image frames, then the image frames of the series may be accumulated. As another example, if motion is detected for only some image frames, then the non-moving image frames may be skipped and not included in the accumulation. Thus, a continuous or discontinuous set of image frames may be selected to be accumulated based on the detected motion.
In block 540, the accumulated image frames are averaged to provide a blurred image frame. Because the accumulated image frames were captured during motion, it is expected that actual scene information will vary between the image frames and thus cause the scene information to be further blurred in the resulting blurred image frame (block 545).
In contrast, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain fixed over at least short periods of time and over at least limited changes in scene irradiance during motion. As a result, image frames captured in close proximity in time and space during motion will suffer from identical or at least very similar FPN. Thus, although scene information may change in consecutive image frames, the FPN will stay essentially constant. By averaging, multiple image frames captured during motion will blur the scene information, but will not blur the FPN. As a result, FPN will remain more clearly defined in the blurred image frame provided in block 545 than the scene information.
In one embodiment, 32 or more image frames are accumulated and averaged in blocks 535 and 540. However, any desired number of image frames may be used in other embodiments, but with generally decreasing correction accuracy as frame count is decreased.
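A minimal sketch of the accumulation and averaging of blocks 535 and 540 follows, assuming the frames arrive as NumPy arrays; the function name and the strict enforcement of the 32-frame minimum are illustrative choices.

```python
import numpy as np

def accumulate_blurred(frames, min_frames=32):
    """Accumulate and average moving frames (blocks 535 and 540).

    frames: sequence of 2-D arrays for which motion was detected;
    non-moving frames should already have been skipped.  Scene content
    varies across the frames and averages toward a blur, while FPN is
    common to every frame and survives the average (block 545).
    """
    frames = list(frames)
    if len(frames) < min_frames:
        raise ValueError("too few moving frames accumulated")
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for f in frames:
        acc += f
    return acc / len(frames)
```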
Referring now to the defocus-based approach, in block 530, a defocus operation may be performed to intentionally defocus the image frames captured by infrared sensors 132. For example, in one embodiment, one or more actuators 199 may be used to adjust, move, or otherwise translate optical element 180, infrared sensor assembly 128, and/or other components of infrared imaging module 100 to cause infrared sensors 132 to capture a blurred (e.g., unfocused) image frame of the scene. Other non-actuator based techniques are also contemplated for intentionally defocusing infrared image frames such as, for example, manual (e.g., user-initiated) defocusing.
Although the scene may appear blurred in the image frame, FPN (e.g., caused by one or more components of infrared imaging module 100) will remain unaffected by the defocusing operation. As a result, a blurred image frame of the scene will be provided (block 545) with FPN remaining more clearly defined in the blurred image than the scene information.
In the above discussion, the defocus-based approach has been described with regard to a single captured image frame. In another embodiment, the defocus-based approach may include accumulating multiple image frames while the infrared imaging module 100 has been defocused and averaging the defocused image frames to remove the effects of temporal noise and provide a blurred image frame in block 545.
Thus, it will be appreciated that a blurred image frame may be provided in block 545 by either the motion-based approach or the defocus-based approach. Because much of the scene information will be blurred by either motion, defocusing, or both, the blurred image frame may be effectively considered a low pass filtered version of the original captured image frames with respect to scene information.
In block 550, the blurred image frame is processed to determine updated row and column FPN terms (e.g., if row and column FPN terms have not been previously determined then the updated row and column FPN terms may be new row and column FPN terms in the first iteration of block 550). As used in this disclosure, the terms row and column may be used interchangeably depending on the orientation of infrared sensors 132 and/or other components of infrared imaging module 100.
In one embodiment, block 550 includes determining a spatial FPN correction term for each row of the blurred image frame (e.g., each row may have its own spatial FPN correction term), and also determining a spatial FPN correction term for each column of the
blurred image frame (e.g., each column may have its own spatial FPN correction term). Such processing may be used to reduce the spatial and slowly varying (1/f) row and column FPN inherent in thermal imagers caused by, for example, 1/f noise characteristics of amplifiers in ROIC 402 which may manifest as vertical and horizontal stripes in image frames.
Advantageously, by determining spatial row and column FPN terms using the blurred image frame, there will be a reduced risk of vertical and horizontal objects in the actual imaged scene being mistaken for row and column noise (e.g., real scene content will be blurred while FPN remains unblurred).
In one embodiment, row and column FPN terms may be determined by considering differences between neighboring pixels of the blurred image frame. For example, Fig. 6 illustrates differences between neighboring pixels in accordance with an embodiment of the disclosure. Specifically, in Fig. 6 a pixel 610 is compared to its 8 nearest horizontal neighbors: d0-d3 on one side and d4-d7 on the other side. Differences between the neighbor pixels can be averaged to obtain an estimate of the offset error of the illustrated group of pixels. An offset error may be calculated for each pixel in a row or column and the average result may be used to correct the entire row or column.
To prevent real scene data from being interpreted as noise, upper and lower threshold values may be used (thPix and -thPix). Pixel values falling outside these threshold values (pixels d1 and d4 in this example) are not used to obtain the offset error. In addition, the maximum amount of row and column FPN correction may be limited by these threshold values.
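The neighbor-difference estimate of Fig. 6, including the thPix thresholds, might be sketched as follows for column terms. The identifiers, the threshold values, and the edge handling are assumptions made for illustration only.

```python
import numpy as np

def column_fpn_terms(blurred, th_pix=50.0, max_corr=100.0):
    """Estimate per-column offset terms from the blurred frame (block 550).

    Each pixel is compared to its 8 nearest horizontal neighbors
    (d0-d3 on one side, d4-d7 on the other); differences larger than
    th_pix are treated as scene content and excluded, the surviving
    differences are averaged down each column, and the resulting
    correction is clamped to +/-max_corr.  th_pix and max_corr are
    illustrative values; the wrap-around at frame edges introduced by
    np.roll is ignored for brevity.  Row terms follow by symmetry.
    """
    b = blurred.astype(np.float64)
    terms = np.zeros(b.shape[1])
    for shift in (-4, -3, -2, -1, 1, 2, 3, 4):
        d = np.roll(b, shift, axis=1) - b
        d[np.abs(d) > th_pix] = 0.0          # reject likely scene edges
        terms += d.mean(axis=0) / 8.0
    return np.clip(terms, -max_corr, max_corr)
```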
Further techniques for performing spatial row and column FPN correction processing are set forth in U.S. Patent Application No. 12/396,340 filed March 2, 2009 which is incorporated herein by reference in its entirety.
Referring again to Fig. 5, the updated row and column FPN terms determined in block 550 are stored (block 552) and applied (block 555) to the blurred image frame provided in block 545. After these terms are applied, some of the spatial row and column FPN in the blurred image frame may be reduced. However, because such terms are applied generally to rows and columns, additional FPN may remain such as spatially uncorrelated
FPN associated with pixel to pixel drift or other causes. Neighborhoods of spatially correlated FPN may also remain which may not be directly associated with individual rows
and columns. Accordingly, further processing may be performed as discussed below to determine NUC terms.
In block 560, local contrast values (e.g., edges or absolute values of gradients between adjacent or small groups of pixels) in the blurred image frame are determined. If scene information in the blurred image frame includes contrasting areas that have not been significantly blurred (e.g., high contrast edges in the original scene data), then such features may be identified by a contrast determination process in block 560.
For example, local contrast values in the blurred image frame may be calculated, or any other desired type of edge detection process may be applied to identify certain pixels in the blurred image as being part of an area of local contrast. Pixels that are marked in this manner may be considered as containing excessive high spatial frequency scene information that would be interpreted as FPN (e.g., such regions may correspond to portions of the scene that have not been sufficiently blurred). As such, these pixels may be excluded from being used in the further determination of NUC terms. In one embodiment, such contrast detection processing may rely on a threshold that is higher than the expected contrast value associated with FPN (e.g., pixels exhibiting a contrast value higher than the threshold may be considered to be scene information, and those lower than the threshold may be considered to be exhibiting FPN).
In one embodiment, the contrast determination of block 560 may be performed on the blurred image frame after row and column FPN terms have been applied to the blurred image frame (e.g., as shown in Fig. 5). In another embodiment, block 560 may be performed prior to block 550 to determine contrast before row and column FPN terms are determined (e.g., to prevent scene based contrast from contributing to the determination of such terms).
Following block 560, it is expected that any high spatial frequency content remaining in the blurred image frame may be generally attributed to spatially uncorrelated FPN. In this regard, following block 560, much of the other noise or actual desired scene based information has been removed or excluded from the blurred image frame due to: intentional blurring of the image frame (e.g., by motion or defocusing in blocks 520 through 545), application of row and column FPN terms (block 555), and contrast determination (block 560).
Thus, it can be expected that following block 560, any remaining high spatial frequency content (e.g., exhibited as areas of contrast or differences in the blurred image frame) may be attributed to spatially uncorrelated FPN. Accordingly, in block 565, the blurred image frame is high pass filtered. In one embodiment, this may include applying a high pass filter to extract the high spatial frequency content from the blurred image frame. In another embodiment, this may include applying a low pass filter to the blurred image frame and taking a difference between the low pass filtered image frame and the unfiltered blurred image frame to obtain the high spatial frequency content. In accordance with various embodiments of the present disclosure, a high pass filter may be implemented by calculating a mean difference between a sensor signal (e.g., a pixel value) and its neighbors.
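For example, the second option above (low pass and subtract) might be sketched as follows; the kernel size passed to uniform_filter is an illustrative choice.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def high_pass(blurred, size=3):
    """High pass filter the blurred frame (block 565).

    Implemented as the difference between the frame and a low pass
    (local mean) version of itself, one of the two options described
    above.  size is an illustrative neighborhood dimension.
    """
    b = blurred.astype(np.float64)
    return b - uniform_filter(b, size=size)
```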
In block 570, a flat field correction process is performed on the high pass filtered blurred image frame to determine updated NUC terms (e.g., if a NUC process has not previously been performed then the updated NUC terms may be new NUC terms in the first iteration of block 570).
For example, Fig. 7 illustrates a flat field correction technique 700 in accordance with an embodiment of the disclosure. In Fig. 7, a NUC term may be determined for each pixel 710 of the blurred image frame using the values of its neighboring pixels 712 to 726. For each pixel 710, several gradients may be determined based on the absolute difference between the values of various adjacent pixels. For example, absolute value differences may be determined between: pixels 712 and 714 (a left to right diagonal gradient), pixels 716 and 718 (a top to bottom vertical gradient), pixels 720 and 722 (a right to left diagonal gradient), and pixels 724 and 726 (a left to right horizontal gradient).
These absolute differences may be summed to provide a summed gradient for pixel 710. A weight value may be determined for pixel 710 that is inversely proportional to the summed gradient. This process may be performed for all pixels 710 of the blurred image frame until a weight value is provided for each pixel 710. For areas with low gradients (e.g., areas that are blurry or have low contrast), the weight value will be close to one. Conversely, for areas with high gradients, the weight value will be zero or close to zero. The update to the NUC term as estimated by the high pass filter is multiplied with the weight value.
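A sketch of this weighting follows. The exact inverse-proportional curve is not specified above, so the 1/(1 + scale·g) form and the scale constant are assumptions standing in for whatever mapping a particular embodiment uses.

```python
import numpy as np

def weighted_nuc_update(hp, scale=0.1):
    """Weight the high pass NUC estimate by local flatness (Fig. 7 / block 570).

    Four gradients are formed from opposing neighbor pairs (two
    diagonals, one vertical, one horizontal) and summed; each pixel's
    weight falls from roughly one in flat (well blurred) areas toward
    zero on strong gradients, so edges contribute little to the update.
    """
    h = hp.astype(np.float64)

    def grad(dy, dx):
        # absolute difference between the two neighbors opposite the pixel
        return np.abs(np.roll(h, (dy, dx), axis=(0, 1)) -
                      np.roll(h, (-dy, -dx), axis=(0, 1)))

    summed = grad(1, 1) + grad(1, 0) + grad(1, -1) + grad(0, 1)
    weight = 1.0 / (1.0 + scale * summed)   # ~1 when flat, ~0 on edges
    return weight * h                       # weighted NUC update terms
```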
In one embodiment, the risk of introducing scene information into the NUC terms can be further reduced by applying some amount of temporal damping to the NUC term determination process. For example, a temporal damping factor λ between 0 and 1 may be chosen such that the new NUC term (NUCNEW) stored is a weighted average of the old NUC term (NUCOLD) and the estimated updated NUC term (NUCUPDATE). In one embodiment, this can be expressed as NUCNEW = λ·NUCOLD + (1−λ)·(NUCOLD + NUCUPDATE).
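In code, the damped update might look like the following; the value chosen for the damping factor is illustrative.

```python
def damped_nuc(nuc_old, nuc_update, lam=0.9):
    """Temporally damped NUC update; lam is an illustrative value.

    Algebraically this reduces to nuc_old + (1 - lam) * nuc_update,
    so a lam close to 1 admits only a small fraction of each new
    estimate and resists scene leakage into the NUC terms.
    """
    return lam * nuc_old + (1.0 - lam) * (nuc_old + nuc_update)
```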
Although the determination of NUC terms has been described with regard to gradients, local contrast values may be used instead where appropriate. Other techniques may also be used such as, for example, standard deviation calculations. Other types of flat field correction processes may be performed to determine NUC terms including, for example, various processes identified in U.S. Patent No. 6,028,309 issued February 22, 2000, U.S. Patent No. 6,812,465 issued November 2, 2004, and U.S. Patent Application No. 12/114,865 filed May 5, 2008, which are incorporated herein by reference in their entirety.
Referring again to Fig. 5, block 570 may include additional processing of the NUC terms. For example, in one embodiment, to preserve the scene signal mean, the sum of all NUC terms may be normalized to zero by subtracting the NUC term mean from each NUC term. Also in block 570, to avoid row and column noise from affecting the NUC terms, the mean value of each row and column may be subtracted from the NUC terms for each row and column. As a result, row and column FPN filters using the row and column FPN terms determined in block 550 may be better able to filter out row and column noise in further iterations (e.g., as further shown in Fig. 8) after the NUC terms are applied to captured images (e.g., in block 580 further discussed herein). In this regard, the row and column FPN filters may in general use more data to calculate the per row and per column offset coefficients (e.g., row and column FPN terms) and may thus provide a more robust alternative for reducing spatially correlated FPN than the NUC terms which are based on high pass filtering to capture spatially uncorrelated noise.
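The normalization steps just described might be sketched as follows; the function name is hypothetical.

```python
import numpy as np

def normalize_nuc(nuc):
    """Post-process NUC terms as described for block 570 (a sketch).

    The global mean is subtracted to preserve the scene signal mean,
    then per-row and per-column means are removed so that row and
    column noise is left for the dedicated row/column FPN filters.
    """
    nuc = nuc - nuc.mean()                        # zero-mean overall
    nuc = nuc - nuc.mean(axis=1, keepdims=True)   # remove row means
    nuc = nuc - nuc.mean(axis=0, keepdims=True)   # remove column means
    return nuc
```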
In blocks 571-573, additional high pass filtering and further determinations of updated NUC terms may be optionally performed to remove spatially correlated FPN with lower spatial frequency than previously removed by row and column FPN terms. In this regard, some variability in infrared sensors 132 or other components of infrared imaging
module 100 may result in spatially correlated FPN that cannot be easily modeled as row or column noise. Such spatially correlated FPN may include, for example, window defects on a sensor package or a cluster of infrared sensors 132 that respond differently to irradiance than neighboring infrared sensors 132. In one embodiment, such spatially correlated FPN may be mitigated with an offset correction. If the amount of such spatially correlated FPN is significant, then the noise may also be detectable in the blurred image frame. Since this type of noise may affect a neighborhood of pixels, a high pass filter with a small kernel may not detect the FPN in the neighborhood (e.g., all values used in the high pass filter may be taken from the neighborhood of affected pixels and thus may be affected by the same offset error). For example, if the high pass filtering of block 565 is performed with a small kernel (e.g., considering only immediately adjacent pixels that fall within a neighborhood of pixels affected by spatially correlated FPN), then broadly distributed spatially correlated FPN may not be detected.
For example, Fig. 11 illustrates spatially correlated FPN in a neighborhood of pixels in accordance with an embodiment of the disclosure. As shown in a sample image frame 1100, a neighborhood of pixels 1110 may exhibit spatially correlated FPN that is not precisely correlated to individual rows and columns and is distributed over a neighborhood of several pixels (e.g., a neighborhood of approximately 4 by 4 pixels in this example). Sample image frame 1100 also includes a set of pixels 1120 exhibiting substantially uniform response that are not used in filtering calculations, and a set of pixels 1130 that are used to estimate a low pass value for the neighborhood of pixels 1110. In one embodiment, pixels 1130 may be a number of pixels divisible by two in order to facilitate efficient hardware or software calculations.
Referring again to Fig. 5, in blocks 571-573, additional high pass filtering and further determinations of updated NUC terms may be optionally performed to remove spatially correlated FPN such as exhibited by pixels 1110. In block 571, the updated NUC terms determined in block 570 are applied to the blurred image frame. Thus, at this time, the blurred image frame will have been initially corrected for spatially correlated FPN (e.g., by application of the updated row and column FPN terms in block 555), and also initially corrected for spatially uncorrelated FPN (e.g., by application of the updated NUC terms applied in block 571).
In block 572, a further high pass filter is applied with a larger kernel than was used in block 565, and further updated NUC terms may be determined in block 573. For example, to detect the spatially correlated FPN present in pixels 1110, the high pass filter applied in block 572 may include data from a sufficiently large neighborhood of pixels such that differences can be determined between unaffected pixels (e.g., pixels 1120) and affected pixels (e.g., pixels 1110). For example, a low pass filter with a large kernel can be used (e.g., an N by N kernel that is much greater than 3 by 3 pixels) and the results may be subtracted to perform appropriate high pass filtering.
In one embodiment, for computational efficiency, a sparse kernel may be used such that only a small number of neighboring pixels inside an N by N neighborhood are used. For any given high pass filter operation using distant neighbors (e.g., a large kernel), there is a risk of modeling actual (potentially blurred) scene information as spatially correlated FPN. Accordingly, in one embodiment, the temporal damping factor λ may be set close to 1 for updated NUC terms determined in block 573.
In various embodiments, blocks 571-573 may be repeated (e.g., cascaded) to iteratively perform high pass filtering with increasing kernel sizes to provide further updated NUC terms to further correct for spatially correlated FPN of desired neighborhood sizes. In one embodiment, the decision to perform such iterations may be determined by whether spatially correlated FPN has actually been removed by the updated NUC terms of the previous performance of blocks 571-573.
After blocks 571-573 are finished, a decision is made regarding whether to apply the updated NUC terms to captured image frames (block 574). For example, if an average of the absolute value of the NUC terms for the entire image frame is less than a minimum threshold value, or greater than a maximum threshold value, the NUC terms may be deemed spurious or unlikely to provide meaningful correction. Alternatively, thresholding criteria may be applied to individual pixels to determine which pixels receive updated NUC terms. In one embodiment, the threshold values may correspond to differences between the newly calculated NUC terms and previously calculated NUC terms. In another
embodiment, the threshold values may be independent of previously calculated NUC terms. Other tests may be applied (e.g., spatial correlation tests) to determine whether the
NUC terms should be applied.
If the NUC terms are deemed spurious or unlikely to provide meaningful correction, then the flow diagram returns to block 505. Otherwise, the newly determined NUC terms are stored (block 575) to replace previous NUC terms (e.g., determined by a previously performed iteration of Fig. 5) and applied (block 580) to captured image frames.
Fig. 8 illustrates various image processing techniques of Fig. 5 and other operations applied in an image processing pipeline 800 in accordance with an embodiment of the disclosure. In this regard, pipeline 800 identifies various operations of Fig. 5 in the context of an overall iterative image processing scheme for correcting image frames provided by infrared imaging module 100. In some embodiments, pipeline 800 may be provided by processing module 160 or processor 195 (both also generally referred to as a processor) operating on image frames captured by infrared sensors 132.
Image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 with an improved signal to noise ratio. Frame averager 804 may be effectively provided by infrared sensors 132, ROIC 402, and other components of infrared sensor assembly 128 that are implemented to support high image capture rates. For example, in one
embodiment, infrared sensor assembly 128 may capture infrared image frames at a frame rate of 240 Hz (e.g., 240 images per second). In this embodiment, such a high frame rate may be implemented, for example, by operating infrared sensor assembly 128 at relatively low voltages (e.g., compatible with mobile telephone voltages) and by using a relatively small array of infrared sensors 132 (e.g., an array of 64 by 64 infrared sensors in one embodiment).
In one embodiment, such infrared image frames may be provided from infrared sensor assembly 128 to processing module 160 at a high frame rate (e.g., 240 Hz or other frame rates). In another embodiment, infrared sensor assembly 128 may integrate over longer time periods, or multiple time periods, to provide integrated (e.g., averaged) infrared image frames to processing module 160 at a lower frame rate (e.g., 30 Hz, 9 Hz, or other frame rates). Further information regarding implementations that may be used to provide high image capture rates may be found in U.S. Provisional Patent Application No.
61/495,879 previously referenced herein.
Image frames 802 proceed through pipeline 800 where they are adjusted by various terms, temporally filtered, used to determine the various adjustment terms, and gain compensated.
In blocks 810 and 814, factory gain terms 812 and factory offset terms 816 are applied to image frames 802 to compensate for gain and offset differences, respectively, between the various infrared sensors 132 and/or other components of infrared imaging module 100 determined during manufacturing and testing.
In block 580, NUC terms 817 are applied to image frames 802 to correct for FPN as discussed. In one embodiment, if NUC terms 817 have not yet been determined (e.g., before a NUC process has been initiated), then block 580 may not be performed or initialization values may be used for NUC terms 817 that result in no alteration to the image data (e.g., offsets for every pixel would be equal to zero).
In blocks 818 and 822, column FPN terms 820 and row FPN terms 824, respectively, are applied to image frames 802. Column FPN terms 820 and row FPN terms 824 may be determined in accordance with block 550 as discussed. In one embodiment, if the column FPN terms 820 and row FPN terms 824 have not yet been determined (e.g., before a NUC process has been initiated), then blocks 818 and 822 may not be performed or initialization values may be used for the column FPN terms 820 and row FPN terms 824 that result in no alteration to the image data (e.g., offsets for every pixel would be equal to zero).
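Taken together, blocks 810, 814, 580, 818, and 822 amount to applying the stored terms in sequence, as in the following sketch. The multiplicative form of the gain application is an assumption beyond the block ordering described above; the identifiers are illustrative.

```python
import numpy as np

def apply_correction_terms(frame, gain, offset, nuc, col_fpn, row_fpn):
    """Apply stored terms in pipeline order (blocks 810, 814, 580, 818, 822).

    gain, offset, and nuc are per-pixel arrays matching the frame;
    col_fpn holds one offset per column and row_fpn one per row.
    Identity values (gain of one, offsets of zero) leave the image
    data unaltered before the corresponding terms are determined.
    """
    f = frame.astype(np.float64) * gain + offset   # factory terms
    f = f + nuc                                    # NUC terms (block 580)
    f = f + col_fpn[np.newaxis, :]                 # column FPN terms
    f = f + row_fpn[:, np.newaxis]                 # row FPN terms
    return f
```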
In block 826, temporal filtering is performed on image frames 802 in accordance with a temporal noise reduction (TNR) process. Fig. 9 illustrates a TNR process in accordance with an embodiment of the disclosure. In Fig. 9, a presently received image frame 802a and a previously temporally filtered image frame 802b are processed to determine a new temporally filtered image frame 802e. Image frames 802a and 802b include local neighborhoods of pixels 803a and 803b centered around pixels 805a and 805b, respectively. Neighborhoods 803a and 803b correspond to the same locations within image frames 802a and 802b and are subsets of the total pixels in image frames 802a and 802b. In the illustrated embodiment, neighborhoods 803a and 803b include areas of 5 by 5 pixels. Other neighborhood sizes may be used in other embodiments.
Differences between corresponding pixels of neighborhoods 803a and 803b are determined and averaged to provide an averaged delta value 805c for the location
corresponding to pixels 805a and 805b. Averaged delta value 805c may be used to determine weight values in block 807 to be applied to pixels 805a and 805b of image frames 802a and 802b.
In one embodiment, as shown in graph 809, the weight values determined in block 807 may be inversely proportional to averaged delta value 805c such that weight values drop rapidly towards zero when there are large differences between neighborhoods 803a and 803b. In this regard, large differences between neighborhoods 803a and 803b may indicate that changes have occurred within the scene (e.g., due to motion) and pixels 805a and 805b may be appropriately weighted, in one embodiment, to avoid introducing blur across frame-to-frame scene changes. Other associations between weight values and averaged delta value 805c may be used in various embodiments.
The weight values determined in block 807 may be applied to pixels 805a and 805b to determine a value for corresponding pixel 805e of image frame 802e (block 811). In this regard, pixel 805e may have a value that is a weighted average (or other combination) of pixels 805a and 805b, depending on averaged delta value 805c and the weight values determined in block 807.
For example, pixel 805e of temporally filtered image frame 802e may be a weighted sum of pixels 805a and 805b of image frames 802a and 802b. If the average difference between pixels 805a and 805b is due to noise, then it may be expected that the average change between neighborhoods 803a and 803b will be close to zero (e.g., corresponding to the average of uncorrelated changes). Under such circumstances, it may be expected that the sum of the differences between neighborhoods 803a and 803b will be close to zero. In this case, pixels 805a and 805b may both be appropriately weighted so as to contribute to the value of pixel 805e.
However, if the sum of such differences is not zero (e.g., even differing from zero by a small amount in one embodiment), then the changes may be interpreted as being attributed to motion instead of noise. Thus, motion may be detected based on the average change exhibited by neighborhoods 803a and 803b. Under these circumstances, pixel 805a of image frame 802a may be weighted heavily, while pixel 805b of image frame 802b may be weighted lightly.
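The neighborhood-delta weighting of Fig. 9 might be sketched as follows. The particular weighting curve and the constant k are assumptions standing in for the behavior of graph 809.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def temporal_filter(frame_new, frame_prev, k=0.05):
    """Temporal noise reduction sketch (Fig. 9 / block 826).

    The differences between 5x5 neighborhoods of the current frame and
    the previously filtered frame are averaged per pixel.  Near-zero
    deltas (uncorrelated noise) give the history a large weight; large
    deltas (scene motion) shift the weight to the new frame to avoid
    introducing blur across frame-to-frame scene changes.
    """
    a = frame_new.astype(np.float64)
    b = frame_prev.astype(np.float64)
    delta = np.abs(uniform_filter(a - b, size=5))   # averaged delta 805c
    w_hist = 1.0 / (1.0 + k * delta)                # weight on history
    return w_hist * b + (1.0 - w_hist) * a          # pixel 805e values
```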
Other embodiments are also contemplated. For example, although averaged delta value 805c has been described as being determined based on neighborhoods 803a and 803b, in other embodiments averaged delta value 805c may be determined based on any desired criteria (e.g., based on individual pixels or other types of groups of pixels).
In the above embodiments, image frame 802a has been described as a presently received image frame and image frame 802b has been described as a previously temporally filtered image frame. In another embodiment, image frames 802a and 802b may be first and second image frames captured by infrared imaging module 100 that have not been temporally filtered.
Fig. 10 illustrates further implementation details in relation to the TNR process of block 826. As shown in Fig. 10, image frames 802a and 802b may be read into line buffers 1010a and 1010b, respectively, and image frame 802b (e.g., the previous image frame) may be stored in a frame buffer 1020 before being read into line buffer 1010b. In one embodiment, line buffers 1010a-b and frame buffer 1020 may be implemented by a block of random access memory (RAM) provided by any appropriate component of infrared imaging module 100 and/or host device 102.
Referring again to Fig. 8, image frame 802e may be passed to an automatic gain compensation block 828 for further processing to provide a result image frame 830 that may be used by host device 102 as desired.
Fig. 8 further illustrates various operations that may be performed to determine row and column FPN terms and NUC terms as discussed. In one embodiment, these operations may use image frames 802e as shown in Fig. 8. Because image frames 802e have already been temporally filtered, at least some temporal noise may be removed and thus will not inadvertently affect the determination of row and column FPN terms 824 and 820 and NUC terms 817. In another embodiment, non-temporally filtered image frames 802 may be used.
In Fig. 8, blocks 510, 515, and 520 of Fig. 5 are represented collectively.
As discussed, a NUC process may be selectively initiated and performed in response to various NUC process initiating events and based on various criteria or conditions. As also discussed, the NUC process may be performed in accordance with a motion-based approach (blocks 525, 535, and 540) or a defocus-based approach (block 530) to provide a blurred image frame (block 545). Fig. 8 further illustrates various additional blocks 550,
552, 555, 560, 565, 570, 571, 572, 573, and 575 previously discussed with regard to Fig. 5.
As shown in Fig. 8, row and column FPN terms 824 and 820 and NUC terms 817 may be determined and applied in an iterative fashion such that updated terms are determined using image frames 802 to which previous terms have already been applied. As a result, the overall process of Fig. 8 may repeatedly update and apply such terms to continuously reduce the noise in image frames 830 to be used by host device 102.
Referring again to Fig. 10, further implementation details are illustrated for various blocks of Figs. 5 and 8 in relation to pipeline 800. For example, blocks 525, 535, and 540 are shown as operating at the normal frame rate of image frames 802 received by pipeline 800. In the embodiment shown in Fig. 10, the determination made in block 525 is represented as a decision diamond used to determine whether a given image frame 802 has sufficiently changed such that it may be considered an image frame that will enhance the blur if added to other image frames and is therefore accumulated (block 535 is represented by an arrow in this embodiment) and averaged (block 540).
Also in Fig. 10, the determination of column FPN terms 820 (block 550) is shown as operating at an update rate that in this example is 1/32 of the sensor frame rate (e.g., normal frame rate) due to the averaging performed in block 540. Other update rates may be used in other embodiments. Although only column FPN terms 820 are identified in Fig. 10, row FPN terms 824 may be implemented in a similar fashion at the reduced frame rate.
Fig. 10 also illustrates further implementation details in relation to the NUC determination process of block 570. In this regard, the blurred image frame may be read to a line buffer 1030 (e.g., implemented by a block of RAM provided by any appropriate component of infrared imaging module 100 and/or host device 102). The flat field correction technique 700 of Fig. 7 may be performed on the blurred image frame.
In view of the present disclosure, it will be appreciated that techniques described herein may be used to remove various types of FPN (e.g., including very high amplitude
FPN) such as spatially correlated row and column FPN and spatially uncorrelated FPN.
Other embodiments are also contemplated. For example, in one embodiment, the rate at which row and column FPN terms and/or NUC terms are updated can be inversely proportional to the estimated amount of blur in the blurred image frame and/or inversely proportional to the magnitude of local contrast values (e.g., determined in block 560).
In various embodiments, the described techniques may provide advantages over conventional shutter-based noise correction techniques. For example, by using a
shutterless process, a shutter (e.g., shutter 105) need not be provided, thus permitting reductions in size, weight, cost, and mechanical complexity. Power and maximum voltage supplied to, or generated by, infrared imaging module 100 may also be reduced if a shutter does not need to be mechanically operated. Reliability will be improved by removing the shutter as a potential point of failure. A shutterless process also eliminates potential image interruption caused by the temporary blockage of the imaged scene by a shutter.
Also, by correcting for noise using intentionally blurred image frames captured from a real world scene (not a uniform scene provided by a shutter), noise correction may be performed on image frames that have irradiance levels similar to those of the actual scene desired to be imaged. This can improve the accuracy and effectiveness of noise correction terms determined in accordance with the various described techniques.
As discussed, in various embodiments, infrared imaging module 100 may be configured to operate at low voltage levels. In particular, infrared imaging module 100 may be implemented with circuitry configured to operate at low power and/or in accordance with other parameters that permit infrared imaging module 100 to be conveniently and effectively implemented in various types of host devices 102, such as mobile devices and other devices.
For example, Fig. 12 illustrates a block diagram of another implementation of infrared sensor assembly 128 including infrared sensors 132 and an LDO 1220 in accordance with an embodiment of the disclosure. As shown, Fig. 12 also illustrates various components 1202, 1204, 1205, 1206, 1208, and 1210 which may be implemented in the same or similar manner as corresponding components previously described with regard to Fig. 4. Fig. 12 also illustrates bias correction circuitry 1212 which may be used to adjust one or more bias voltages provided to infrared sensors 132 (e.g., to compensate for temperature changes, self-heating, and/or other factors).
In some embodiments, LDO 1220 may be provided as part of infrared sensor assembly 128 (e.g., on the same chip and/or wafer level package as the ROIC). For example, LDO 1220 may be provided as part of an FPA with infrared sensor assembly 128. As discussed, such implementations may reduce power supply noise introduced to infrared sensor assembly 128 and thus provide an improved PSRR. In addition, by implementing
the LDO with the ROIC, less die area may be consumed and fewer discrete die (or chips) are needed.
LDO 1220 receives an input voltage provided by a power source 1230 over a supply line 1232. LDO 1220 provides an output voltage to various components of infrared sensor assembly 128 over supply lines 1222. In this regard, LDO 1220 may provide substantially identical regulated output voltages to various components of infrared sensor assembly 128 in response to a single input voltage received from power source 1230.
For example, in some embodiments, power source 1230 may provide an input voltage in a range of approximately 2.8 volts to approximately 11 volts (e.g.,
approximately 2.8 volts in one embodiment), and LDO 1220 may provide an output voltage in a range of approximately 1.5 volts to approximately 2.8 volts (e.g.,
approximately 2.5 volts in one embodiment). In this regard, LDO 1220 may be used to provide a consistent regulated output voltage, regardless of whether power source 1230 is implemented with a conventional voltage range of approximately 9 volts to approximately 11 volts, or a low voltage such as approximately 2.8 volts. As such, although various voltage ranges are provided for the input and output voltages, it is contemplated that the output voltage of LDO 1220 will remain fixed despite changes in the input voltage.
The implementation of LDO 1220 as part of infrared sensor assembly 128 provides various advantages over conventional power implementations for FPAs. For example, conventional FPAs typically rely on multiple power sources, each of which may be provided separately to the FPA, and separately distributed to the various components of the FPA. By regulating a single power source 1230 by LDO 1220, appropriate voltages may be separately provided (e.g., to reduce possible noise) to all components of infrared sensor assembly 128 with reduced complexity. The use of LDO 1220 also allows infrared sensor assembly 128 to operate in a consistent manner, even if the input voltage from power source 1230 changes (e.g., if the input voltage increases or decreases as a result of charging or discharging a battery or other type of device used for power source 1230).
The various components of infrared sensor assembly 128 shown in Fig. 12 may also be implemented to operate at lower voltages than conventional devices. For example, as discussed, LDO 1220 may be implemented to provide a low voltage (e.g., approximately
2.5 volts). This contrasts with the multiple higher voltages typically used to power conventional FPAs, such as: approximately 3.3 volts to approximately 5 volts used to
power digital circuitry; approximately 3.3 volts used to power analog circuitry; and approximately 9 volts to approximately 11 volts used to power loads. Also, in some embodiments, the use of LDO 1220 may reduce or eliminate the need for a separate negative reference voltage to be provided to infrared sensor assembly 128.
Additional aspects of the low voltage operation of infrared sensor assembly 128 may be further understood with reference to Fig. 13. Fig. 13 illustrates a circuit diagram of a portion of infrared sensor assembly 128 of Fig. 12 in accordance with an embodiment of the disclosure. In particular, Fig. 13 illustrates additional components of bias correction circuitry 1212 (e.g., components 1326, 1330, 1332, 1334, 1336, 1338, and 1341) connected to LDO 1220 and infrared sensors 132. For example, bias correction circuitry 1212 may be used to compensate for temperature-dependent changes in bias voltages in accordance with an embodiment of the present disclosure. The operation of such additional components may be further understood with reference to similar components identified in U.S. Patent No. 7,679,048 issued March 16, 2010 which is hereby incorporated by reference in its entirety. Infrared sensor assembly 128 may also be implemented in accordance with the various components identified in U.S. Patent No. 6,812,465 issued November 2, 2004 which is hereby incorporated by reference in its entirety.
In various embodiments, some or all of the bias correction circuitry 1212 may be implemented on a global array basis as shown in Fig. 13 (e.g., used for all infrared sensors 132 collectively in an array). In other embodiments, some or all of the bias correction circuitry 1212 may be implemented on an individual sensor basis (e.g., entirely or partially duplicated for each infrared sensor 132). In some embodiments, bias correction circuitry 1212 and other components of Fig. 13 may be implemented as part of ROIC 1202.
As shown in Fig. 13, LDO 1220 provides a load voltage Vload to bias correction circuitry 1212 along one of supply lines 1222. As discussed, in some embodiments, Vload may be approximately 2.5 volts which contrasts with larger voltages of approximately 9 volts to approximately 11 volts that may be used as load voltages in conventional infrared imaging devices.
Based on Vload, bias correction circuitry 1212 provides a sensor bias voltage Vbolo at a node 1360. Vbolo may be distributed to one or more infrared sensors 132 through appropriate switching circuitry 1370 (e.g., represented by broken lines in Fig. 13). In some examples, switching circuitry 1370 may be implemented in accordance with appropriate
components identified in U.S. Patent Nos. 6,812,465 and 7,679,048 previously referenced herein.
Each infrared sensor 132 includes a node 1350 which receives Vbolo through switching circuitry 1370, and another node 1352 which may be connected to ground, a substrate, and/or a negative reference voltage. In some embodiments, the voltage at node 1360 may be substantially the same as Vbolo provided at nodes 1350. In other
embodiments, the voltage at node 1360 may be adjusted to compensate for possible voltage drops associated with switching circuitry 1370 and/or other factors.
Vbolo may be implemented with lower voltages than are typically used for conventional infrared sensor biasing. In one embodiment, Vbolo may be in a range of approximately 0.2 volts to approximately 0.7 volts. In another embodiment, Vbolo may be in a range of approximately 0.4 volts to approximately 0.6 volts. In another embodiment, Vbolo may be approximately 0.5 volts. In contrast, conventional infrared sensors typically use bias voltages of approximately 1 volt.
The use of a lower bias voltage for infrared sensors 132 in accordance with the present disclosure permits infrared sensor assembly 128 to exhibit significantly reduced power consumption in comparison with conventional infrared imaging devices. In particular, the power consumption of each infrared sensor 132 is reduced by the square of the bias voltage. As a result, a reduction from, for example, 1.0 volt to 0.5 volts provides a significant reduction in power, especially when applied to many infrared sensors 132 in an infrared sensor array. This reduction in power may also result in reduced self-heating of infrared sensor assembly 128.
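The quadratic scaling can be checked with simple arithmetic, as in the following illustrative snippet.

```python
# Per-sensor bias power scales with the square of the bias voltage
# (P = V^2 / R for a resistive microbolometer); numbers illustrative.
v_conventional, v_low = 1.0, 0.5                  # volts
ratio = (v_low / v_conventional) ** 2             # -> 0.25
print(f"power ratio: {ratio:.2f} (a 4x per-sensor reduction)")
```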
In accordance with additional embodiments of the present disclosure, various techniques are provided for reducing the effects of noise in image frames provided by infrared imaging devices operating at low voltages. In this regard, when infrared sensor assembly 128 is operated with low voltages as described, noise, self-heating, and/or other phenomena may, if uncorrected, become more pronounced in image frames provided by infrared sensor assembly 128.
For example, referring to Fig. 13, when LDO 1220 maintains Vload at a low voltage in the manner described herein, Vbolo will also be maintained at its corresponding low voltage and the relative size of its output signals may be reduced. As a result, noise, self-heating, and/or other phenomena may have a greater effect on the smaller output signals read out from infrared sensors 132, resulting in variations (e.g., errors) in the output signals. If uncorrected, these variations may be exhibited as noise in the image frames. Moreover, although low voltage operation may reduce the overall amount of certain phenomena (e.g., self-heating), the smaller output signals may permit the remaining error sources (e.g., residual self-heating) to have a disproportionate effect on the output signals during low voltage operation.
To compensate for such phenomena, infrared sensor assembly 128, infrared imaging module 100, and/or host device 102 may be implemented with various array sizes, frame rates, and/or frame averaging techniques. For example, as discussed, a variety of different array sizes are contemplated for infrared sensors 132. In some embodiments, infrared sensors 132 may be implemented with array sizes ranging from 32 by 32 to 160 by 120 infrared sensors 132. Other example array sizes include 80 by 64, 80 by 60, 64 by 64, and 64 by 32. Any desired array size may be used.
Advantageously, when implemented with such relatively small array sizes, infrared sensor assembly 128 may provide image frames at relatively high frame rates without requiring significant changes to ROIC and related circuitry. For example, in some embodiments, frame rates may range from approximately 120 Hz to approximately 480 Hz.
In some embodiments, the array size and the frame rate may be scaled relative to each other (e.g., in an inversely proportional manner or otherwise) such that larger arrays are implemented with lower frame rates, and smaller arrays are implemented with higher frame rates. For example, in one embodiment, an array of 160 by 120 may provide a frame rate of approximately 120 Hz. In another embodiment, an array of 80 by 60 may provide a correspondingly higher frame rate of approximately 240 Hz. Other frame rates are also contemplated.
By scaling the array size and the frame rate relative to each other, the particular readout timing of rows and/or columns of the FPA array may remain consistent, regardless of the actual FPA array size or frame rate. In one embodiment, the readout timing may be approximately 63 microseconds per row or column.
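Under the stated per-row timing, the example frame rates above can be reproduced approximately, as in this sketch (which ignores any inter-frame overhead).

```python
def approx_frame_rate(rows, row_time_s=63e-6):
    """Frame rate implied by row-sequential readout (a sketch).

    Assumes one row is read out per row_time_s with no inter-frame
    overhead, so real rates will be somewhat lower than these bounds.
    """
    return 1.0 / (rows * row_time_s)

print(round(approx_frame_rate(120)))   # ~132 -> approximately 120 Hz (160 by 120)
print(round(approx_frame_rate(60)))    # ~265 -> approximately 240 Hz (80 by 60)
```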
As previously discussed with regard to Fig. 8, the image frames captured by infrared sensors 132 may be provided to a frame averager 804 that integrates multiple image frames to provide image frames 802 (e.g., processed image frames) with a lower frame rate (e.g., approximately 30 Hz, approximately 60 Hz, or other frame rates) and with
an improved signal to noise ratio. In particular, by averaging the high frame rate image frames provided by a relatively small FPA array, image noise attributable to low voltage operation may be effectively averaged out and/or substantially reduced in image frames 802. Accordingly, infrared sensor assembly 128 may be operated at relatively low voltages provided by LDO 1220 as discussed without experiencing additional noise and related side effects in the resulting image frames 802 after processing by frame averager 804.
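A software stand-in for this averaging might look as follows; the group size of 8 corresponds to the 240 Hz to 30 Hz example, and the sqrt(n) figure assumes uncorrelated temporal noise.

```python
import numpy as np

def frame_average(frames, n=8):
    """Average groups of n frames (e.g., 240 Hz / 8 -> 30 Hz output).

    Uncorrelated temporal noise falls by roughly sqrt(n), so an
    8-frame average improves the signal to noise ratio by about 2.8x.
    A sketch of frame averager 804, which may instead be provided by
    the sensor assembly hardware as described above.
    """
    frames = np.asarray(frames, dtype=np.float64)
    usable = (len(frames) // n) * n               # drop any partial group
    return frames[:usable].reshape(-1, n, *frames.shape[1:]).mean(axis=1)
```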
Other embodiments are also contemplated. For example, although a single array of infrared sensors 132 is illustrated, it is contemplated that multiple such arrays may be used together to provide higher resolution image frames (e.g., a scene may be imaged across multiple such arrays). Such arrays may be provided in multiple infrared sensor assemblies 128 and/or provided in the same infrared sensor assembly 128. Each such array may be operated at low voltages as described, and also may be provided with associated ROIC circuitry such that each array may still be operated at a relatively high frame rate. The high frame rate image frames provided by such arrays may be averaged by shared or dedicated frame averagers 804 to reduce and/or eliminate noise associated with low voltage operation. As a result, high resolution infrared images may be obtained while still operating at low voltages.
In various embodiments, infrared sensor assembly 128 may be implemented with appropriate dimensions to permit infrared imaging module 100 to be used with a small form factor socket 104, such as a socket used for mobile devices. For example, in some embodiments, infrared sensor assembly 128 may be implemented with a chip size in a range of approximately 4.0 mm by approximately 4.0 mm to approximately 5.5 mm by approximately 5.5 mm (e.g., approximately 4.0 mm by approximately 5.5 mm in one example). Infrared sensor assembly 128 may be implemented with such sizes or other appropriate sizes to permit use with socket 104 implemented with various sizes such as: 8.5 mm by 8.5 mm, 8.5 mm by 5.9 mm, 6.0 mm by 6.0 mm, 5.5 mm by 5.5 mm, 4.5 mm by 4.5 mm, and/or other socket sizes such as, for example, those identified in Table 1 of U.S. Provisional Patent Application No. 61/495,873 previously referenced herein.
Where applicable, various embodiments provided by the present disclosure can be implemented using hardware, software, or combinations of hardware and software. Also where applicable, the various hardware components and/or software components set forth herein can be combined into composite components comprising software, hardware, and/or
both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein can be separated into sub-components comprising software, hardware, or both without departing from the spirit of the present disclosure. In addition, where applicable, it is contemplated that software components can be implemented as hardware components, and vice-versa.
Software in accordance with the present disclosure, such as non-transitory instructions, program code, and/or data, can be stored on one or more non-transitory machine readable mediums. It is also contemplated that software identified herein can be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein can be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
Embodiments described above illustrate but do not limit the invention. It should also be understood that numerous modifications and variations are possible in accordance with the principles of the invention. Accordingly, the scope of the invention is defined only by the following claims.
Claims
What is claimed is:
1. A system comprising:
a focal plane array (FPA) comprising:
an array of infrared sensors adapted to image a scene;
a bias circuit adapted to provide a bias voltage to the infrared sensors, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts; and
a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames.
2. The system of claim 1, wherein a size of the array of infrared sensors is less than approximately 160 by 120.
3. The system of claim 2, wherein the size is one of 80 by 64, 80 by 60, 64 by 64, 64 by 32, or 32 by 32.
4. The system of claim 1, wherein the ROIC is adapted to provide the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
5. The system of claim 4, wherein:
the frame rate is a first frame rate; and
the system further comprises a processor adapted to combine at least a subset of the captured image frames to provide processed image frames at a second frame rate selected from a range of approximately 30 Hz to approximately 60 Hz with reduced noise in comparison with the captured image frames.
6. The system of claim 1, wherein the bias circuit is adapted to provide the bias voltage to the infrared sensors in response to a regulated voltage selected from a range of approximately 1.5 volts to approximately 2.8 volts.
7. The system of claim 6, further comprising a low-dropout regulator (LDO) integrated with the FPA and adapted to provide the regulated voltage to the bias circuit in response to an external supply voltage selected from a range of approximately 2.8 volts to approximately 11 volts.
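For illustration only (not part of the claimed subject matter), the voltage ranges recited in claims 1, 6, and 7 can be captured as a small consistency check. The 0.2 V dropout margin is an assumption about LDO behavior and is not a claimed value:

```python
# Voltage ranges recited in claims 1, 6, and 7 (volts).
BIAS_RANGE = (0.2, 0.7)        # bias circuit -> infrared sensors
REGULATED_RANGE = (1.5, 2.8)   # LDO output -> bias circuit
SUPPLY_RANGE = (2.8, 11.0)     # external supply -> LDO input

def in_range(value, bounds):
    low, high = bounds
    return low <= value <= high

def valid_power_chain(supply_v, regulated_v, bias_v, dropout_v=0.2):
    """Check an operating point against the claimed ranges; the 0.2 V
    LDO dropout margin is an assumption, not a claimed value."""
    return (in_range(supply_v, SUPPLY_RANGE)
            and in_range(regulated_v, REGULATED_RANGE)
            and in_range(bias_v, BIAS_RANGE)
            and supply_v - regulated_v >= dropout_v)

print(valid_power_chain(2.8, 2.5, 0.4))  # True
print(valid_power_chain(2.8, 2.8, 0.4))  # False: no dropout headroom
```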
8. The system of claim 1, wherein the FPA is a first FPA, wherein the array of infrared sensors is a first array, the system further comprising a second FPA comprising a second array of infrared sensors adapted to image the scene.
9. The system of claim 1, wherein the system is an infrared imaging module adapted to be inserted into a socket having a size less than approximately 8.5 mm by 8.5 mm.
10. The system of claim 1, further comprising a processor adapted to process an intentionally blurred image frame, wherein the blurred image frame comprises blurred thermal image data associated with the scene and noise introduced by the system, wherein the processor is adapted to:
use the blurred image frame to determine a plurality of non-uniformity correction (NUC) terms to reduce at least a portion of the noise; and
apply the NUC terms to the captured image frames.
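Claim 10 recites determining NUC terms from an intentionally blurred frame and applying them to captured frames. Below is a minimal, offset-only sketch of that idea (for illustration only, not the claimed algorithm), assuming blurring suppresses scene detail so that residual high-frequency structure approximates fixed-pattern noise; the 5 by 5 box filter and the NumPy implementation are assumptions:

```python
import numpy as np

def compute_nuc_terms(blurred_frame, k=5):
    """Derive per-pixel offset NUC terms from an intentionally blurred frame.

    Blurring suppresses scene detail, so residual pixel-to-pixel variation
    is treated as fixed-pattern noise; offsets move each pixel toward a
    local k x k mean of the blurred frame.
    """
    blurred = np.asarray(blurred_frame, dtype=np.float64)
    pad = k // 2
    padded = np.pad(blurred, pad, mode="edge")
    local_mean = np.zeros_like(blurred)
    for dy in range(k):                      # k x k box filter
        for dx in range(k):
            local_mean += padded[dy:dy + blurred.shape[0],
                                 dx:dx + blurred.shape[1]]
    local_mean /= k * k
    return local_mean - blurred              # per-pixel offset terms

def apply_nuc(frame, nuc_terms):
    """Apply previously computed offset terms to a captured frame."""
    return np.asarray(frame, dtype=np.float64) + nuc_terms

# Flat (fully blurred) scene plus fixed-pattern noise; correction shrinks it.
rng = np.random.default_rng(1)
blurred = 250.0 + rng.normal(0.0, 3.0, size=(60, 80))
nuc = compute_nuc_terms(blurred)
print(np.std(blurred) > np.std(apply_nuc(blurred, nuc)))  # True
```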
11. A system comprising:
a focal plane array (FPA) comprising:
an array of infrared sensors adapted to image a scene, wherein a size of the array of infrared sensors is less than approximately 160 by 120; and
a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames, wherein the ROIC is adapted to provide the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
12. The system of claim 11, further comprising a bias circuit adapted to provide a bias voltage to the infrared sensors, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts.
13. The system of claim 12, wherein the bias circuit is adapted to provide the bias voltage to the infrared sensors in response to a regulated voltage selected from a range of approximately 1.5 volts to approximately 2.8 volts.
14. The system of claim 13, further comprising a low-dropout regulator (LDO) integrated with the FPA and adapted to provide the regulated voltage to the bias circuit in response to an external supply voltage selected from a range of approximately 2.8 volts to approximately 11 volts.
15. The system of claim 11, wherein the size is one of 80 by 64, 80 by 60, 64 by 64, 64 by 32, or 32 by 32.
16. The system of claim 11, wherein:
the frame rate is a first frame rate; and
the system further comprises a processor adapted to combine at least a subset of the captured image frames to provide processed image frames at a second frame rate selected from a range of approximately 30 Hz to approximately 60 Hz with reduced noise in comparison with the captured image frames.
17. The system of claim 11, wherein the FPA is a first FPA, wherein the array of infrared sensors is a first array, the system further comprising a second FPA comprising a second array of infrared sensors adapted to image the scene.
18. The system of claim 11, wherein the system is an infrared imaging module adapted to be inserted into a socket having a size less than approximately 8.5 mm by 8.5 mm.
19. The system of claim 11, further comprising a processor adapted to process an intentionally blurred image frame, wherein the blurred image frame comprises blurred thermal image data associated with the scene and noise introduced by the system, wherein the processor is adapted to:
use the blurred image frame to determine a plurality of non-uniformity correction (NUC) terms to reduce at least a portion of the noise; and
apply the NUC terms to the captured image frames.
20. A method comprising:
providing a bias voltage from a bias circuit of a focal plane array (FPA) to an array of infrared sensors of the FPA, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts;
imaging a scene using the infrared sensors; and
providing signals from the infrared sensors corresponding to captured image frames of the scene using a read out integrated circuit (ROIC) of the FPA.
21. The method of claim 20, wherein a size of the array of infrared sensors is less than approximately 160 by 120.
22. The method of claim 21, wherein the size is one of 80 by 64, 80 by 60, 64 by 64, 64 by 32, or 32 by 32.
23. The method of claim 20, wherein the captured image frames are provided at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
24. The method of claim 23, wherein:
the frame rate is a first frame rate; and
the method further comprises combining at least a subset of the captured image frames to provide processed image frames at a second frame rate selected from a range of approximately 30 Hz to approximately 60 Hz with reduced noise in comparison with the captured image frames.
25. The method of claim 20, further comprising receiving a regulated voltage at the bias circuit selected from a range of approximately 1.5 volts to approximately 2.8 volts, wherein the bias voltage is provided from the bias circuit to the infrared sensors in response to the regulated voltage.
26. The method of claim 25, further comprising providing the regulated voltage from a low-dropout regulator (LDO) integrated with the FPA to the bias circuit in response to an external supply voltage selected from a range of approximately 2.8 volts to approximately 11 volts.
27. The method of claim 20, wherein the FPA is a first FPA, wherein the array of infrared sensors is a first array, the method further comprising imaging the scene using a second FPA comprising a second array of infrared sensors.
28. The method of claim 20, wherein the FPA is part of an infrared imaging module adapted to be inserted into a socket having a size less than approximately 8.5 mm by 8.5 mm.
29. The method of claim 20, further comprising:
receiving an intentionally blurred one of the captured image frames, wherein the blurred image frame comprises blurred thermal image data associated with the scene and noise introduced by an infrared imaging system;
processing the blurred image frame to determine a plurality of non-uniformity correction (NUC) terms to reduce at least a portion of the noise; and
applying the NUC terms to the captured image frames.
30. A method comprising:
imaging a scene using an array of infrared sensors of a focal plane array (FPA), wherein a size of the array of infrared sensors is less than approximately 160 by 120; and
providing signals from the infrared sensors corresponding to captured image frames using a read out integrated circuit (ROIC) of the FPA, wherein the ROIC provides the captured image frames at a frame rate selected from a range of approximately 120 Hz to approximately 480 Hz.
31. The method of claim 30, further comprising providing a bias voltage from a bias circuit of the FPA to the infrared sensors, wherein the bias voltage is selected from a range of approximately 0.2 volts to approximately 0.7 volts.
32. The method of claim 31, further comprising receiving a regulated voltage at the bias circuit selected from a range of approximately 1.5 volts to approximately 2.8 volts, wherein the bias voltage is provided from the bias circuit to the infrared sensors in response to the regulated voltage.
33. The method of claim 32, further comprising providing the regulated voltage from a low-dropout regulator (LDO) integrated with the FPA to the bias circuit in response to an external supply voltage selected from a range of approximately 2.8 volts to approximately 11 volts.
34. The method of claim 30, wherein the size is one of 80 by 64, 80 by 60, 64 by 64, 64 by 32, or 32 by 32.
35. The method of claim 30, wherein:
the frame rate is a first frame rate; and
the method further comprises combining at least a subset of the captured image frames to provide processed image frames at a second frame rate selected from a range of approximately 30 Hz to approximately 60 Hz with reduced noise in comparison with the captured image frames.
36. The method of claim 30, wherein the FPA is a first FPA, wherein the array of infrared sensors is a first array, the method further comprising imaging the scene using a second FPA comprising a second array of infrared sensors.
37. The method of claim 30, wherein the FPA is part of an infrared imaging module adapted to be inserted into a socket having a size less than approximately 8.5 mm by 8.5 mm.
38. The method of claim 30, further comprising:
receiving an intentionally blurred one of the captured image frames, wherein the blurred image frame comprises blurred thermal image data associated with the scene and noise introduced by an infrared imaging system;
processing the blurred image frame to determine a plurality of non-uniformity correction (NUC) terms to reduce at least a portion of the noise; and
applying the NUC terms to the captured image frames.
39. A system comprising:
a focal plane array (FPA) comprising:
a low-dropout regulator (LDO) integrated with the FPA and adapted to provide a regulated voltage in response to an external supply voltage;
an array of infrared sensors adapted to image a scene;
a bias circuit adapted to provide a bias voltage to the infrared sensors in response to the regulated voltage; and
a read out integrated circuit (ROIC) adapted to provide signals from the infrared sensors corresponding to captured image frames.
40. The system of claim 39, wherein the FPA is powered by only the external supply voltage.
41. The system of claim 40, further comprising:
a multiplexer; and
an output amplifier;
wherein the LDO is adapted to provide the regulated voltage to the multiplexer and the output amplifier.
42. The system of claim 41, further comprising a battery external to the FPA and adapted to provide the external supply voltage.
43. The system of claim 39, wherein the bias circuit is part of the ROIC and is adapted to compensate for temperature-dependent changes in the bias voltage.
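Claim 43 recites compensating for temperature-dependent changes in the bias voltage. A first-order sketch is shown below, for illustration only; the linear temperature coefficient, reference temperature, and nominal bias are assumptions (the claim does not specify a compensation law), with the result clamped to the approximately 0.2 to 0.7 V bias range recited elsewhere in the claims:

```python
def compensated_bias(fpa_temp_c, nominal_v=0.45, tempco_v_per_c=-0.001,
                     ref_temp_c=25.0):
    """First-order temperature compensation of the sensor bias voltage.

    The linear coefficient and nominal values are assumptions for
    illustration; the result is clamped to the claimed 0.2-0.7 V range.
    """
    bias_v = nominal_v + tempco_v_per_c * (fpa_temp_c - ref_temp_c)
    return min(max(bias_v, 0.2), 0.7)

print(round(compensated_bias(40.0), 3))  # 0.435 (bias lowered as FPA warms)
```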
44. The system of claim 39, wherein the FPA is a wafer level package.
45. The system of claim 39, wherein the system is a mobile telephone.
46. The system of claim 39, wherein the system is an infrared camera, the system further comprising a housing adapted to substantially enclose the FPA.
47. The system of claim 46, further comprising a processor adapted to receive the captured image frames and provide result image frames with reduced noise in response thereto.
48. The system of claim 39, further comprising a processor adapted to process an intentionally blurred one of the captured image frames, wherein the blurred image frame comprises blurred thermal image data associated with the scene and noise introduced by the system, wherein the processor is adapted to:
use the blurred image frame to determine a plurality of non-uniformity correction (NUC) terms to reduce at least a portion of the noise; and
apply the NUC terms to the captured image frames.
49. A method comprising:
receiving an external supply voltage at a focal plane array (FPA);
providing a regulated voltage in response to the external supply voltage using a low-dropout regulator (LDO) integrated with the FPA;
providing a bias voltage to an array of infrared sensors of the FPA in response to the regulated voltage using a bias circuit of the FPA;
imaging a scene using the infrared sensors; and
providing signals from the infrared sensors corresponding to captured image frames using a read out integrated circuit (ROIC) of the FPA.
50. The method of claim 49, wherein the FPA is powered by only the external supply voltage.
51. The method of claim 50, further comprising providing the regulated voltage to a multiplexer and an output amplifier of the FPA.
52. The method of claim 50, further comprising providing the external supply voltage from a battery external to the FPA.
53. The method of claim 49, wherein the bias circuit is part of the ROIC, the method further comprising compensating for temperature-dependent changes in the bias voltage using the bias circuit.
54. The method of claim 49, wherein the FPA is a wafer level package.
55. The method of claim 49, wherein the method is performed by an infrared imaging module of a mobile telephone.
56. The method of claim 49, wherein the method is performed by an infrared camera comprising a housing adapted to substantially enclose the FPA.
57. The method of claim 56, further comprising processing the captured image frames to provide result image frames with reduced noise.
58. The method of claim 49, further comprising:
receiving an intentionally blurred one of the captured image frames, wherein the blurred image frame comprises blurred thermal image data associated with the scene and noise introduced by an infrared imaging system;
processing the blurred image frame to determine a plurality of non-uniformity correction (NUC) terms to reduce at least a portion of the noise; and
applying the NUC terms to the captured image frames.
Priority Applications (59)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280038760.0A CN103748867B (en) | 2011-06-10 | 2012-06-08 | Low-power consumption and small form factor infrared imaging |
KR1020147000703A KR101808375B1 (en) | 2011-06-10 | 2012-06-08 | Low power and small form factor infrared imaging |
EP12737632.5A EP2719167B1 (en) | 2011-06-10 | 2012-06-08 | Low power and small form factor infrared imaging |
US13/802,615 US9509924B2 (en) | 2011-06-10 | 2013-03-13 | Wearable apparatus with integrated infrared imaging module |
US13/802,495 US9235023B2 (en) | 2011-06-10 | 2013-03-13 | Variable lens sleeve spacer |
US13/802,578 US9058653B1 (en) | 2011-06-10 | 2013-03-13 | Alignment of visible light sources based on thermal images |
EP13770520.8A EP2829056B1 (en) | 2012-03-19 | 2013-03-14 | Wearable apparatus with integrated infrared imaging module |
CN201390000509.5U CN204615945U (en) | 2012-03-19 | 2013-03-14 | Wearable device |
KR1020147028165A KR101668544B1 (en) | 2012-03-19 | 2013-03-14 | Wearable apparatus with integrated infrared imaging module |
CA2867895A CA2867895C (en) | 2012-03-19 | 2013-03-14 | Wearable apparatus with integrated infrared imaging module |
PCT/US2013/031734 WO2013184220A2 (en) | 2012-03-19 | 2013-03-14 | Wearable apparatus with integrated infrared imaging module |
US13/839,118 US9706137B2 (en) | 2011-06-10 | 2013-03-15 | Electrical cabinet infrared monitor |
EP13717653.3A EP2834968B1 (en) | 2012-04-06 | 2013-04-03 | Electrical cabinet infrared monitor |
CA2869741A CA2869741C (en) | 2012-04-06 | 2013-04-03 | Electrical cabinet infrared monitor |
CN201390000539.6U CN204465713U (en) | 2012-04-06 | 2013-04-03 | For the surveillance of rack |
PCT/US2013/035149 WO2013152122A2 (en) | 2012-04-06 | 2013-04-03 | Electrical cabinet infrared monitor |
US13/892,202 US20130278771A1 (en) | 2011-06-10 | 2013-05-10 | Systems and methods for monitoring vehicle wheel assembly |
US13/893,809 US9843742B2 (en) | 2009-03-02 | 2013-05-14 | Thermal image frame capture using de-aligned sensor array |
US13/901,428 US20130258111A1 (en) | 2009-03-02 | 2013-05-23 | Device attachment with infrared imaging sensor |
US13/902,177 US20130314536A1 (en) | 2009-03-02 | 2013-05-24 | Systems and methods for monitoring vehicle occupants |
US13/902,115 US9948872B2 (en) | 2009-03-02 | 2013-05-24 | Monitor and control systems and methods for occupant safety and energy efficiency of structures |
US13/940,232 US9843743B2 (en) | 2009-06-03 | 2013-07-11 | Infant monitoring systems and methods using thermal imaging |
US13/966,052 US9473681B2 (en) | 2011-06-10 | 2013-08-13 | Infrared camera system housing with metalized surface |
US14/029,716 US9235876B2 (en) | 2009-03-02 | 2013-09-17 | Row and column noise reduction in thermal images |
US14/029,683 US9208542B2 (en) | 2009-03-02 | 2013-09-17 | Pixel-wise noise reduction in thermal images |
US14/034,493 US9716843B2 (en) | 2009-06-03 | 2013-09-23 | Measurement device for electrical installations and related methods |
US14/091,266 US9706138B2 (en) | 2010-04-23 | 2013-11-26 | Hybrid infrared sensor array having heterogeneous infrared sensors |
US14/092,794 US9848134B2 (en) | 2010-04-23 | 2013-11-27 | Infrared imager with integrated metal layers |
US14/097,197 US9517679B2 (en) | 2009-03-02 | 2013-12-04 | Systems and methods for monitoring vehicle occupants |
US14/101,245 US9706139B2 (en) | 2011-06-10 | 2013-12-09 | Low power and small form factor infrared imaging |
US14/106,696 US9918023B2 (en) | 2010-04-23 | 2013-12-13 | Segmented focal plane array architecture |
US14/106,666 US9207708B2 (en) | 2010-04-23 | 2013-12-13 | Abnormal clock rate detection in imaging sensor arrays |
US14/133,095 US9716844B2 (en) | 2011-06-10 | 2013-12-18 | Low power and small form factor infrared imaging |
US14/135,493 US9756262B2 (en) | 2009-06-03 | 2013-12-19 | Systems and methods for monitoring power systems |
US14/136,557 US9819880B2 (en) | 2009-06-03 | 2013-12-20 | Systems and methods of suppressing sky regions in images |
US14/137,573 US10091439B2 (en) | 2009-06-03 | 2013-12-20 | Imager with array of multiple infrared imaging modules |
US14/138,040 US9451183B2 (en) | 2009-03-02 | 2013-12-21 | Time spaced infrared image enhancement |
US14/138,052 US9635285B2 (en) | 2009-03-02 | 2013-12-21 | Infrared imaging enhancement with fusion |
US14/138,058 US10244190B2 (en) | 2009-03-02 | 2013-12-21 | Compact multi-spectrum imaging with fusion |
US14/138,055 US9292909B2 (en) | 2009-06-03 | 2013-12-21 | Selective image correction for infrared imaging devices |
US14/506,377 US20160156880A1 (en) | 2009-06-03 | 2014-10-03 | Durable compact multisensor observation devices |
US14/506,430 US9807319B2 (en) | 2009-06-03 | 2014-10-03 | Wearable imaging devices, systems, and methods |
US14/736,133 US9247131B2 (en) | 2011-06-10 | 2015-06-10 | Alignment of visible light sources based on thermal images |
US14/747,865 US10389953B2 (en) | 2011-06-10 | 2015-06-23 | Infrared imaging device having a shutter |
US14/747,202 US9986175B2 (en) | 2009-03-02 | 2015-06-23 | Device attachment with infrared imaging sensor |
US14/748,873 US10051210B2 (en) | 2011-06-10 | 2015-06-24 | Infrared detector array with selectable pixel binning systems and methods |
US14/748,542 US10841508B2 (en) | 2011-06-10 | 2015-06-24 | Electrical cabinet infrared monitor systems and methods |
US14/750,781 US9756264B2 (en) | 2009-03-02 | 2015-06-25 | Anomalous pixel detection |
US14/749,886 US9900526B2 (en) | 2011-06-10 | 2015-06-25 | Techniques to compensate for calibration drifts in infrared imaging devices |
US14/850,558 US10169666B2 (en) | 2011-06-10 | 2015-09-10 | Image-assisted remote control vehicle systems and methods |
US14/961,829 US9948878B2 (en) | 2010-04-23 | 2015-12-07 | Abnormal clock rate detection in imaging sensor arrays |
US15/269,905 US10033944B2 (en) | 2009-03-02 | 2016-09-19 | Time spaced infrared image enhancement |
US15/362,752 US10250822B2 (en) | 2011-06-10 | 2016-11-28 | Wearable apparatus with integrated infrared imaging module |
US15/375,069 US9998697B2 (en) | 2009-03-02 | 2016-12-09 | Systems and methods for monitoring vehicle occupants |
US15/645,988 US10110833B2 (en) | 2010-04-23 | 2017-07-10 | Hybrid infrared sensor array having heterogeneous infrared sensors |
US15/645,949 US10122944B2 (en) | 2011-06-10 | 2017-07-10 | Low power and small form factor infrared imaging |
US15/932,372 US10321031B2 (en) | 2011-06-10 | 2018-02-16 | Device attachment with infrared imaging sensor |
US15/919,116 US11070747B2 (en) | 2010-04-23 | 2018-03-12 | Segmented focal plane array architecture |
US16/147,381 US11445131B2 (en) | 2009-06-03 | 2018-09-28 | Imager with array of multiple infrared imaging modules |
Applications Claiming Priority (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161495873P | 2011-06-10 | 2011-06-10 | |
US201161495888P | 2011-06-10 | 2011-06-10 | |
US201161495879P | 2011-06-10 | 2011-06-10 | |
US61/495,873 | 2011-06-10 | ||
US61/495,888 | 2011-06-10 | ||
US61/495,879 | 2011-06-10 | ||
US201161545056P | 2011-10-07 | 2011-10-07 | |
US61/545,056 | 2011-10-07 | ||
US201261656889P | 2012-06-07 | 2012-06-07 | |
US61/656,889 | 2012-06-07 |
Related Parent Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/041749 Continuation-In-Part WO2012170949A2 (en) | 2009-03-02 | 2012-06-08 | Non-uniformity correction techniques for infrared imaging devices |
US13/622,178 Continuation-In-Part US9237284B2 (en) | 2009-03-02 | 2012-09-18 | Systems and methods for processing infrared images |
US13/802,615 Continuation-In-Part US9509924B2 (en) | 2011-06-10 | 2013-03-13 | Wearable apparatus with integrated infrared imaging module |
US13/966,052 Continuation-In-Part US9473681B2 (en) | 2009-03-02 | 2013-08-13 | Infrared camera system housing with metalized surface |
US14/099,818 Continuation-In-Part US9723227B2 (en) | 2009-03-02 | 2013-12-06 | Non-uniformity correction techniques for infrared imaging devices |
US14/101,258 Continuation-In-Part US9723228B2 (en) | 2009-03-02 | 2013-12-09 | Infrared camera system architectures |
Related Child Applications (26)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/396,340 Continuation-In-Part US8208026B2 (en) | 2009-03-02 | 2009-03-02 | Systems and methods for processing infrared images |
US12/766,739 Continuation-In-Part US8520970B2 (en) | 2009-03-02 | 2010-04-23 | Infrared resolution and contrast enhancement with fusion |
US201113902177A Continuation-In-Part | 2009-03-02 | 2011-05-24 | |
US29/423,027 Continuation-In-Part USD765081S1 (en) | 2009-03-02 | 2012-05-25 | Mobile communications device attachment with camera |
PCT/US2012/041739 Continuation-In-Part WO2012170941A1 (en) | 2009-03-02 | 2012-06-08 | Infrared camera system architectures |
US13/802,615 Continuation-In-Part US9509924B2 (en) | 2011-06-10 | 2013-03-13 | Wearable apparatus with integrated infrared imaging module |
US13/802,495 Continuation-In-Part US9235023B2 (en) | 2011-06-10 | 2013-03-13 | Variable lens sleeve spacer |
US13/802,578 Continuation-In-Part US9058653B1 (en) | 2011-06-10 | 2013-03-13 | Alignment of visible light sources based on thermal images |
US13/839,118 Continuation-In-Part US9706137B2 (en) | 2011-06-10 | 2013-03-15 | Electrical cabinet infrared monitor |
PCT/US2013/035149 Continuation-In-Part WO2013152122A2 (en) | 2011-06-10 | 2013-04-03 | Electrical cabinet infrared monitor |
US13/892,202 Continuation-In-Part US20130278771A1 (en) | 2011-06-10 | 2013-05-10 | Systems and methods for monitoring vehicle wheel assembly |
US13/893,809 Continuation-In-Part US9843742B2 (en) | 2009-03-02 | 2013-05-14 | Thermal image frame capture using de-aligned sensor array |
US13/901,428 Continuation-In-Part US20130258111A1 (en) | 2003-09-04 | 2013-05-23 | Device attachment with infrared imaging sensor |
US13/902,177 Continuation-In-Part US20130314536A1 (en) | 2009-03-02 | 2013-05-24 | Systems and methods for monitoring vehicle occupants |
US13/902,115 Continuation-In-Part US9948872B2 (en) | 2009-03-02 | 2013-05-24 | Monitor and control systems and methods for occupant safety and energy efficiency of structures |
US13/940,232 Continuation-In-Part US9843743B2 (en) | 2009-06-03 | 2013-07-11 | Infant monitoring systems and methods using thermal imaging |
US13/966,052 Continuation-In-Part US9473681B2 (en) | 2009-03-02 | 2013-08-13 | Infrared camera system housing with metalized surface |
US14/029,683 Continuation-In-Part US9208542B2 (en) | 2009-03-02 | 2013-09-17 | Pixel-wise noise reduction in thermal images |
US14/029,716 Continuation-In-Part US9235876B2 (en) | 2009-03-02 | 2013-09-17 | Row and column noise reduction in thermal images |
US14/034,493 Continuation-In-Part US9716843B2 (en) | 2009-06-03 | 2013-09-23 | Measurement device for electrical installations and related methods |
US14/091,266 Continuation US9706138B2 (en) | 2010-04-23 | 2013-11-26 | Hybrid infrared sensor array having heterogeneous infrared sensors |
US14/091,266 Continuation-In-Part US9706138B2 (en) | 2010-04-23 | 2013-11-26 | Hybrid infrared sensor array having heterogeneous infrared sensors |
US14/092,794 Continuation-In-Part US9848134B2 (en) | 2010-04-23 | 2013-11-27 | Infrared imager with integrated metal layers |
US14/097,197 Continuation-In-Part US9517679B2 (en) | 2009-03-02 | 2013-12-04 | Systems and methods for monitoring vehicle occupants |
US14/101,245 Continuation-In-Part US9706139B2 (en) | 2009-03-02 | 2013-12-09 | Low power and small form factor infrared imaging |
US14/101,245 Continuation US9706139B2 (en) | 2009-03-02 | 2013-12-09 | Low power and small form factor infrared imaging |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2012170946A2 true WO2012170946A2 (en) | 2012-12-13 |
WO2012170946A9 WO2012170946A9 (en) | 2013-04-04 |
WO2012170946A3 WO2012170946A3 (en) | 2013-05-23 |
Family
ID=46545878
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/041744 WO2012170946A2 (en) | 2009-03-02 | 2012-06-08 | Low power and small form factor infrared imaging |
Country Status (5)
Country | Link |
---|---|
US (3) | US9706139B2 (en) |
EP (1) | EP2719167B1 (en) |
KR (1) | KR101808375B1 (en) |
CN (2) | CN109618084B (en) |
WO (1) | WO2012170946A2 (en) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014100787A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Compact multi-spectrum imaging with fusion |
WO2014100786A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Selective image correction for infrared imaging devices |
US20140196131A1 (en) * | 2013-01-07 | 2014-07-10 | Salutron, Inc. | User authentication based on a wrist vein pattern |
US9208542B2 (en) | 2009-03-02 | 2015-12-08 | Flir Systems, Inc. | Pixel-wise noise reduction in thermal images |
EP2962451A1 (en) * | 2013-02-28 | 2016-01-06 | Raytheon Company | Method and apparatus for gain and level correction of multi-tap ccd cameras |
US9235876B2 (en) | 2009-03-02 | 2016-01-12 | Flir Systems, Inc. | Row and column noise reduction in thermal images |
US9451183B2 (en) | 2009-03-02 | 2016-09-20 | Flir Systems, Inc. | Time spaced infrared image enhancement |
US9706139B2 (en) | 2011-06-10 | 2017-07-11 | Flir Systems, Inc. | Low power and small form factor infrared imaging |
US9723227B2 (en) | 2011-06-10 | 2017-08-01 | Flir Systems, Inc. | Non-uniformity correction techniques for infrared imaging devices |
US9723228B2 (en) | 2011-06-10 | 2017-08-01 | Flir Systems, Inc. | Infrared camera system architectures |
US9736399B2 (en) | 2013-03-14 | 2017-08-15 | Drs Network & Imaging Systems, Llc | System architecture for thermal imaging and thermography cameras |
US9843742B2 (en) | 2009-03-02 | 2017-12-12 | Flir Systems, Inc. | Thermal image frame capture using de-aligned sensor array |
US10083501B2 (en) | 2015-10-23 | 2018-09-25 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
US10271020B2 (en) | 2014-10-24 | 2019-04-23 | Fluke Corporation | Imaging system employing fixed, modular mobile, and portable infrared cameras with ability to receive, communicate, and display data and images with proximity detection |
US10530977B2 (en) | 2015-09-16 | 2020-01-07 | Fluke Corporation | Systems and methods for placing an imaging tool in a test and measurement tool |
US10602082B2 (en) | 2014-09-17 | 2020-03-24 | Fluke Corporation | Triggered operation and/or recording of test and measurement or imaging tools |
US11113791B2 (en) | 2017-01-03 | 2021-09-07 | Flir Systems, Inc. | Image noise reduction using spectral transforms |
US11445131B2 (en) | 2009-06-03 | 2022-09-13 | Teledyne Flir, Llc | Imager with array of multiple infrared imaging modules |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10757308B2 (en) | 2009-03-02 | 2020-08-25 | Flir Systems, Inc. | Techniques for device attachment with dual band imaging sensor |
US11297264B2 (en) | 2014-01-05 | 2022-04-05 | Teledyne Flir, Llc | Device attachment with dual band imaging sensor |
JP6268304B2 (en) * | 2014-09-29 | 2018-01-24 | 富士フイルム株式会社 | Infrared imaging device, fixed pattern noise calculation method, and fixed pattern noise calculation program |
JP2016151523A (en) * | 2015-02-18 | 2016-08-22 | 浜松ホトニクス株式会社 | Infrared detection device |
WO2016176370A1 (en) | 2015-04-27 | 2016-11-03 | Flir Systems, Inc. | Moisture measurement device with thermal imaging capabilities and related methods |
WO2016197010A1 (en) | 2015-06-05 | 2016-12-08 | Flir Systems, Inc. | Systems and methods for enhanced dynamic range infrared imaging |
EP4187218A1 (en) | 2016-01-11 | 2023-05-31 | Carrier Corporation | Infrared presence detector system |
US10518900B2 (en) | 2016-06-14 | 2019-12-31 | Micasense, Inc. | Thermal calibration of an infrared image sensor |
CN110476416B (en) | 2017-01-26 | 2021-08-17 | 菲力尔系统公司 | System and method for infrared imaging in multiple imaging modes |
JP6866655B2 (en) * | 2017-01-31 | 2021-04-28 | 株式会社Jvcケンウッド | Thermal image processing device, infrared imaging device, thermal image processing method, and thermal image processing program |
WO2018176493A1 (en) * | 2017-04-01 | 2018-10-04 | SZ DJI Technology Co., Ltd. | Low-profile multi-band hyperspectral imaging for machine vision |
US10504221B2 (en) | 2017-09-28 | 2019-12-10 | Intel Corporation | Methods, apparatus and systems for monitoring devices |
WO2019087522A1 (en) | 2017-10-31 | 2019-05-09 | ソニーセミコンダクタソリューションズ株式会社 | Image pickup device |
CN107920142B (en) * | 2017-11-22 | 2019-10-25 | Oppo广东移动通信有限公司 | Electronic device |
US10896492B2 (en) | 2018-11-09 | 2021-01-19 | Qwake Technologies, Llc | Cognitive load reducing platform having image edge enhancement |
US10417497B1 (en) | 2018-11-09 | 2019-09-17 | Qwake Technologies | Cognitive load reducing platform for first responders |
US11890494B2 (en) | 2018-11-09 | 2024-02-06 | Qwake Technologies, Inc. | Retrofittable mask mount system for cognitive load reducing platform |
US11301996B2 (en) | 2018-12-11 | 2022-04-12 | Eko.Ai Pte. Ltd. | Training neural networks of an automatic clinical workflow that recognizes and analyzes 2D and doppler modality echocardiogram images |
US11446009B2 (en) | 2018-12-11 | 2022-09-20 | Eko.Ai Pte. Ltd. | Clinical workflow to diagnose heart disease based on cardiac biomarker measurements and AI recognition of 2D and doppler modality echocardiogram images |
US11931207B2 (en) | 2018-12-11 | 2024-03-19 | Eko.Ai Pte. Ltd. | Artificial intelligence (AI) recognition of echocardiogram images to enhance a mobile ultrasound device |
US12001939B2 (en) | 2018-12-11 | 2024-06-04 | Eko.Ai Pte. Ltd. | Artificial intelligence (AI)-based guidance for an ultrasound device to improve capture of echo image views |
CN109979909A (en) * | 2019-04-30 | 2019-07-05 | 烟台艾睿光电科技有限公司 | A kind of WLP device |
US11915376B2 (en) | 2019-08-28 | 2024-02-27 | Qwake Technologies, Inc. | Wearable assisted perception module for navigation and communication in hazardous environments |
CN111447382B (en) * | 2020-04-08 | 2021-09-21 | 电子科技大学 | Focal plane array non-uniformity correction method and correction circuit |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6028309A (en) | 1997-02-11 | 2000-02-22 | Indigo Systems Corporation | Methods and circuitry for correcting temperature-induced errors in microbolometer focal plane array |
US6812465B2 (en) | 2002-02-27 | 2004-11-02 | Indigo Systems Corporation | Microbolometer focal plane array methods and circuitry |
US7034301B2 (en) | 2002-02-27 | 2006-04-25 | Indigo Systems Corporation | Microbolometer focal plane array systems and methods |
US7470904B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera packaging |
US7470902B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera electronic architectures |
US7679048B1 (en) | 2008-04-18 | 2010-03-16 | Flir Systems, Inc. | Systems and methods for selecting microbolometers within microbolometer focal plane arrays |
Family Cites Families (446)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2764055A (en) | 1953-03-03 | 1956-09-25 | John E Clemens | Interferometer light system |
US5021663B1 (en) | 1988-08-12 | 1997-07-01 | Texas Instruments Inc | Infrared detector |
US5128796A (en) | 1989-03-31 | 1992-07-07 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Cryogenic shutter |
EP0398725B1 (en) | 1989-05-18 | 1994-08-03 | Murata Manufacturing Co., Ltd. | Pyroelectric IR - sensor |
JP3204982B2 (en) | 1995-05-26 | 2001-09-04 | 日立化成工業株式会社 | Environmental purification material |
US5763885A (en) | 1995-12-19 | 1998-06-09 | Loral Infrared & Imaging Systems, Inc. | Method and apparatus for thermal gradient stabilization of microbolometer focal plane arrays |
US7495220B2 (en) | 1995-10-24 | 2009-02-24 | Bae Systems Information And Electronics Systems Integration Inc. | Uncooled infrared sensor |
JPH09275518A (en) | 1996-04-05 | 1997-10-21 | Nippon Avionics Co Ltd | Visible image camera attachment angle control mechanism for infrared camera |
US6507018B2 (en) | 1996-08-30 | 2003-01-14 | Raytheon Company | Ditherless non-uniformity compensation for infrared detector arrays with recursive spatial low pass filtering |
US5994701A (en) | 1996-10-15 | 1999-11-30 | Nippon Avionics Co., Ltd. | Infrared sensor device with temperature correction function |
US5721427A (en) | 1996-12-19 | 1998-02-24 | Hughes Electronics | Scene-based nonuniformity correction processor incorporating motion triggering |
US6681120B1 (en) | 1997-03-26 | 2004-01-20 | Minerva Industries, Inc., | Mobile entertainment and communication device |
FI111892B (en) | 1997-04-22 | 2003-09-30 | Nokia Oy Ab | Multifunction messaging device |
US20040157612A1 (en) | 1997-04-25 | 2004-08-12 | Minerva Industries, Inc. | Mobile communication and stethoscope system |
US7321783B2 (en) | 1997-04-25 | 2008-01-22 | Minerva Industries, Inc. | Mobile entertainment and communication device |
KR100272582B1 (en) | 1997-10-10 | 2000-11-15 | 구자홍 | Scan converter |
KR100285817B1 (en) | 1997-12-26 | 2001-04-16 | 이중구 | Digital still camera capable of photographing objects with infrared rays and ultraviolet ray |
US6330371B1 (en) * | 1998-10-19 | 2001-12-11 | Raytheon Company | Adaptive non-uniformity compensation using feedforward shunting and min-mean filter |
KR20000026757A (en) | 1998-10-22 | 2000-05-15 | 국필종 | Motiom detection system and method |
KR100282397B1 (en) | 1998-12-31 | 2001-02-15 | 구자홍 | Deinterlacing device of digital image data |
KR100539520B1 (en) | 1999-03-02 | 2005-12-29 | 엘지전자 주식회사 | apparatus for displaying caption in digital TV |
KR20000073381A (en) | 1999-05-10 | 2000-12-05 | 정선종 | Management method for shared virtual reality space |
AUPQ056099A0 (en) | 1999-05-25 | 1999-06-17 | Silverbrook Research Pty Ltd | A method and apparatus (pprint01) |
KR100294925B1 (en) | 1999-06-03 | 2001-07-12 | 윤종용 | 3-D graphic image manufacturing method and binocular visual disparity adjustment method therefor |
JP2000347836A (en) | 1999-06-04 | 2000-12-15 | Sony Corp | High-order radix divider and method therefor |
US6633231B1 (en) | 1999-06-07 | 2003-10-14 | Horiba, Ltd. | Communication device and auxiliary device for communication |
KR20010002462A (en) | 1999-06-15 | 2001-01-15 | 전철현 | Video tape case |
US7035475B1 (en) | 1999-06-17 | 2006-04-25 | Raytheon Company | Non-traditional adaptive non-uniformity compensation (ADNUC) system employing adaptive feedforward shunting and operating methods therefor |
KR20010010010A (en) | 1999-07-15 | 2001-02-05 | 윤종용 | Vacuum chamber for metal deposition |
JP2001065629A (en) | 1999-08-31 | 2001-03-16 | Toyoda Gosei Co Ltd | Liquid sealed vibration control device |
US6444983B1 (en) * | 1999-10-07 | 2002-09-03 | Infrared Solutions, Inc. | Microbolometer focal plane array with controlled bias |
KR100423439B1 (en) | 1999-12-28 | 2004-03-19 | 주식회사 포스코 | Anti finger resin solution and method for manufacturing anti finger steel sheet using the solution |
JP2001251542A (en) | 1999-12-28 | 2001-09-14 | Casio Comput Co Ltd | Portable image pickup device |
JP4622028B2 (en) | 2000-03-17 | 2011-02-02 | ソニー株式会社 | Content processing apparatus and content processing system |
US6911652B2 (en) | 2000-03-22 | 2005-06-28 | Jonathan A. Walkenstein | Low light imaging device |
US7215809B2 (en) | 2000-04-04 | 2007-05-08 | Sony Corporation | Three-dimensional image producing method and apparatus therefor |
KR100865598B1 (en) | 2000-05-29 | 2008-10-27 | 브이케이비 인코포레이티드 | Virtual data entry device and method for input of alphanumeric and other data |
KR100454599B1 (en) | 2000-06-29 | 2004-11-03 | 신종식 | Method for distance lecturing using cyber-character |
JP4081965B2 (en) | 2000-07-07 | 2008-04-30 | 株式会社豊田自動織機 | Capacity control mechanism of variable capacity compressor |
KR100412781B1 (en) | 2000-07-14 | 2003-12-31 | 주식회사 케이티프리텔 | Wireless Communication Terminal Lending Method and System Using Internet |
AU2002213108A1 (en) * | 2000-10-13 | 2002-04-22 | Litton Systems Inc. | Monolithic lead-salt infrared radiation detectors |
US6567099B1 (en) | 2000-11-15 | 2003-05-20 | Sony Corporation | Method and system for dynamically allocating a frame buffer for efficient anti-aliasing |
KR20020044339A (en) | 2000-12-05 | 2002-06-15 | 구자홍 | Multimedia Synthesis System And Method For Image |
KR100387445B1 (en) | 2000-12-19 | 2003-06-18 | 기아자동차주식회사 | Fuse with relay box for automobile |
US6833822B2 (en) | 2000-12-21 | 2004-12-21 | Raytheon Company | Method and apparatus for generating a visible image with an infrared transmissive window |
KR100747497B1 (en) | 2001-01-16 | 2007-08-08 | 삼성전자주식회사 | Digital camera and image composing method thereof |
KR20020061920A (en) | 2001-01-18 | 2002-07-25 | 김영희 | wire and wireless type camera multi release |
JP3635051B2 (en) | 2001-02-01 | 2005-03-30 | 株式会社ソニー・コンピュータエンタテインメント | Image generation method and apparatus, recording medium storing image processing program, and image processing program |
KR20020069690A (en) | 2001-02-27 | 2002-09-05 | 주식회사 컴텍멀티미디어 | Internet photo system using digital photo kiosk and management system thereof |
KR20010044756A (en) | 2001-03-22 | 2001-06-05 | 김종명 | Method for processing digital photo image by one-touch |
US6850147B2 (en) | 2001-04-02 | 2005-02-01 | Mikos, Ltd. | Personal biometric key |
KR20020078469A (en) | 2001-04-03 | 2002-10-19 | 김봉환 | Apparatus for Removing Slag from Cutting Die |
EP1380012A1 (en) | 2001-04-09 | 2004-01-14 | Koninklijke Philips Electronics N.V. | Method of blending digital pictures |
US6901173B2 (en) | 2001-04-25 | 2005-05-31 | Lockheed Martin Corporation | Scene-based non-uniformity correction for detector arrays |
US7016550B2 (en) | 2002-04-19 | 2006-03-21 | Lockheed Martin Corporation | Scene-based non-uniformity offset correction for staring arrays |
KR100382008B1 (en) | 2001-04-27 | 2003-04-26 | 삼성테크윈 주식회사 | A digital camera which can improve a picture quality and the picture quality improving method using the digital camera |
KR100600479B1 (en) | 2001-05-04 | 2006-07-13 | 고등기술연구원연구조합 | a car-used apparatus for detecting and alarming existence and movement of an object |
EP1386481A1 (en) * | 2001-05-07 | 2004-02-04 | Flir Systems AB | Infrared camera sensitive for infrared radiation |
US6707044B2 (en) | 2001-05-07 | 2004-03-16 | Flir Systems Ab | Infrared camera system |
KR100437873B1 (en) | 2001-05-08 | 2004-06-26 | 연성정밀화학(주) | Process for preparing prostaglandin derivatives and stereospecific starting material thereof |
KR20010074565A (en) | 2001-05-08 | 2001-08-04 | 오성택 | Virtual Reality System for Screen/Vibration/Sound |
KR100411875B1 (en) | 2001-06-15 | 2003-12-24 | 한국전자통신연구원 | Method for Stereo Image Disparity Map Fusion And Method for Display 3-Dimension Image By Using it |
KR100402780B1 (en) | 2001-06-23 | 2003-10-22 | 기아자동차주식회사 | Rail-roof side part structure of vehicle |
JP4177568B2 (en) | 2001-07-10 | 2008-11-05 | 株式会社東芝 | Semiconductor device |
US6816552B2 (en) | 2001-07-11 | 2004-11-09 | Dolby Laboratories Licensing Corporation | Interpolation of video compression frames |
US7050107B1 (en) | 2001-07-13 | 2006-05-23 | Indigo Systems Corporation | Interface device for extending camcorder use over the electromagnetic spectrum |
KR100719130B1 (en) | 2001-08-01 | 2007-05-17 | 인벤텍 어플라이언시스 코퍼레이션 | Dialing method for effecting international call in intelligent cellular phone |
KR100400167B1 (en) | 2001-08-21 | 2003-10-01 | 삼성전자주식회사 | Portable terminal equipment having image capture function and implementation method thereof |
JP3741014B2 (en) | 2001-09-18 | 2006-02-01 | 株式会社日立製作所 | Control method and compressor system for a plurality of compressors |
US6821625B2 (en) | 2001-09-27 | 2004-11-23 | International Business Machines Corporation | Thermal spreader using thermal conduits |
KR100459124B1 (en) | 2001-11-02 | 2004-12-03 | 엘지전자 주식회사 | Apparatus for Displaying Twin Picture of Display and Method of The Same |
US20030093805A1 (en) | 2001-11-15 | 2003-05-15 | Gin J.M. Jack | Dual camera surveillance and control system |
KR20020023719A (en) | 2001-12-12 | 2002-03-29 | 남현정 | Portable apparatus for correcting golf swing posture |
KR100415582B1 (en) | 2001-12-27 | 2004-01-24 | 한국전자통신연구원 | The connection and release method for reduction of network load of RSVP for the internet QoS |
KR20030056667A (en) | 2001-12-28 | 2003-07-04 | 동부전자 주식회사 | Method for manufacturing a buried junction of the flat cell memory device |
US20030122957A1 (en) | 2001-12-31 | 2003-07-03 | Emme Niels Peter | Mobile terminal with digital camera and method of capturing images |
GB0202506D0 (en) | 2002-02-02 | 2002-03-20 | Qinetiq Ltd | Reconfigurable detector array |
KR100407158B1 (en) | 2002-02-07 | 2003-11-28 | 삼성탈레스 주식회사 | Method for correcting time variant defect in thermal image system |
KR100454447B1 (en) | 2002-02-07 | 2004-10-28 | 주식회사 네오콤 | All in one ccd camera |
KR20030085742A (en) | 2002-05-01 | 2003-11-07 | 엘지전자 주식회사 | Camera focusing method for image communication terminal |
WO2003093963A2 (en) | 2002-05-03 | 2003-11-13 | Koninklijke Philips Electronics N.V. | Mobile hand-held device |
KR100463988B1 (en) | 2002-05-15 | 2004-12-30 | 주식회사 스타트이십일 | Transact a civil appeal system and method for permission of put up a placard |
US6759949B2 (en) | 2002-05-23 | 2004-07-06 | Visteon Global Technologies, Inc. | Image enhancement in far infrared camera |
US20030223623A1 (en) | 2002-06-03 | 2003-12-04 | Srinivas Gutta | Face-recognition using half-face images |
KR100478251B1 (en) | 2002-06-28 | 2005-03-22 | 한국기계연구원 | Apparatus for measuring corrosion thickness of insulated pipeline |
KR20040001684A (en) | 2002-06-28 | 2004-01-07 | 강경식 | A couple table for events |
AU2003253912A1 (en) | 2002-07-10 | 2004-01-23 | Lockheed Martin Corporation | Infrared camera system and method |
JP2004048571A (en) | 2002-07-15 | 2004-02-12 | Ik System:Kk | Cellular phone capable of transmitting/receiving video, and remote monitoring system using portable communication terminal such as phs |
KR20020083961A (en) | 2002-08-19 | 2002-11-04 | 박민수 | A contact approach acting generation cellular phone |
US6898331B2 (en) | 2002-08-28 | 2005-05-24 | Bae Systems Aircraft Controls, Inc. | Image fusion system and method |
KR100491708B1 (en) | 2002-09-12 | 2005-05-27 | 주식회사 캐쉬빌 | The remote security system and program thereof |
AU2003262960A1 (en) | 2002-09-20 | 2004-04-08 | Bae Systems Information And Electronic Systems Integration Inc | Front lens shutter mount for uniformity correction |
CN102930311B (en) | 2002-09-26 | 2016-04-27 | 吉田健治 | Signal conditioning package medium being formed the method for dot pattern, uses the data inputting method of dot pattern, use the information I/O method of dot pattern, use the message input device of dot pattern, use dot pattern |
KR20040033223A (en) | 2002-10-11 | 2004-04-21 | 엘지전자 주식회사 | Portable computer system |
KR100899120B1 (en) | 2002-10-15 | 2009-05-27 | 한국항공우주산업 주식회사 | Unmanned Picture Administration System |
KR20040033993A (en) | 2002-10-16 | 2004-04-28 | (주)씨앤에스 테크놀로지 | Alpha blending apparatus and thereof method |
KR100492148B1 (en) | 2002-10-16 | 2005-06-02 | 박동윤 | The Artificial Intelligence Image Security System using the distance and direction of Moving Object |
US20040101298A1 (en) | 2002-10-18 | 2004-05-27 | Sarnoff Corporation | Method for arranging cameras and mirrors to allow panoramic visualization |
JP3993813B2 (en) | 2002-10-31 | 2007-10-17 | 有限会社リムテック | Molten metal material injection equipment |
KR20040039868A (en) | 2002-11-05 | 2004-05-12 | 주식회사 만도 | Flange bolt |
KR20040042475A (en) | 2002-11-14 | 2004-05-20 | 엘지전자 주식회사 | Image calling mobile phone having detachable folder |
KR100490702B1 (en) | 2002-11-21 | 2005-05-19 | 주성엔지니어링(주) | Multi cluster module |
KR100517979B1 (en) | 2002-12-10 | 2005-10-04 | 엘지전자 주식회사 | Video overlay apparatus for mobile communication device |
KR100480076B1 (en) | 2002-12-18 | 2005-04-07 | 엘지전자 주식회사 | Method for processing still video image |
US7263379B1 (en) | 2002-12-23 | 2007-08-28 | Sti Licensing Corp. | Communications network for emergency services personnel |
KR100502709B1 (en) | 2002-12-27 | 2005-07-22 | 주식회사 아이큐브 | Method and apparatus for synchronizing caption text with image screen |
US6885939B2 (en) | 2002-12-31 | 2005-04-26 | Robert Bosch Gmbh | System and method for advanced 3D visualization for mobile navigation units |
KR20040062802A (en) | 2003-01-03 | 2004-07-09 | 엘지전자 주식회사 | apparatus and method for instituting mode of screen change function |
KR100466368B1 (en) | 2003-01-10 | 2005-01-13 | 김영순 | Analyzing apparatus of Golf player's playing information and Method using thereof |
US7230741B2 (en) | 2003-01-13 | 2007-06-12 | Bae Systems Information And Electronic Systems Integration, Inc., Reconnaissance And Surveillance Systems | Optimum non-uniformity correction for imaging sensors |
US20040140758A1 (en) | 2003-01-17 | 2004-07-22 | Eastman Kodak Company | Organic light emitting device (OLED) display with improved light emission using a metallic anode |
US7148484B2 (en) | 2003-01-24 | 2006-12-12 | The Regents Of The University Of California | Cellular telephone-based radiation sensor and wide-area detection network |
JP2004226873A (en) | 2003-01-27 | 2004-08-12 | Sanyo Electric Co Ltd | Camera module and its manufacturing method |
JP2004241491A (en) * | 2003-02-04 | 2004-08-26 | Seiko Epson Corp | Solid-state imaging device |
KR20040070840A (en) | 2003-02-04 | 2004-08-11 | 주식회사 비젼하이텍 | Infrared camera having auto-focusing in day and night, and method for auto-focusing thereof |
DE10304703B4 (en) | 2003-02-06 | 2023-03-16 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for visualizing the environment of a vehicle with environment-dependent fusion of an infrared and a visual image |
US6856705B2 (en) | 2003-02-25 | 2005-02-15 | Microsoft Corporation | Image blending by guided interpolation |
KR20040076308A (en) | 2003-02-25 | 2004-09-01 | 전환표 | A Multi-screen video clip replaying device using A beam Projector |
KR100477803B1 (en) | 2003-02-27 | 2005-03-22 | 한국전자통신연구원 | Optical alignment apparatus and method by using visual optical source and image |
JP2004304545A (en) * | 2003-03-31 | 2004-10-28 | Mitsubishi Electric Corp | Infrared camera |
KR20040086994A (en) | 2003-04-04 | 2004-10-13 | (주)코모콤 | A blog system and a method for uploading a document on a blog page by using e-mail |
JP4490074B2 (en) | 2003-04-17 | 2010-06-23 | ソニー株式会社 | Stereoscopic image processing apparatus, stereoscopic image display apparatus, stereoscopic image providing method, and stereoscopic image processing system |
KR100437890B1 (en) | 2003-04-18 | 2004-06-30 | (주)충청측량설계공사 | Three dimensional map producing device and method |
US7002154B2 (en) | 2003-04-25 | 2006-02-21 | Raytheon Company | Optical system for a wide field of view staring infrared sensor having improved optical symmetry |
KR20040102386A (en) | 2003-05-27 | 2004-12-08 | 에스케이 텔레콤주식회사 | Club management system using mobile communication system and method thereof |
JP4554280B2 (en) | 2003-06-12 | 2010-09-29 | マイクロソフト コーポレーション | System and method for displaying images using multiple blending |
US20040256561A1 (en) | 2003-06-17 | 2004-12-23 | Allyson Beuhler | Wide band light sensing pixel array |
KR100511227B1 (en) | 2003-06-27 | 2005-08-31 | 박상래 | Portable surveillance camera and personal surveillance system using the same |
KR20050008245A (en) | 2003-07-14 | 2005-01-21 | (주)워치비젼 | An apparatus and method for inserting 3D graphic images in video |
KR100497399B1 (en) | 2003-07-22 | 2005-06-23 | 삼성전자주식회사 | Method and apparatus for quantizing a image by parallel handling |
KR101046820B1 (en) | 2003-07-25 | 2011-07-06 | 일동제약주식회사 | Identification method of Bifidobacterium spp. |
KR20050014448A (en) | 2003-07-31 | 2005-02-07 | 김홍원 | a shower head with multi function |
KR20050015293A (en) | 2003-08-05 | 2005-02-21 | 삼성전자주식회사 | Memory cell access circuit of semiconductor memory device |
KR20050015526A (en) | 2003-08-06 | 2005-02-21 | 엘지전자 주식회사 | Temperature measurement apparatus using mobile terminal |
KR100540834B1 (en) | 2003-08-07 | 2006-01-10 | 주식회사 팬택 | Method for global positioning of mobile terminal |
IL157344A0 (en) | 2003-08-11 | 2004-06-20 | Opgal Ltd | Internal temperature reference source and mtf inverse filter for radiometry |
KR100548402B1 (en) | 2003-08-12 | 2006-02-02 | 엘지전자 주식회사 | System for making sticker photograph using mobile communication terminal |
KR100547739B1 (en) | 2003-08-16 | 2006-01-31 | 삼성전자주식회사 | How to edit an image on your mobile device |
KR100641603B1 (en) | 2003-09-04 | 2006-11-02 | 주식회사 소디프신소재 | Preparation of high purity fluorine gas |
JP3824237B2 (en) | 2003-09-05 | 2006-09-20 | ソニー株式会社 | Image processing apparatus and method, recording medium, and program |
KR100981803B1 (en) | 2003-09-18 | 2010-09-13 | 엘지전자 주식회사 | Mobile communication device having security function of camera |
JP2005100177A (en) | 2003-09-25 | 2005-04-14 | Sony Corp | Image processor and its method |
JP2005107780A (en) | 2003-09-30 | 2005-04-21 | Sony Corp | Image blending method and blended image data generation device |
KR100520774B1 (en) | 2003-09-30 | 2005-10-12 | 기아자동차주식회사 | Detachable seat for vehicles |
KR20050033308A (en) | 2003-10-06 | 2005-04-12 | 삼성전기주식회사 | Zoom camera using the liquid lens for mobile phone, control system and method thereof |
JP4556107B2 (en) | 2003-10-30 | 2010-10-06 | ソニー株式会社 | Imaging apparatus and method, and communication terminal apparatus |
KR20060111472A (en) | 2003-10-31 | 2006-10-27 | 브이케이비 인코포레이티드 | Optical apparatus for virtual interface projection and sensing |
CN1259796C (en) * | 2003-12-12 | 2006-06-14 | 北京大学 | Focal plane array read out circuit and read out method |
US7506267B2 (en) | 2003-12-23 | 2009-03-17 | Intel Corporation | Compose rate reduction for displays |
US20050219249A1 (en) | 2003-12-31 | 2005-10-06 | Feng Xie | Integrating particle rendering and three-dimensional geometry rendering |
JP4386262B2 (en) | 2004-02-04 | 2009-12-16 | キヤノン株式会社 | Image forming apparatus |
DE602004010777T2 (en) | 2004-02-18 | 2008-12-04 | Harman Becker Automotive Systems Gmbh | Alpha mix based on a lookup table |
KR20040027692A (en) | 2004-02-21 | 2004-04-01 | 양기해 | Pump that unuse axis of rotation |
US7868890B2 (en) | 2004-02-24 | 2011-01-11 | Qualcomm Incorporated | Display processor for a wireless device |
FI20045078A (en) | 2004-03-16 | 2005-09-17 | Myorigo Oy | Mobile device with wide-angle optics and radiation sensor |
KR20050093052A (en) | 2004-03-18 | 2005-09-23 | 김치경 | Phone for unmanned guard and method of preventing crimes using the same |
US7450754B2 (en) | 2004-03-23 | 2008-11-11 | Microsoft Corporation | Radiometric calibration from a single image |
KR101006660B1 (en) | 2004-03-26 | 2011-01-10 | 엘지전자 주식회사 | Infrared ray photography apparatus and method using camera in mobile terminal |
US7296747B2 (en) | 2004-04-20 | 2007-11-20 | Michael Rohs | Visual code system for camera-equipped mobile devices and applications thereof |
US7294817B2 (en) | 2004-05-06 | 2007-11-13 | Siemens Power Generation, Inc. | System and methods for determining nonuniformity correction parameters in detector-array imaging |
KR20050107206A (en) | 2004-05-08 | 2005-11-11 | 삼성전자주식회사 | Monitor |
JP2005341132A (en) | 2004-05-26 | 2005-12-08 | Toshiba Corp | Video data processor and processing method |
US7370932B2 (en) | 2004-05-27 | 2008-05-13 | Silverbrook Research Pty Ltd | Cartridge having integrated circuit for enabling validation thereof by a mobile device |
US8073895B2 (en) | 2004-06-01 | 2011-12-06 | Globaltel Media, Inc. | System and method for delivering web content to a mobile device |
US7333832B2 (en) | 2004-06-03 | 2008-02-19 | Inventec Appliances Corporation | Cellular phone capable of measuring temperature by IR means |
JP2007263563A (en) | 2004-06-03 | 2007-10-11 | Matsushita Electric Ind Co Ltd | Camera module |
US8153975B2 (en) | 2004-12-01 | 2012-04-10 | White Box, Inc. | Interfacing devices and systems |
CN2706974Y (en) | 2004-06-18 | 2005-06-29 | 鸿富锦精密工业(深圳)有限公司 | Cellphone with night visual function |
WO2006000930A1 (en) | 2004-06-21 | 2006-01-05 | Koninklijke Philips Electronics N.V. | Device and method of downscaling and blending two high resolution images |
WO2006001506A1 (en) | 2004-06-25 | 2006-01-05 | Ssd Company Limited | Image mixing apparatus and pixel mixer |
CN2708281Y (en) | 2004-06-25 | 2005-07-06 | 黄立 | Thermal imaging system |
JP4214961B2 (en) | 2004-06-28 | 2009-01-28 | セイコーエプソン株式会社 | Superdirective sound system and projector |
CN100445955C (en) | 2004-06-30 | 2008-12-24 | 沃达丰集团股份有限公司 | Linkage operation method and mobile communication terminal |
JP4349232B2 (en) | 2004-07-30 | 2009-10-21 | ソニー株式会社 | Semiconductor module and MOS solid-state imaging device |
KR100601963B1 (en) | 2004-08-23 | 2006-07-14 | 삼성전자주식회사 | Authentication apparatus and method using eye gaze |
US7208733B2 (en) | 2004-08-24 | 2007-04-24 | International Electronic Machines Corp. | Non-visible radiation imaging and inspection |
SE0402082L (en) | 2004-08-25 | 2006-04-18 | Sandvik Intellectual Property | Metal product, method of manufacturing a metal product and its use |
KR20060019715A (en) | 2004-08-30 | 2006-03-06 | 주식회사 팬택 | Mobile phone with infrared camera and method for alerting danger using the same |
US7463753B2 (en) | 2004-09-15 | 2008-12-09 | Raytheon Company | FLIR-to-missile boresight correlation and non-uniformity compensation of the missile seeker |
JP4003780B2 (en) | 2004-09-17 | 2007-11-07 | カシオ計算機株式会社 | Semiconductor device and manufacturing method thereof |
JP2006098098A (en) | 2004-09-28 | 2006-04-13 | Sanyo Electric Co Ltd | Cellular phone having moisture sensing function |
JP2006105655A (en) | 2004-10-01 | 2006-04-20 | Nippon Telegr & Teleph Corp <Ntt> | Total calorie checker for food items, and checking method |
US20060078215A1 (en) | 2004-10-12 | 2006-04-13 | Eastman Kodak Company | Image processing based on direction of gravity |
JP4777029B2 (en) | 2004-10-13 | 2011-09-21 | キヤノン株式会社 | Information processing apparatus and control method thereof |
KR100682898B1 (en) | 2004-11-09 | 2007-02-15 | 삼성전자주식회사 | Imaging apparatus using infrared ray and image discrimination method thereof |
NO323926B1 (en) | 2004-11-12 | 2007-07-23 | New Index As | Visual system and control object and apparatus for use in the system. |
KR100645746B1 (en) | 2004-11-16 | 2006-11-15 | 주식회사 팬택 | Wireless telecommunication terminal and method for receiving IrDA signal using camera module |
US8531562B2 (en) | 2004-12-03 | 2013-09-10 | Fluke Corporation | Visible light and IR combined image camera with a laser pointer |
KR20060063265A (en) | 2004-12-07 | 2006-06-12 | 삼성전자주식회사 | Method and apparatus for processing image |
KR100689465B1 (en) | 2004-12-21 | 2007-03-08 | 삼성전자주식회사 | Camera phone having infrared sensor and method thereof |
JP4534756B2 (en) | 2004-12-22 | 2010-09-01 | ソニー株式会社 | Image processing apparatus, image processing method, imaging apparatus, program, and recording medium |
US7471844B2 (en) | 2004-12-27 | 2008-12-30 | Intel Corporation | Method, apparatus and system for multi-feature programmable tap filter image processing |
KR20060077519A (en) | 2004-12-30 | 2006-07-05 | 삼성전자주식회사 | Camera lens apparatus for mobile phone |
KR100707174B1 (en) | 2004-12-31 | 2007-04-13 | 삼성전자주식회사 | High band Speech coding and decoding apparatus in the wide-band speech coding/decoding system, and method thereof |
KR100729280B1 (en) | 2005-01-08 | 2007-06-15 | 아이리텍 잉크 | Iris Identification System and Method using Mobile Device with Stereo Camera |
USD524785S1 (en) | 2005-02-07 | 2006-07-11 | Li Huang | Mobile-phone thermal imager |
KR100612890B1 (en) | 2005-02-17 | 2006-08-14 | 삼성전자주식회사 | Multi-effect expression method and apparatus in 3-dimension graphic image |
US20090297062A1 (en) | 2005-03-04 | 2009-12-03 | Molne Anders L | Mobile device with wide-angle optics and a radiation sensor |
KR101155224B1 (en) | 2005-03-09 | 2012-06-13 | 삼성전자주식회사 | Method and system for poc compatible terminal split-off by media attribute in poc multimedia session |
JP4126663B2 (en) | 2005-03-16 | 2008-07-30 | ソニー株式会社 | 3D image visual effect confirmation device and 3D image visual effect confirmation method |
WO2006112866A2 (en) | 2005-04-13 | 2006-10-26 | Scanbuy, Inc. | Visual code system for camera-equipped mobile devices and applications thereof |
TW200638739A (en) | 2005-04-26 | 2006-11-01 | Novatek Microelectronics Corp | Mobile phone with a monitoring function, system thereof and monitoring method |
US7284921B2 (en) | 2005-05-09 | 2007-10-23 | Silverbrook Research Pty Ltd | Mobile device with first and second optical pathways |
US7595904B2 (en) | 2005-05-09 | 2009-09-29 | Silverbrook Research Pty Ltd | Method of using a mobile device to determine a first rotational orientation of coded data on a print medium |
TWI279924B (en) | 2005-05-17 | 2007-04-21 | Unimems Mfg Co Ltd | Dual band reflective thermal imaging system |
KR100673427B1 (en) | 2005-05-18 | 2007-01-24 | 학교법인연세대학교 | Portable communication equipment having iris recognition function
KR100761027B1 (en) | 2005-05-19 | 2007-10-01 | 강원 | Mobile-phone with function of wireless mouse and home-network device and Network control system thereof |
KR101121716B1 (en) | 2005-05-19 | 2012-03-09 | 서울반도체 주식회사 | Mobile terminal having an auto-focusing camera |
KR20060121595A (en) | 2005-05-24 | 2006-11-29 | 브이케이 주식회사 | A mobile phone having a position information display function of an unmanned monitoring camera and a position information display method of the unmanned monitoring camera using the same
US7420663B2 (en) | 2005-05-24 | 2008-09-02 | Bwt Property Inc. | Spectroscopic sensor on mobile phone |
KR100672377B1 (en) | 2005-05-31 | 2007-01-24 | 엘지전자 주식회사 | Method and Apparatus for digital image processing in Digital receiver |
KR100717925B1 (en) | 2005-06-01 | 2007-05-11 | 주식회사 엘지화학 | Method for preparing adhesive acrylic ester polymer syrup |
KR100889716B1 (en) | 2005-06-06 | 2009-03-23 | 아사히 가라스 가부시키가이샤 | Glass production device and component thereof, and method for conduction-heating such component |
EP1732308A2 (en) | 2005-06-08 | 2006-12-13 | Canon Kabushiki Kaisha | Imaging device, control method, and program |
GB0511883D0 (en) | 2005-06-10 | 2005-07-20 | Boc Group Plc | Manufacture of ferroalloys |
KR100811165B1 (en) | 2005-06-21 | 2008-03-07 | 삼성전자주식회사 | Printing Position Error Reduction Method and Printer |
KR100581241B1 (en) | 2005-06-22 | 2006-05-22 | 강신홍 | Image sensing module for mobile phone |
KR100722974B1 (en) | 2005-06-27 | 2007-05-30 | 엠텍비젼 주식회사 | Method and Device for measuring environmental information by employing camera module |
US20070004449A1 (en) | 2005-06-29 | 2007-01-04 | Sham John C | Mobile communication device with environmental sensors |
KR100745894B1 (en) | 2005-06-30 | 2007-08-02 | 주식회사 하이닉스반도체 | Method for forming recess gate of semiconductor device |
KR100683220B1 (en) | 2005-07-05 | 2007-02-15 | 현대모비스 주식회사 | Fluid passageway structure of defrost duct for vehicles |
KR100646966B1 (en) | 2005-07-06 | 2006-11-23 | 주식회사 팬택앤큐리텔 | Apparatus for controlling camera auto focusing in the mobile communication terminal |
WO2007006242A1 (en) | 2005-07-07 | 2007-01-18 | Jaromir Pech | Method for verifying the authenticity of objects |
KR100766554B1 (en) | 2005-07-14 | 2007-10-17 | 크루셜텍 (주) | Ultra Thin Optical Pointing Device and Personal Portable Device Having the Same |
KR100677913B1 (en) | 2005-07-18 | 2007-02-05 | 삼성탈레스 주식회사 | Apparatus for tracing a moving object by mixing the video signals from multi sensors |
US20070019103A1 (en) | 2005-07-25 | 2007-01-25 | Vkb Inc. | Optical apparatus for virtual interface projection and sensing |
US20070019099A1 (en) | 2005-07-25 | 2007-01-25 | Vkb Inc. | Optical apparatus for virtual interface projection and sensing |
KR100777428B1 (en) | 2005-07-26 | 2007-11-20 | 한국원자력연구원 | Image processing device and method |
JP3934658B2 (en) | 2005-07-28 | 2007-06-20 | 株式会社コナミデジタルエンタテインメント | Position detection system |
EP1748389A1 (en) | 2005-07-28 | 2007-01-31 | Microsoft Corporation | Image blending |
KR100586043B1 (en) | 2005-08-03 | 2006-06-08 | 주식회사 네오엠텔 | Contents display method in wap browser of mobile terminal using plug-in, and apparatus thereof |
US8126243B2 (en) | 2005-08-23 | 2012-02-28 | Nihon Medi-Physics Co., Ltd. | Image processing method, image processing program, and image processing device |
WO2007029454A1 (en) | 2005-09-07 | 2007-03-15 | Pioneer Corporation | Scene analysis device and method |
KR100746405B1 (en) | 2005-09-07 | 2007-08-03 | 쿠쿠전자주식회사 | Cooking menu information processing system of an electrical rice pot
JP4622763B2 (en) | 2005-09-14 | 2011-02-02 | 日本電気株式会社 | Mobile communication terminal device and authentication method |
US7407092B2 (en) | 2005-09-19 | 2008-08-05 | Silverbrook Research Pty Ltd | Printing gaming information using a mobile device |
US7403796B2 (en) | 2005-09-19 | 2008-07-22 | Silverbrook Research Pty Ltd | Printing dating information using a mobile device |
US9166823B2 (en) | 2005-09-21 | 2015-10-20 | U Owe Me, Inc. | Generation of a context-enriched message including a message component and a contextual attribute |
KR100633792B1 (en) | 2005-09-21 | 2006-10-16 | 주식회사 팬택 | Camera control apparatus and method of mobile communication device |
JP4630781B2 (en) | 2005-09-26 | 2011-02-09 | キヤノン株式会社 | Wavelength component ratio detection apparatus and imaging apparatus using the same |
JP4293174B2 (en) | 2005-09-28 | 2009-07-08 | ソニー株式会社 | Imaging apparatus and image processing apparatus |
US20090278857A1 (en) | 2005-10-12 | 2009-11-12 | Active Optics Pty Limited | Method of forming an image based on a plurality of image frames, image processing system and digital camera |
JP5156152B2 (en) | 2005-10-17 | 2013-03-06 | アイ2アイシー コーポレイション | Combined video display and camera system |
KR100714672B1 (en) | 2005-11-09 | 2007-05-07 | 삼성전자주식회사 | Method for depth based rendering by using splats and system of enabling the method |
KR100791375B1 (en) | 2005-12-19 | 2008-01-07 | 삼성전자주식회사 | Apparatus and method for color correction |
KR20070068501A (en) | 2005-12-27 | 2007-07-02 | 박현 | Automatic denoising of 2d color face images using recursive pca reconstruction |
KR100663528B1 (en) | 2006-01-03 | 2007-01-02 | 삼성전자주식회사 | Method for displaying the pop-up in wireless terminal |
KR100724134B1 (en) | 2006-01-09 | 2007-06-04 | 삼성전자주식회사 | Method and apparatus for providing panoramic view with high speed image matching and mild mixed color blending |
CN2874947Y (en) | 2006-01-18 | 2007-02-28 | 秦川 | Cell phone with body temperature testing function |
KR100729813B1 (en) | 2006-01-20 | 2007-06-18 | (주)자이리스 | Photographing apparatus for iris authentication, photographing module for iris authentication and terminal having the same
US7462831B2 (en) | 2006-01-26 | 2008-12-09 | L-3 Communications Corporation | Systems and methods for bonding |
KR100752556B1 (en) | 2006-01-27 | 2007-08-29 | 삼성토탈 주식회사 | High rigidity ? anti-bacterial polypropylene resin composition for Drum Washing Machine tub |
US7872574B2 (en) | 2006-02-01 | 2011-01-18 | Innovation Specialists, Llc | Sensory enhancement systems and methods in personal electronic devices |
US20070183343A1 (en) | 2006-02-03 | 2007-08-09 | Liliana Grajales | Method and system for facilitating command of a group |
KR100660125B1 (en) | 2006-02-10 | 2006-12-20 | 주식회사 엠씨넥스 | Pointing device using camera |
KR20070081773A (en) | 2006-02-13 | 2007-08-17 | 스마트 와이어레스 가부시키가이샤 | Infrared face authenticating apparatus, and portable terminal and security apparatus including the same |
US20100066809A1 (en) | 2006-02-15 | 2010-03-18 | Cdm Optics, Inc. | Deployable Image Sensor |
JP4562668B2 (en) | 2006-02-20 | 2010-10-13 | ソニー・エリクソン・モバイルコミュニケーションズ株式会社 | Imaging device and biometric authentication device |
KR20070082960A (en) | 2006-02-20 | 2007-08-23 | 삼성전자주식회사 | Cleaning agents and method for manufacturing a display panel using the same |
JP5214151B2 (en) | 2006-02-23 | 2013-06-19 | Jx日鉱日石エネルギー株式会社 | Refrigerating machine oil for hydrocarbon refrigerant and refrigerating machine system using the same |
KR101402592B1 (en) | 2006-03-06 | 2014-06-17 | 에스케이바이오팜 주식회사 | Transdermal Composition Using Piroxicam-Inorganic Complex and Patch System Comprising the Same |
US20070211965A1 (en) | 2006-03-07 | 2007-09-13 | Helbing Rene P | Hand-held diagnostic systems and methods of use thereof |
JP2007258873A (en) | 2006-03-22 | 2007-10-04 | Toshiba Corp | Reproducer and reproducing method |
CN2899321Y (en) | 2006-04-05 | 2007-05-09 | 乐金电子(中国)研究开发中心有限公司 | Infrared alarming cell phone |
KR100743171B1 (en) | 2006-04-18 | 2007-07-27 | 대성전기공업 주식회사 | Display device |
KR100784107B1 (en) | 2006-04-24 | 2007-12-10 | 주식회사 하이닉스반도체 | Method for driving flash memory device |
DE102006020374A1 (en) | 2006-04-28 | 2007-10-31 | Uhdenora S.P.A. | Insulating frame for an electrolysis cell for producing chlorine, hydrogen and/or caustic soda comprises an edge region directly connected to an inner front surface and structured so that an electrolyte can pass through it |
WO2007129318A2 (en) | 2006-05-06 | 2007-11-15 | Irody Inc | Apparatus and method for obtaining an identification of drugs for enhanced safety |
DE102006025291B3 (en) | 2006-05-31 | 2007-12-13 | Infineon Technologies Ag | Integrated electrical module with regular and redundant elements |
US8018472B2 (en) | 2006-06-08 | 2011-09-13 | Qualcomm Incorporated | Blending multiple display layers |
JP2007325842A (en) | 2006-06-09 | 2007-12-20 | Nec Corp | Personal digital assistant with health care function |
US7697962B2 (en) | 2006-06-09 | 2010-04-13 | International Business Machines Corporation | Cellphone usage and mode detection and automatic speakerphone toggle |
KR100808610B1 (en) | 2006-06-13 | 2008-02-28 | 중앙대학교 산학협력단 | Digital Multi-focusing using Image Fusion |
KR100766953B1 (en) | 2006-06-14 | 2007-10-18 | 서울전자통신(주) | Camera module |
KR20070005553A (en) | 2006-06-23 | 2007-01-10 | 나이루스 가부시키가이샤 | Imaging system |
KR100846497B1 (en) | 2006-06-26 | 2008-07-17 | 삼성전자주식회사 | Input device with display button and portable electronic device having the same |
JP2008000534A (en) | 2006-06-26 | 2008-01-10 | Sophia Co Ltd | Game machine |
KR100846192B1 (en) | 2006-07-18 | 2008-07-14 | 주식회사 비젼하이텍 | The image saturation protection circuit of infrared camera |
US8189050B1 (en) | 2006-07-19 | 2012-05-29 | Flir Systems, Inc. | Filtering systems and methods for infrared image processing |
US7805020B2 (en) | 2006-07-25 | 2010-09-28 | Itt Manufacturing Enterprises, Inc. | Motion compensated image registration for overlaid/fused video |
KR20080013314A (en) | 2006-08-08 | 2008-02-13 | 김순동 | Card check system and method |
US7947222B2 (en) | 2006-08-15 | 2011-05-24 | Infopia Co., Ltd. | Mobile communication terminal equipped with temperature compensation function for use in bio-information measurement |
KR101224819B1 (en) | 2006-08-17 | 2013-01-21 | 엘지이노텍 주식회사 | Camera Module
KR100771364B1 (en) | 2006-08-22 | 2007-10-30 | 엘지이노텍 주식회사 | Camera module |
KR20080018407A (en) | 2006-08-24 | 2008-02-28 | 한국문화콘텐츠진흥원 | Computer-readable recording medium for recording of 3d character deformation program |
US7684634B2 (en) * | 2006-08-29 | 2010-03-23 | Raytheon Company | System and method for adaptive non-uniformity compensation for a focal plane array |
KR100796849B1 (en) | 2006-09-04 | 2008-01-22 | 삼성전자주식회사 | Method for photographing panorama mosaics picture in mobile device |
KR100743254B1 (en) | 2006-09-20 | 2007-07-27 | 김인균 | IR lighting apparatus
KR100883653B1 (en) | 2006-10-02 | 2009-02-18 | 삼성전자주식회사 | Terminal having display button and method of displaying using the display button |
US7859720B2 (en) | 2006-11-13 | 2010-12-28 | Canon Kabushiki Kaisha | Image forming apparatus and method thereof |
KR100822053B1 (en) | 2006-11-17 | 2008-04-15 | 주식회사 엠씨넥스 | Apparatus and method for taking a picture |
KR100836400B1 (en) | 2006-11-20 | 2008-06-09 | 현대자동차주식회사 | Viscous damper of crank shaft for vehicle |
US7575077B2 (en) | 2006-11-22 | 2009-08-18 | Cnh America Llc | Lever connect PTO module for three-point hitch quick coupler |
US8153980B1 (en) | 2006-11-30 | 2012-04-10 | L-3 Communications Corp. | Color correction for radiation detectors |
KR20080050279A (en) | 2006-12-02 | 2008-06-05 | 한국전자통신연구원 | A reduction apparatus and method of popping artifacts for multi-level level-of-detail terrains |
DE102006057431A1 (en) | 2006-12-06 | 2008-06-12 | Robert Bosch Gmbh | Mobile telephone for collection and transmission of medical relevant data, has integrated radiation temperature sensor with evaluation electronics for preparing temperature data for transmission over mobile telephone |
KR20080053057A (en) | 2006-12-08 | 2008-06-12 | 주식회사 메디슨 | Ultrasound imaging system and method for forming and displaying fusion image of ultrasound image and external medical image |
KR20080054596A (en) | 2006-12-13 | 2008-06-18 | 삼성전자주식회사 | Apparatus for inspecting flat panel display and method thereof |
US20080151056A1 (en) | 2006-12-20 | 2008-06-26 | Chukwu Ahamefula | Automatic camera with inbuilt smoke and motion detectors that can send picture/pictures with audio or text |
KR20080059882A (en) | 2006-12-26 | 2008-07-01 | (주)네오와인 | Chipset circuit for the output of multiplex image signal in a screen synchronously and controlling method thereof |
US7725141B2 (en) | 2007-01-08 | 2010-05-25 | Wang Su | Multi-functional detached mobile phone |
KR101282973B1 (en) | 2007-01-09 | 2013-07-08 | 삼성전자주식회사 | Apparatus and method for displaying overlaid image |
GB0700917D0 (en) | 2007-01-17 | 2007-02-28 | Queen Mary & Westfield College | Structures with improved properties |
KR101349171B1 (en) | 2007-01-17 | 2014-01-09 | 삼성전자주식회사 | 3-dimensional graphics accelerator and method of distributing pixel thereof |
KR20080069007A (en) | 2007-01-22 | 2008-07-25 | 엘지이노텍 주식회사 | Mobile phone capable of infrared photographing |
JP4317879B2 (en) | 2007-01-29 | 2009-08-19 | 三菱重工業株式会社 | Positioning method of moving body |
SE531942C2 (en) | 2007-02-01 | 2009-09-15 | Flir Systems Ab | Method for image processing of infrared images including contrast enhancing filtering |
CN101241028A (en) * | 2007-02-07 | 2008-08-13 | 南京理工大学 | Infrared focal plane array image-forming demonstration system |
KR20080078315A (en) | 2007-02-23 | 2008-08-27 | (주)케이나인 | Camera module for selecting filters |
US8212877B2 (en) | 2007-03-02 | 2012-07-03 | Fujifilm Corporation | Image capturing system, image capturing method, and computer program product at which an image is captured at a predetermined time |
US8149109B2 (en) | 2007-04-23 | 2012-04-03 | Siemens Industry, Inc. | Mobile emergency device for emergency personnel |
US8164440B2 (en) | 2007-04-23 | 2012-04-24 | Siemens Industry, Inc. | Methods for emergency communication within a fire safety system |
KR100888554B1 (en) | 2007-04-26 | 2009-03-11 | 벽산건설 주식회사 | Recognition system |
KR100841243B1 (en) | 2007-04-27 | 2008-06-25 | 주식회사 쏠리테크 | Picture replying system based on 3g and method thereof |
KR100802525B1 (en) | 2007-04-27 | 2008-02-13 | 주식회사 세다스미디어 | Real time multi band camera |
KR101361359B1 (en) | 2007-04-30 | 2014-02-12 | 엘지이노텍 주식회사 | Camera Module |
KR100897170B1 (en) | 2007-05-10 | 2009-05-14 | 주식회사 대우일렉트로닉스 | Alpha blending system and method thereof
JP4341695B2 (en) | 2007-05-17 | 2009-10-07 | ソニー株式会社 | Image input processing device, imaging signal processing circuit, and imaging signal noise reduction method |
KR100870724B1 (en) | 2007-05-25 | 2008-11-27 | 인하대학교 산학협력단 | System for detecting image using t-test and method therefor |
KR20090003899A (en) | 2007-07-05 | 2009-01-12 | 엘지이노텍 주식회사 | Apparatus for processing image signal |
US8724895B2 (en) | 2007-07-23 | 2014-05-13 | Nvidia Corporation | Techniques for reducing color artifacts in digital images |
KR100854932B1 (en) | 2007-08-06 | 2008-08-29 | (주)씨앤에스 테크놀로지 | Image composition device with image conversion function |
KR100977516B1 (en) | 2007-08-17 | 2010-08-23 | 한국 고덴시 주식회사 | camera module visible in day and night |
DE102007039788A1 (en) | 2007-08-23 | 2009-02-26 | Testo Ag | detector |
KR101361857B1 (en) | 2007-08-24 | 2014-02-21 | 삼성전자주식회사 | Method and apparatus photographing using infrared in portable terminal |
KR100866475B1 (en) | 2007-09-05 | 2008-11-03 | 크라제비나(주) | Camera module and portable terminal having the same |
KR100866476B1 (en) | 2007-09-05 | 2008-11-03 | 크라제비나(주) | Camera module and portable terminal having the same |
KR20090036734A (en) | 2007-10-10 | 2009-04-15 | 삼성전자주식회사 | Image and telephone call communication terminal and camera tracking method thereof
KR100858034B1 (en) | 2007-10-18 | 2008-09-10 | (주)실리콘화일 | One chip image sensor for measuring vitality of subject |
FR2923339B1 (en) | 2007-11-05 | 2009-12-11 | Commissariat Energie Atomique | METHOD FOR READING A TWO-DIMENSIONAL PIXEL MATRIX AND DEVICE FOR IMPLEMENTING SUCH A METHOD |
KR101437849B1 (en) | 2007-11-21 | 2014-09-04 | 삼성전자주식회사 | Portable terminal and method for performing shooting mode thereof |
US20110279673A1 (en) | 2007-11-28 | 2011-11-17 | Flir Systems, Inc. | Maritime controls systems and methods |
KR100958030B1 (en) | 2007-11-28 | 2010-05-17 | 중앙대학교 산학협력단 | Emotion recognition method and system based on decision fusion
KR100903348B1 (en) | 2007-11-28 | 2009-06-23 | 중앙대학교 산학협력단 | Emotion recognition method and system based on feature fusion
WO2009070459A1 (en) | 2007-11-30 | 2009-06-04 | Jingyun Zhang | Miniature spectrometers working with cellular phones and other portable electronic devices |
US8086266B2 (en) | 2008-01-08 | 2011-12-27 | Block Engineering, Llc | Cell phone based MEMS fourier transform infrared (FTIR) gas sensors |
US7734171B2 (en) * | 2008-01-29 | 2010-06-08 | Autoliv Asp, Inc. | Snap-in retainer for a sensor system |
KR100935495B1 (en) | 2008-02-14 | 2010-01-06 | 중앙대학교 산학협력단 | Apparatus and method for digital auto-focusing based on iterative restoration and fusion of image |
CN100565140C (en) * | 2008-02-19 | 2009-12-02 | 东南大学 | The element circuit of infrared focal plane read-out circuit |
KR20090089931A (en) | 2008-02-20 | 2009-08-25 | 유넷웨어(주) | Real time complex image display apparatus |
KR100932752B1 (en) | 2008-02-20 | 2009-12-21 | 유넷웨어(주) | Real-time composite video output device with hot spot tracking |
KR100866573B1 (en) | 2008-02-22 | 2008-11-03 | 인하대학교 산학협력단 | A point-based rendering method using visibility map |
KR100922497B1 (en) | 2008-02-27 | 2009-10-20 | 주식회사 비츠로시스 | System of Traffic Regulation Using Image Fusion |
CN201203922Y (en) | 2008-03-25 | 2009-03-04 | 中兴通讯股份有限公司 | Mobile phone used as alarm |
WO2009122114A1 (en) | 2008-03-31 | 2009-10-08 | H-Icheck Limited | Apparatus for monitoring a human or animal subject's bodily function |
KR100962607B1 (en) | 2008-04-02 | 2010-06-11 | 주식회사 엠비젼 | Inspecting device for security printed matter using mobile telephone |
KR101475464B1 (en) | 2008-05-09 | 2014-12-22 | 삼성전자 주식회사 | Multi-layer image sensor |
KR100871916B1 (en) | 2008-05-13 | 2008-12-05 | 아람휴비스(주) | A portable clinical thermometer capable of providing visual images |
US8629398B2 (en) | 2008-05-30 | 2014-01-14 | The Regents Of The University Of Minnesota | Detection beyond the standard radiation noise limit using spectrally selective absorption |
US7995125B2 (en) * | 2008-06-10 | 2011-08-09 | Sensors Unlimited, Inc. | Apparatus and method for extending the dynamic range of a read out integrated circuit of an image sensor |
KR100968137B1 (en) | 2008-07-09 | 2010-07-06 | 안태웅 | Security system and method |
KR101004930B1 (en) | 2008-07-10 | 2010-12-28 | 성균관대학교산학협력단 | Full browsing method using gaze detection and handheld terminal performing the method |
US7723686B2 (en) | 2008-08-14 | 2010-05-25 | Hanvision Co., Ltd. | Image sensor for detecting wide spectrum and method of manufacturing the same |
KR101499960B1 (en) | 2008-08-19 | 2015-03-06 | 엘지이노텍 주식회사 | Camera module package and method of valuable the same |
US8049163B1 (en) | 2008-09-02 | 2011-11-01 | Flir Systems, Inc. | Calibration systems and methods for infrared cameras |
KR101441589B1 (en) | 2008-10-07 | 2014-09-29 | 삼성전자 주식회사 | Apparatus for combining visible images and far-infrared images optically |
US9323410B2 (en) | 2008-10-13 | 2016-04-26 | Sony Ericsson Mobile Communications Ab | User input displays for mobile devices |
US8446389B2 (en) | 2008-10-15 | 2013-05-21 | Lenovo (Singapore) Pte. Ltd | Techniques for creating a virtual touchscreen |
US8525776B2 (en) | 2008-10-27 | 2013-09-03 | Lenovo (Singapore) Pte. Ltd | Techniques for controlling operation of a device with a virtual touchscreen |
US20100113068A1 (en) | 2008-11-06 | 2010-05-06 | Lmr Inventions, Llc | Hosted imagery capture in an ad hoc for mobile computing |
KR100901784B1 (en) | 2008-11-11 | 2009-06-11 | 주식회사 창성에이스산업 | System for fire warning and the method thereof |
KR100990904B1 (en) | 2008-11-12 | 2010-11-01 | 한국과학기술원 | The apparatus for enhancing image by generating and blending multiple image and method therefor |
KR101056168B1 (en) | 2008-11-25 | 2011-08-11 | 크라제비전(주) | Camera module and portable terminal having same |
US20100131268A1 (en) | 2008-11-26 | 2010-05-27 | Alcatel-Lucent Usa Inc. | Voice-estimation interface and communication system |
CN101751714A (en) | 2008-12-05 | 2010-06-23 | 深圳富泰宏精密工业有限公司 | Multifunctional portable electronic device |
KR20100070116A (en) | 2008-12-17 | 2010-06-25 | 크루셜텍 (주) | Mobile communication device having optical joystick module |
KR20100070119A (en) | 2008-12-17 | 2010-06-25 | 크루셜텍 (주) | Mobile communication device having optical joystick module |
KR101241865B1 (en) | 2008-12-22 | 2013-03-12 | 한국전자통신연구원 | In-vehicle terminal for controlling a forward looking infrared camera and method thereof
US9442019B2 (en) | 2008-12-26 | 2016-09-13 | Fluke Corporation | Infrared imaging probe |
WO2010089830A1 (en) | 2009-02-03 | 2010-08-12 | パナソニック株式会社 | Image pick-up device |
KR20100089125A (en) | 2009-02-03 | 2010-08-12 | 라인식 | Image sensor for unmanned surveillance sensor |
KR20100090521A (en) | 2009-02-06 | 2010-08-16 | 크라제비전(주) | Camera module having flash wide-angle extension structure and portable terminal unit having the same |
JP2010181324A (en) | 2009-02-06 | 2010-08-19 | Sanyo Electric Co Ltd | Temperature measuring/display device and mobile information communication terminal |
KR20100091758A (en) | 2009-02-11 | 삼성전자주식회사 | Mobile communication terminal with surveillance function and method for executing the surveillance function
KR20110131247A (en) | 2009-02-27 | 2011-12-06 | 파운데이션 프로덕션, 엘엘씨 | Headset-based telecommunications platform |
WO2012170946A2 (en) | 2011-06-10 | 2012-12-13 | Flir Systems, Inc. | Low power and small form factor infrared imaging |
US8208026B2 (en) | 2009-03-02 | 2012-06-26 | Flir Systems, Inc. | Systems and methods for processing infrared images |
US9171361B2 (en) | 2010-04-23 | 2015-10-27 | Flir Systems Ab | Infrared resolution and contrast enhancement with fusion |
US9208542B2 (en) * | 2009-03-02 | 2015-12-08 | Flir Systems, Inc. | Pixel-wise noise reduction in thermal images |
KR20100098958A (en) | 2009-03-02 | 2010-09-10 | 크라제비전(주) | Portable terminal with mouse-pen |
US8520970B2 (en) | 2010-04-23 | 2013-08-27 | Flir Systems Ab | Infrared resolution and contrast enhancement with fusion |
US20100245582A1 (en) | 2009-03-25 | 2010-09-30 | Syclipse Technologies, Inc. | System and method of remote surveillance and applications therefor |
US20120276954A1 (en) | 2009-04-29 | 2012-11-01 | Jerry Kowalsky | Portable camera and surveillance device |
KR101016420B1 (en) | 2009-05-14 | 2011-02-21 | 동국대학교 산학협력단 | Non contacting finger vein image capturing device and portable mobile having the same |
CN101635754A (en) | 2009-05-22 | 2010-01-27 | 吴江市江南工贸有限公司 | Key of mobile phone for blind |
US8766808B2 (en) | 2010-03-09 | 2014-07-01 | Flir Systems, Inc. | Imager with multiple sensor arrays |
KR101048768B1 (en) | 2009-06-10 | 2011-07-15 | (주)실리콘화일 | Image sensor for measuring illuminance, proximity and color temperature |
CN201481406U (en) | 2009-06-19 | 2010-05-26 | 中卫莱康科技发展(北京)有限公司 | Mobile terminal |
US8428385B2 (en) * | 2009-06-24 | 2013-04-23 | Flir Systems, Inc. | Non-uniformity error correction with a bilateral filter |
KR101599435B1 (en) | 2009-07-14 | 2016-03-03 | 엘지이노텍 주식회사 | Camera module and manufacturing method thereof |
KR101038879B1 (en) | 2009-07-28 | 2011-06-03 | 삼성전기주식회사 | Camera module |
JP2011065133A (en) | 2009-08-20 | 2011-03-31 | Toppan Printing Co Ltd | Liquid crystal display device, black matrix substrate and color filter substrate |
KR20110019994A (en) | 2009-08-21 | 2011-03-02 | 삼성전기주식회사 | Camera module |
KR101038830B1 (en) | 2009-09-01 | 2011-06-03 | 삼성전기주식회사 | Camera module for day and night |
EP2478464B1 (en) | 2009-09-14 | 2019-05-08 | VIION Systems Inc. | Saccadic dual-resolution video analytics camera |
CN201550169U (en) | 2009-09-22 | 2010-08-11 | 陈章华 | Infrared night-vision zoom camera mobile phone |
CN102045423A (en) | 2009-10-16 | 2011-05-04 | 英华达(上海)电子有限公司 | Mobile terminal and method for detecting temperatures by using same |
TW201116030A (en) | 2009-10-23 | 2011-05-01 | Inventec Appliances Corp | A thermal imaging mobile phone |
KR101111167B1 (en) | 2009-10-23 | 2012-02-24 | (주)이지템 | Apparatus and method for photographing temperature picture in mobile communication device |
KR20110046941A (en) | 2009-10-29 | 2011-05-06 | 삼성전자주식회사 | A mobile terminal with image projector and a method of control thereof |
CN102055836B (en) | 2009-11-04 | 2013-12-11 | Tcl集团股份有限公司 | Mobile terminal with action recognition function and action recognition method thereof |
WO2011055772A1 (en) | 2009-11-05 | 2011-05-12 | 日本電気株式会社 | Image target identification device, image target identification method, and image target identification program |
US8837855B2 (en) | 2009-11-16 | 2014-09-16 | Verizon Patent And Licensing Inc. | Image compositing via multi-spectral detection |
KR20110056892A (en) | 2009-11-23 | 2011-05-31 | 삼성전자주식회사 | Multi touch detecting apparatus for lcd display unit and multi touch detecting method using the same |
US8153971B2 (en) | 2009-11-23 | 2012-04-10 | Flir Systems Ab | Camera with two visual imaging subsystems for determining parallax and for focusing an IR imaging subsystem |
US8848059B2 (en) | 2009-12-02 | 2014-09-30 | Apple Inc. | Systems and methods for receiving infrared data with a camera designed to detect images based on visible light |
CN201628839U (en) | 2010-01-19 | 2010-11-10 | 柯名会 | Infrared monitoring camera mobile phone |
CN103098110B (en) | 2010-03-17 | 2015-04-15 | 本田技研工业株式会社 | Vehicle surroundings monitoring device |
KR100985816B1 (en) | 2010-03-25 | 2010-10-08 | 주식회사 창성에이스산업 | System and method for detecting and automatically breaking fire and emergency in tunnel |
US8781420B2 (en) | 2010-04-13 | 2014-07-15 | Apple Inc. | Adjustable wireless circuitry with antenna-based proximity detector |
CN101825516A (en) * | 2010-05-04 | 2010-09-08 | 电子科技大学 | Device and method for testing infrared focal plane array device |
CN101859209A (en) | 2010-05-28 | 2010-10-13 | 程宇航 | Infrared detection device and method, infrared input device and figure user equipment |
US20120007987A1 (en) | 2010-07-06 | 2012-01-12 | American Technologies Network Corporation | Optical system with automatic switching between operation in daylight and thermovision modes |
WO2012027739A2 (en) | 2010-08-27 | 2012-03-01 | Milwaukee Electric Tool Corporation | Thermal detection systems, methods, and devices |
US9237401B2 (en) * | 2010-08-31 | 2016-01-12 | Apple Inc. | Electronic devices with adjustable bias impedances and adjustable bias voltages for accessories |
CN101945154A (en) | 2010-08-31 | 2011-01-12 | 上海酷吧信息技术有限公司 | Mobile phone with infrared safety device |
US20120083314A1 (en) | 2010-09-30 | 2012-04-05 | Ng Hock M | Multimedia Telecommunication Apparatus With Motion Tracking |
CN201897853U (en) | 2010-10-25 | 2011-07-13 | 山东神戎电子股份有限公司 | Monitor device with overtemperature alarm function |
US8305577B2 (en) | 2010-11-04 | 2012-11-06 | Nokia Corporation | Method and apparatus for spectrometry |
JP5625833B2 (en) | 2010-12-02 | 2014-11-19 | 株式会社島津製作所 | Radiation detector and radiography apparatus |
CN201869255U (en) | 2010-12-08 | 2011-06-15 | 河海大学 | Novel multifunctional mobile phone |
CN102564601A (en) | 2010-12-22 | 2012-07-11 | 精工爱普生株式会社 | Thermal detector, thermal detection device, electronic instrument, and thermal detector manufacturing method |
CN202261481U (en) | 2010-12-30 | 2012-05-30 | 广州宝胆医疗器械科技有限公司 | Mobile phone combining infrared thermal scanning and color Doppler ultrasonic scanning functions |
CN102045448A (en) | 2010-12-30 | 2011-05-04 | 广州宝胆医疗器械科技有限公司 | Mobile phone with infrared thermal scanning function |
US8760509B2 (en) | 2010-12-31 | 2014-06-24 | Fluke Corporation | Thermal imager with non-uniformity correction |
US20120184252A1 (en) | 2011-01-17 | 2012-07-19 | Alexander Samson Hirsch | Thermographic augmented reality display in an electronic device |
EP2477391A1 (en) | 2011-01-17 | 2012-07-18 | Research In Motion Limited | Thermographic augmented reality display in an electronic device |
US9167179B2 (en) | 2011-02-21 | 2015-10-20 | Vectronix, Inc. | On-board non-uniformity correction calibration methods for microbolometer focal plane arrays |
US9245332B2 (en) | 2011-03-09 | 2016-01-26 | Alcatel Lucent | Method and apparatus for image production |
CN102178510A (en) | 2011-03-25 | 2011-09-14 | 清华大学 | Mobile phone infrared imaging system |
JP5680475B2 (en) | 2011-04-26 | 2015-03-04 | 日本アビオニクス株式会社 | Portable wireless terminal |
TWI443362B (en) | 2011-04-29 | 2014-07-01 | Nat Applied Res Laboratoires | Non-visible particle detection device |
US9018576B2 (en) | 2011-05-10 | 2015-04-28 | Stmicroelectronics Asia Pacific Pte Ltd | Low drop-out regulator with distributed output network |
US9069083B2 (en) | 2011-05-19 | 2015-06-30 | Danimar Ltd. | Portable radiation detector |
US9143703B2 (en) | 2011-06-10 | 2015-09-22 | Flir Systems, Inc. | Infrared camera calibration techniques |
WO2012170954A2 (en) | 2011-06-10 | 2012-12-13 | Flir Systems, Inc. | Line based image processing and flexible memory system |
CN103931172A (en) | 2011-06-10 | 2014-07-16 | 菲力尔系统公司 | Systems and methods for intelligent monitoring of thoroughfares using thermal imaging |
KR101778353B1 (en) | 2011-06-10 | 2017-09-13 | 플리어 시스템즈, 인크. | Non-uniformity correction techniques for infrared imaging devices |
JP5772272B2 (en) | 2011-06-16 | 2015-09-02 | 富士通株式会社 | Information processing apparatus and information processing method |
US8275413B1 (en) | 2011-09-17 | 2012-09-25 | Fraden Corp. | Wireless communication device with integrated electromagnetic radiation sensors |
US20130204570A1 (en) | 2012-02-06 | 2013-08-08 | Tzila Mendelson | Cellular telephone and camera thermometers |
US20130320220A1 (en) | 2012-06-05 | 2013-12-05 | Michelle Donowsky | Portable Radiation Detector |
KR101499081B1 (en) | 2012-06-20 | 2015-03-05 | 엘시스템 주식회사 | Thermal imaging camera module and smart phone |
CN102880289B (en) | 2012-08-20 | 2016-03-30 | 广东步步高电子工业有限公司 | Control system and method for detecting eyeball gaze point to enable video playback and pause
CN202998279U (en) | 2012-11-15 | 2013-06-12 | 郭家接 | Intelligent human body heat releasing infrared ray supervising device |
US8825112B1 (en) | 2014-02-26 | 2014-09-02 | Jacob Fraden | Mobile communication device with electromagnetic radiation sensors |
2012
- 2012-06-08 WO PCT/US2012/041744 patent/WO2012170946A2/en unknown
- 2012-06-08 EP EP12737632.5A patent/EP2719167B1/en active Active
- 2012-06-08 CN CN201811582716.1A patent/CN109618084B/en active Active
- 2012-06-08 CN CN201280038760.0A patent/CN103748867B/en active Active
- 2012-06-08 KR KR1020147000703A patent/KR101808375B1/en active IP Right Grant
2013
- 2013-12-09 US US14/101,245 patent/US9706139B2/en active Active
- 2013-12-18 US US14/133,095 patent/US9716844B2/en active Active
2017
- 2017-07-10 US US15/645,949 patent/US10122944B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6028309A (en) | 1997-02-11 | 2000-02-22 | Indigo Systems Corporation | Methods and circuitry for correcting temperature-induced errors in microbolometer focal plane array |
US6812465B2 (en) | 2002-02-27 | 2004-11-02 | Indigo Systems Corporation | Microbolometer focal plane array methods and circuitry |
US7034301B2 (en) | 2002-02-27 | 2006-04-25 | Indigo Systems Corporation | Microbolometer focal plane array systems and methods |
US7470904B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera packaging |
US7470902B1 (en) | 2006-03-20 | 2008-12-30 | Flir Systems, Inc. | Infrared camera electronic architectures |
US7679048B1 (en) | 2008-04-18 | 2010-03-16 | Flir Systems, Inc. | Systems and methods for selecting microbolometers within microbolometer focal plane arrays |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9843742B2 (en) | 2009-03-02 | 2017-12-12 | Flir Systems, Inc. | Thermal image frame capture using de-aligned sensor array |
US9235876B2 (en) | 2009-03-02 | 2016-01-12 | Flir Systems, Inc. | Row and column noise reduction in thermal images |
US10033944B2 (en) | 2009-03-02 | 2018-07-24 | Flir Systems, Inc. | Time spaced infrared image enhancement |
US9451183B2 (en) | 2009-03-02 | 2016-09-20 | Flir Systems, Inc. | Time spaced infrared image enhancement |
US9208542B2 (en) | 2009-03-02 | 2015-12-08 | Flir Systems, Inc. | Pixel-wise noise reduction in thermal images |
US11445131B2 (en) | 2009-06-03 | 2022-09-13 | Teledyne Flir, Llc | Imager with array of multiple infrared imaging modules |
US9706139B2 (en) | 2011-06-10 | 2017-07-11 | Flir Systems, Inc. | Low power and small form factor infrared imaging |
US10122944B2 (en) | 2011-06-10 | 2018-11-06 | Flir Systems, Inc. | Low power and small form factor infrared imaging |
US9716844B2 (en) | 2011-06-10 | 2017-07-25 | Flir Systems, Inc. | Low power and small form factor infrared imaging |
US9723227B2 (en) | 2011-06-10 | 2017-08-01 | Flir Systems, Inc. | Non-uniformity correction techniques for infrared imaging devices |
US9723228B2 (en) | 2011-06-10 | 2017-08-01 | Flir Systems, Inc. | Infrared camera system architectures |
US10230910B2 (en) | 2011-06-10 | 2019-03-12 | Flir Systems, Inc. | Infrared camera system architectures |
WO2014100786A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Selective image correction for infrared imaging devices |
WO2014100783A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Time spaced infrared image enhancement |
WO2014100784A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Infrared imaging enhancement with fusion |
WO2014100787A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Compact multi-spectrum imaging with fusion |
US20140196131A1 (en) * | 2013-01-07 | 2014-07-10 | Salutron, Inc. | User authentication based on a wrist vein pattern |
EP2962451A1 (en) * | 2013-02-28 | 2016-01-06 | Raytheon Company | Method and apparatus for gain and level correction of multi-tap ccd cameras |
US9736399B2 (en) | 2013-03-14 | 2017-08-15 | Drs Network & Imaging Systems, Llc | System architecture for thermal imaging and thermography cameras |
US10362244B2 (en) | 2013-03-14 | 2019-07-23 | Drs Network & Imaging Systems, Llc | Parallax reduction for multi-sensor camera systems |
US10602082B2 (en) | 2014-09-17 | 2020-03-24 | Fluke Corporation | Triggered operation and/or recording of test and measurement or imaging tools |
US10271020B2 (en) | 2014-10-24 | 2019-04-23 | Fluke Corporation | Imaging system employing fixed, modular mobile, and portable infrared cameras with ability to receive, communicate, and display data and images with proximity detection |
US10530977B2 (en) | 2015-09-16 | 2020-01-07 | Fluke Corporation | Systems and methods for placing an imaging tool in a test and measurement tool |
US10586319B2 (en) | 2015-10-23 | 2020-03-10 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
US10083501B2 (en) | 2015-10-23 | 2018-09-25 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
US11210776B2 (en) | 2015-10-23 | 2021-12-28 | Fluke Corporation | Imaging tool for vibration and/or misalignment analysis |
US11113791B2 (en) | 2017-01-03 | 2021-09-07 | Flir Systems, Inc. | Image noise reduction using spectral transforms |
US11227365B2 (en) | 2017-01-03 | 2022-01-18 | Flir Systems, Inc. | Image noise reduction using spectral transforms |
Also Published As
Publication number | Publication date |
---|---|
CN109618084A (en) | 2019-04-12 |
EP2719167B1 (en) | 2018-08-08 |
EP2719167A2 (en) | 2014-04-16 |
KR20140035491A (en) | 2014-03-21 |
CN103748867A (en) | 2014-04-23 |
US20140139685A1 (en) | 2014-05-22 |
US10122944B2 (en) | 2018-11-06 |
WO2012170946A3 (en) | 2013-05-23 |
CN109618084B (en) | 2021-03-05 |
US9716844B2 (en) | 2017-07-25 |
CN103748867B (en) | 2019-01-18 |
WO2012170946A9 (en) | 2013-04-04 |
US9706139B2 (en) | 2017-07-11 |
KR101808375B1 (en) | 2017-12-12 |
US20160366345A1 (en) | 2016-12-15 |
US20170318237A1 (en) | 2017-11-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10122944B2 (en) | | Low power and small form factor infrared imaging |
US9723227B2 (en) | | Non-uniformity correction techniques for infrared imaging devices |
US9900526B2 (en) | | Techniques to compensate for calibration drifts in infrared imaging devices |
US10051210B2 (en) | | Infrared detector array with selectable pixel binning systems and methods |
US9521289B2 (en) | | Line based image processing and flexible memory system |
US9948878B2 (en) | | Abnormal clock rate detection in imaging sensor arrays |
US10079982B2 (en) | | Determination of an absolute radiometric value using blocked infrared sensors |
EP2939413B1 (en) | | Techniques to compensate for calibration drifts in infrared imaging devices |
US9961277B2 (en) | | Infrared focal plane array heat spreaders |
US9207708B2 (en) | | Abnormal clock rate detection in imaging sensor arrays |
WO2013052196A1 (en) | | Determination of an absolute radiometric value using blocked infrared sensors |
EP2932529B1 (en) | | Segmented focal plane array architecture |
WO2014093721A2 (en) | | Abnormal clock rate detection in imaging sensor arrays |
WO2014105904A1 (en) | | Infrared focal plane array heat spreaders |
WO2014085699A1 (en) | | Infrared imager with integrated metal layers |
US9848134B2 (en) | | Infrared imager with integrated metal layers |
WO2014105897A1 (en) | | Infrared detector array with selectable pixel binning systems and methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12737632; Country of ref document: EP; Kind code of ref document: A2 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 20147000703; Country of ref document: KR; Kind code of ref document: A |