WO2022010456A1 - Event-based thermal camera system - Google Patents

Event-based thermal camera system

Info

Publication number
WO2022010456A1
Authority
WO
WIPO (PCT)
Prior art keywords
thermal
imaging sensor
events
motion
pixel
Prior art date
Application number
PCT/US2020/040934
Other languages
French (fr)
Inventor
Manu RASTOGI
Madhu Sudan ATHREYA
M Anthony Lewis
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2020/040934 priority Critical patent/WO2022010456A1/en
Publication of WO2022010456A1 publication Critical patent/WO2022010456A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B29 WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00 Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/30 Auxiliary operations or equipment
    • B29C64/386 Data acquisition or data processing for additive manufacturing
    • B29C64/393 Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F10/00 Additive manufacturing of workpieces or articles from metallic powder
    • B22F10/80 Data acquisition or data processing
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F10/00 Additive manufacturing of workpieces or articles from metallic powder
    • B22F10/80 Data acquisition or data processing
    • B22F10/85 Data acquisition or data processing for controlling or regulating additive manufacturing processes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F12/00 Apparatus or devices specially adapted for additive manufacturing; Auxiliary means for additive manufacturing; Combinations of additive manufacturing apparatus or devices with other processing apparatus or devices
    • B22F12/22 Driving means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F12/00 Apparatus or devices specially adapted for additive manufacturing; Auxiliary means for additive manufacturing; Combinations of additive manufacturing apparatus or devices with other processing apparatus or devices
    • B22F12/90 Means for process control, e.g. cameras or sensors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y50/00 Data acquisition or data processing for additive manufacturing
    • B33Y50/02 Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/30 Transforming light or analogous information into electric information
    • H04N5/33 Transforming infrared radiation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F2999/00 Aspects linked to processes or compositions used in powder metallurgy
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P10/00 Technologies related to metal processing
    • Y02P10/25 Process efficiency

Abstract

Systems and methods are described herein for moving a multi-pixel thermal imaging sensor relative to a three-dimensional (3D) printer to various poses during a continuous time period. Thermal radiation received by each pixel of the multi-pixel thermal imaging sensor may be converted to an electric current. An integrated event generation circuit may generate a time-encoded output signal for each pixel. A thermal image may be generated that includes salient thermal events based on a time-multiplexed spatial encoding of the received signals as the multi-pixel thermal imaging sensor is moved. Non-salient thermal events, such as noise events associated with moving the multi-pixel thermal imaging sensor, are distinguished from salient thermal events.

Description

Event-Based Thermal Camera System
BACKGROUND
[0001] Some three-dimensional (3D) printing systems monitor the temperature of the 3D print bed or chamber to attain target temperature gradients thereon or therein. Some 3D printing systems may additionally or alternatively monitor the temperature of a 3D printed part to allow for sufficient cooling between print layers. Some 3D print systems utilize a thermal camera to capture a sequence of still images or video for periodic or continuous thermal monitoring.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] The written disclosure herein describes illustrative examples that are non-limiting and non-exhaustive. Reference is made to certain of such illustrative examples that are depicted in the figures described below.
[0003] Figure 1 illustrates an example functional block diagram of an event-based thermal imaging system with panning feedback control.
[0004] Figure 2 illustrates an example circuit diagram of an integrating event generator subsystem to generate a time-encoded output signal for each pixel, according to various examples described herein.
[0005] Figure 3A illustrates an example time-encoded output signal for a constant input signal, according to various examples described herein.
[0006] Figure 3B illustrates an example time-encoded output signal for a linearly increasing input signal, according to various examples described herein.
[0007] Figure 4 illustrates an example of positive and negative channel time-encoded pulse outputs for a sine wave input signal, according to various examples described herein.
[0008] Figure 5A illustrates an example block diagram of a thermal imaging system.
[0009] Figure 5B illustrates another example block diagram of a thermal imaging system.
[0010] Figure 5C illustrates another example block diagram of a thermal imaging system.
[0011] Figure 6 illustrates a flow diagram of an example method for generating a thermal image based on a time-multiplexed spatial encoding of time-encoded output signals of a panned event-based thermal imaging sensor.
DETAILED DESCRIPTION
[0012] Thermal cameras may be used to measure and monitor temperatures of a three-dimensional (3D) printer during operation and/or of an object being 3D-printed. For example, a 3D object may be printed in layers that are successively deposited as each previously deposited layer cools sufficiently. In such an example, a thermal camera may be used to monitor the temperature of each deposited layer to ensure sufficient cooling between layers. Thermal cameras may additionally, or alternatively, be used to monitor other processes and/or components of a 3D printer.
[0013] Thermal cameras (e.g., infrared cameras) generally have a lower resolution than visible light cameras of similar cost, complexity, and size. In some examples, a 3D printer may utilize many thermal cameras and/or expensive and complex high-resolution thermal cameras to monitor multiple portions of the 3D printer. According to various examples described herein, an asynchronous event-based thermal camera may be used to monitor a 3D printer and/or an object being printed via the 3D printer.
[0014] The event-based thermal sensors described herein are relatively high speed compared to frame-based thermal sensors. The relatively high speed of the event-based thermal sensors described herein allows the camera system to be panned while continuously capturing signals that can be combined in post-processing using deep learning techniques to provide for a much higher effective spatial resolution than would otherwise be possible with a frame-based imaging system.
[0015] As described according to various examples herein, an event-based thermal camera may receive thermal radiation (e.g., infrared radiation) and generate an electric current corresponding to the intensity of the thermal radiation. An event generator array may encode thermal events as electrical pulses with changes in the spacings between electrical pulses corresponding to thermal variations. An image generation subsystem may generate a thermal image based on the encoded thermal events. In some examples, a controller, such as a motor controller, may move the thermal imaging sensor between various poses (e.g., to various angles and/or positions).
[0016] The terms thermal camera, thermal imaging, thermal sensor, and other terms preceded by “thermal” are used to describe systems, subsystems, and/or components that operate in one or more thermal radiation bands. The term thermal radiation is used herein to describe any number of bands of electromagnetic radiation between microwave radiation and visible light radiation. Thermal radiation may, for example, comprise the IR-A, IR-B, and/or IR-C bands as defined by the International Commission on Illumination (CIE). Thus, a thermal camera may be embodied in various examples as any one of a broad-spectrum infrared camera, a mid-infrared camera, or a far-infrared camera, as referred to by the International Organization for Standardization (ISO).
[0017] In some examples, as described herein, a motion cancelation subsystem may suppress noise events associated with the movement of the thermal imaging sensor. The image generation subsystem may generate a time-multiplexed spatially encoded thermal image based on the salient thermal events (e.g., thermal events not caused by the motion of the thermal imaging sensor).
[0018] The motion cancelation subsystem, in some examples, may comprise an inertial measurement unit to detect motion and facilitate suppression of noise events associated with the movement of the thermal imaging sensor. In some examples, a system may additionally or alternatively include a panning controller to facilitate self-calibration of the motion cancelation subsystem to detect the noise events associated with the movement of the thermal imaging sensor.
[0019] In various examples, the system may move a thermal imaging sensor (e.g., a multi-pixel thermal imaging sensor) relative to a 3D printer between various poses. The thermal imaging sensor may receive infrared and/or other thermal radiation via an array of detectors. Each pixel of a multi-pixel thermal imaging sensor comprises a single thermal detector (e.g., an infrared detector). In other examples, a pixel may be formed from multiple thermal detectors. The thermal radiation received by each pixel may be converted to an electric current. An integrating event generation circuit may generate a time-encoded output signal for each pixel comprising a series of electrical pulses spaced according to the intensity of the received thermal radiation.
[0020] Some thermal cameras may produce an output signal with a voltage magnitude corresponding to the intensity of the received thermal radiation (e.g., the number of photons received in a given time period). In contrast, many of the examples described herein generate a current signal with a fixed or irrelevant voltage. The spacing between successive current signals encodes the relative intensity of the received thermal radiation for a given pixel during a given time period.
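By way of illustration only, the relationship between received intensity and pulse spacing can be sketched numerically. The component values below are hypothetical (the publication does not specify numbers); the charge-per-event relation Q = Vth x Cm anticipates the circuit described in conjunction with Figure 2 and Equation 2A.

```python
# Hypothetical component values; not taken from the publication.
V_TH = 1.0         # threshold voltage Vth (V)
C_M = 10e-12       # integration capacitor Cm (F)
Q = V_TH * C_M     # charge accumulated per event (C), Q = Vth x Cm

# For a constant photocurrent, the inter-pulse interval is simply Q / I:
for current in (1e-9, 5e-9, 20e-9):            # example photocurrents (A)
    interval = Q / current
    print(f"I = {current:.0e} A -> one pulse every {interval * 1e6:.1f} us")
```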
[0021] As described herein, the event generator array may distinguish between salient thermal events and noise events associated with moving the multi-pixel thermal imaging sensor between different poses. In some examples, the system may further identify motion of the object being printed, a printhead of the 3D printer moving, motion of a 3D printer print bed, and/or another operational motion. The event generator may be programmed to detect or ignore any of these operational motion events as salient or non-salient, depending on the specific application and programming. Non-salient motion events, including operational motion in some instances, may be suppressed, filtered, or otherwise removed or omitted from subsequent processing for generating the output thermal image.
[0022] A relatively high-resolution thermal image may be generated that includes salient thermal events based on a time-multiplexed spatial encoding of the time-encoded output signals as the multi-pixel thermal imaging sensor is moved. For example, an event-based thermal camera with a 640 x 480 array of pixels may be moved to various poses to capture salient thermal events. The received thermal radiation may be time-multiplexed and spatially encoded to generate a thermal image of the 3D printer and/or an object being printed that has a much higher resolution (e.g., 1-100 megapixels). As described herein, the system may move the multi-pixel thermal imaging sensor relative to the 3D printer (including relative to an object being printed) by pivoting the multi-pixel thermal imaging sensor, refocusing the multi-pixel thermal imaging sensor, or by spatial translation of the multi-pixel thermal imaging sensor (e.g., translating the multi-pixel thermal imaging sensor up, down, left, right, forward, backward, or a combination thereof). Various poses can be obtained by panning or otherwise moving the multi-pixel thermal imaging sensor along any of three axes relative to the 3D printer.
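One way to picture this time-multiplexed spatial encoding is as a shift-and-add accumulation of per-pose measurements onto a finer grid. The sketch below is illustrative only and assumes the sub-pixel offset of each pose is known (for example, from the panning controller); all names and values are hypothetical.

```python
import numpy as np

SENSOR_H, SENSOR_W = 480, 640      # native event-sensor resolution
SCALE = 4                          # 4x finer target grid in each axis

hi_res = np.zeros((SENSOR_H * SCALE, SENSOR_W * SCALE))
weight = np.zeros_like(hi_res)

def accumulate_pose(pixel_estimates, dx, dy):
    """Add one pose's per-pixel intensity estimates at its known sub-pixel
    offset (dx, dy), expressed in high-resolution pixel units (0..SCALE-1)."""
    ys = np.arange(SENSOR_H) * SCALE + dy
    xs = np.arange(SENSOR_W) * SCALE + dx
    hi_res[np.ix_(ys, xs)] += pixel_estimates
    weight[np.ix_(ys, xs)] += 1.0

# After all poses are accumulated, normalize the filled cells; remaining holes
# could be interpolated or filled by a learned reconstruction network.
# thermal_image = hi_res / np.maximum(weight, 1e-9)
```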
[0023] In some examples, the event generator array is in communication with a controller of a 3D printer and operates in conjunction therewith. In other examples, the event generator array and/or the thermal imaging system as a whole may operate independently of the controller of the 3D printer. As described herein, the event generator may distinguish salient thermal events from noise events via a sensor-integrated motion suppression system, via motion estimation during post-processing, via post-processing filtering based on motion information from an inertial measurement unit (IMU), and/or via combinations thereof.
[0024] Various modules, systems, and subsystems are described herein as implementing functions and/or as performing actions. In many instances, modules, systems, and subsystems may be divided into sub-modules, sub-systems, or sub-portions of subsystems. Modules, systems, and subsystems may be implemented in hardware, software, and/or combinations thereof.
[0025] Figure 1 illustrates an example functional block diagram of an event-based thermal imaging system 100 with panning feedback control. As illustrated, an infrared (IR) to current converter 120, such as an event-based thermal camera, receives an infrared signal 110. The IR-to-current converter 120 feeds a continuous-time signal into an event generator subsystem 130. The event generator subsystem 130 may include an IMU 132, an event generator array 134, and/or a motor controller 136.
[0026] As generally described herein, alternative thermal-to-current converters may be utilized to receive thermal signals that are outside the infrared band and/or include other thermal bands in addition to the infrared band. Accordingly, the event-based thermal imaging system 100 may be specifically configured to operate in the infrared band, another thermal band, and/or a wider thermal band that includes the infrared band.
[0027] The motor controller 136 may move the event generator array 134 to various poses with respect to a subject to be imaged (e.g., a 3D printer or 3D-printed object). The event generator array 134 may be moved to various poses by panning in one, two, or three axes, tilting, and/or refocusing. In various examples, as described herein, a global motion suppressor may facilitate removal or suppression of thermal events due to movement of the event generator array 134.
[0028] Thermal events due to the motion of the event generator array 134 are considered noise or non-salient thermal events. Thermal events due to the motion (e.g., vibrations) of the print bed, the object being printed, and/or other components of the 3D printer may also be considered noise or non-salient thermal events. A self-motion cancellation subsystem 140 may cancel or otherwise suppress such noise. Once the noise is suppressed or otherwise canceled, an image generation subsystem may use the salient thermal events to reconstruct or generate an infrared image 160 that corresponds to the original infrared signal. As described herein, salient thermal events may be spatially and/or temporally combined as a single, high-resolution thermal image.
[0029] In one example, a global motion suppressor may be integrated or connected as part of the sensor. In such an example, global motion detection circuitry may detect global motion of an event-driven camera based on a plurality of pixels of the event-driven image sensor detecting events. The global motion circuitry may suppress the global motion by detecting when a pixel fire signal of a given pixel reaches an event threshold and resetting the pixel integration output of the given pixel, using a pixel fire delay component to interrupt the global motion suppression.
[0030] In another example, the system may estimate motion during post-processing using optical flow and a corresponding filter. The system may evaluate discrete events and interpolate motion using the optical flow.
[0031] In another example, the system may utilize an IMU to relay movement of the event generator. The relayed movements are used for global motion estimation and filtering during post-processing. Once the captured thermal events are filtered and transformed to account for the self-motion, the continuous-time signal can be reconstructed.
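As an illustration of IMU-based post-processing, the sketch below warps event coordinates to account for a known pan/tilt rate under a small-rotation approximation, so that later filtering and reconstruction can operate in a motion-compensated frame. The focal length, data layout, and function name are assumptions for the example, not details from the publication.

```python
import numpy as np

F_PIXELS = 800.0    # assumed focal length expressed in pixels

def motion_compensate(events, omega_yaw, omega_pitch, t_ref):
    """events: array of rows (x, y, t); omega_yaw/omega_pitch: IMU angular
    rates (rad/s); t_ref: reference time (s). Under a small-rotation
    approximation a pan rate of omega rad/s shifts the projected scene by
    roughly F_PIXELS * omega pixels per second, so each event is warped back
    by that amount to remove the sensor's self-motion from its coordinates."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    dt = t - t_ref
    x_c = x - F_PIXELS * omega_yaw * dt      # undo horizontal pan
    y_c = y - F_PIXELS * omega_pitch * dt    # undo vertical tilt
    return np.stack([x_c, y_c, t], axis=1)
```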
[0032] As described herein, the movement of the event generator array is controlled by the motor controller 136. After salient thermal events are captured in a first pose, the pose controller 170 may provide panning and/or tilting information to the motor controller 136. The motor controller 136 may cause the event generator array 134 to move to another pose. Thermal events associated with the motion may again be canceled or otherwise suppressed by the self-motion cancelation subsystem 140, and the salient thermal events may be recorded for spatial and temporal combination to generate a thermal image.
[0033] In contrast to relatively slow frame-based imaging systems that introduce motion blur, event-based thermal camera systems independently capture thermal events with a temporal resolution of up to 1 microsecond. The relatively high speed of event-based thermal camera systems allows for high-speed movement and subsequent image acquisition for spatial and temporal combination. Many of the examples described herein are described in terms of a single event generator or a one-dimensional array of event generators. However, it is appreciated that the same principles can be applied to a two-dimensional array that independently encodes a thermal signal into an event stream.
[0034] Figure 2 illustrates an example circuit diagram 200 of an integrating event generator subsystem to generate a time-encoded output signal for each pixel, according to various examples described herein. In the illustrated example, the integrating event generator subsystem generates a distinct positive output channel 290, Vout+, and a distinct negative output channel 295, Vout-. An input signal, modeled as a current source 210, corresponds to an intensity of received thermal (e.g., infrared) radiation. The input current corresponding to the intensity of the received thermal radiation is integrated on the capacitor 220, Cm, resulting in a voltage, Vm, 230 as input to operational amplifiers 240 and 241. The operational amplifiers 240 and 241 compare the received voltage, Vm, 230 with maximum and minimum voltages, Vth+ and Vth-, as illustrated.
[0035] When a threshold is reached, the output of the corresponding comparator goes high, generating an output pulse and resetting the voltage Vm to the Vmid value via the reset transistors 299 controlled by the positive output channel 290, Vout+, or the negative output channel 295, Vout-. The capacitor 220, Cm, may be held at the reset voltage value, Vmid, for a predefined time, tr, after which the process repeats. As can be readily appreciated, higher intensity thermal radiation will more quickly integrate to reach the threshold voltage (positive or negative). Accordingly, higher intensity thermal radiation results in closely spaced output pulses, while lower intensity thermal radiation results in more widely spaced output pulses. The dynamic range representable by the spaced pulses is limited only by the reset time, tr, which sets a minimum spacing between pulses.
[0036] As previously noted, the current source 210 is used to model the output of an infrared or thermal sensor, such as a photocell, that converts infrared light (or other thermal radiation) to an electric current. Unlike an analog-to-digital converter (ADC) that encodes infrared intensity in voltage levels, an event converter (alternatively referred to herein as an event generator) encodes the thermal radiation intensity information in time. An ADC encoding thermal radiation intensity in voltage levels has a limited dynamic range based on the maximum and minimum voltage levels available in the system. In contrast, the event generator is limited only by how precisely the timing of the current pulses can be represented. In practice, an event generator is able to capture a significantly higher dynamic range than an ADC of similar size, complexity, and cost.
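The behavior described for Figure 2 can be illustrated with a simple discrete-time model. The component values, time step, and function name below are assumptions made for the sketch, not values taken from the publication.

```python
import numpy as np

V_TH_POS, V_TH_NEG, V_MID = 1.0, -1.0, 0.0   # thresholds and reset level (V), assumed
C_M = 10e-12                                  # integration capacitor Cm (F), assumed
DT = 1e-6                                     # simulation time step (s)
T_RESET = 5e-6                                # hold time tr after a reset (s), assumed

def event_generator(i_in):
    """i_in: sequence of input current samples (A), one per DT.
    Returns the pulse times (s) on the positive and negative output channels."""
    v_m, hold_until = V_MID, 0.0
    pos_events, neg_events = [], []
    for k, i in enumerate(i_in):
        t = k * DT
        if t < hold_until:                    # capacitor held at Vmid for tr
            continue
        v_m += i * DT / C_M                   # integrate the input current on Cm
        if v_m >= V_TH_POS:                   # comparator fires on Vout+
            pos_events.append(t)
            v_m, hold_until = V_MID, t + T_RESET
        elif v_m <= V_TH_NEG:                 # comparator fires on Vout-
            neg_events.append(t)
            v_m, hold_until = V_MID, t + T_RESET
    return pos_events, neg_events
```

Higher input currents reach the threshold in fewer steps, so the pulse spacing shrinks, as described above; the reset hold time tr sets the minimum achievable spacing.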
[0037] Figure 3A illustrates an example time-encoded output signal, Vout+, for a constant input signal, Iin. The input signal is integrated to produce a voltage, Vm, until the voltage reaches a maximum threshold value, Vth+, which generates an output pulse of current, Vout+. As illustrated, the constant input signal, Iin, is integrated at the same rate during each cycle, resulting in uniformly spaced output pulses of current, Vout+.
[0038] Figure 3B illustrates an example time-encoded output signal, Vout+, for a linearly increasing input signal, Iin. Since the intensity of the received thermal radiation is continually increasing with respect to time, the capacitor integrates the received current and reaches the maximum threshold value, Vth+, more quickly each cycle. Accordingly, the spacing between the output pulses, Vout+, decreases with time.
[0039] Figure 4 illustrates an example of positive and negative channel time-encoded pulse outputs for a sine wave input signal, according to various examples described herein. As illustrated, the number of pulses, or detected events, is directly proportional to the amplitude of the sinusoidal input signal. The negative and positive channels of the output signal may be used to independently encode the amplitude (intensity) of the input signal, whether positive or negative.
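Feeding the behavioral model sketched after paragraph [0036] with a sinusoidal input reproduces the qualitative behavior of Figure 4: the pulse density on each channel tracks the magnitude of the corresponding half-cycle. The amplitude and frequency below are arbitrary illustrative values.

```python
import numpy as np

t = np.arange(0.0, 0.02, DT)                   # 20 ms of simulated input
i_in = 2e-6 * np.sin(2 * np.pi * 100.0 * t)    # 100 Hz sine, 2 uA amplitude
pos, neg = event_generator(i_in)               # model from the earlier sketch
print(len(pos), "positive-channel pulses,", len(neg), "negative-channel pulses")
```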
[0040] An image generation subsystem can reconstruct the original signal using the asynchronous pulses of the output signals of each of a plurality of pixels of a thermal imaging sensor. According to various examples, a bandlimited signal can be expressed as a low-pass filtered version of an appropriately weighted sum of delayed impulse functions. For example, a signal x(t) bandlimited to [-\Omega_s, \Omega_s] can be represented as either of Equations 1A or 1B below:

[0041] x(t) = \left( \sum_j w_j \, \delta(t - s_j) \right) * h(t) \quad \text{(Equation 1A)}

[0042] x(t) = \sum_j w_j \, h(t - s_j) \quad \text{(Equation 1B)}

[0043] In Equations 1A and 1B, w_j is a scalar weight, h(t) is the impulse response of the low-pass filter, equal to \sin(\Omega_s t)/(\Omega_s t), the symbol * denotes convolution, and s_j represents the timings of the impulse train (i.e., the sequence of impulses or pulses). It is assumed for Equations 1A and 1B that the maximum spacing between adjacent samples is less than the Nyquist period T = \pi/\Omega_s.

[0044] The events generated by the event generator, or event converter, are governed by principles mathematically represented as:

[0045] \int_{t_a}^{t_b} x(u) \, du = Q \quad \text{(Equation 2A)}

[0046] In Equation 2A, Q is defined as V_{th} \times C_m, and t_a and t_b are the times of two consecutive events. Accordingly, Equation 2A can be expressed as the equivalent Equations 2B, 2C, and 2D below:

[0047] \int_{t_i}^{t_{i+1}} \sum_j w_j \, h(u - s_j) \, du = Q \quad \text{(Equation 2B)}

[0048] \sum_j w_j \int_{t_i}^{t_{i+1}} h(u - s_j) \, du = Q \quad \text{(Equation 2C)}

[0049] \sum_j c_{ij} \, w_j = Q \quad \text{(Equation 2D)}

[0050] In Equation 2D, the constants c_{ij} are numerically computed as:

[0051] c_{ij} = \int_{t_i}^{t_{i+1}} h(u - s_j) \, du \quad \text{(Equation 3)}

[0052] The resulting set of linear equations is given by CW = Q in matrix form, where W is a column vector with w_j as the jth row element, C is a square matrix with c_{ij} as the ith row and jth column element, and Q is a column vector with Q as the ith row element. Thus, the weight vector, W, can be estimated as:

[0053] W = C^{-1} Q \quad \text{(Equation 4)}

[0054] Accordingly, x(t) can be estimated using the equations above as follows:

[0055] x(t) = \sum_j w_j \, h(t - s_j) \quad \text{(Equation 5A)}

[0056] w_j = \sum_i \left[ C^{-1} \right]_{ji} Q \quad \text{(Equation 5B)}

[0058] In Equations 5A and 5B, \left[ C^{-1} \right]_{ji} is the jth row and the ith column element of the inverse matrix C^{-1}, and:

[0059] s_j = (t_j + t_{j+1})/2 \quad \text{(Equation 5D)}
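A minimal numerical sketch of the reconstruction in Equations 2A through 5D is given below for a single pixel. The event times, bandwidth, and the value used for Q are assumed for illustration only; in practice the matrix C may be ill-conditioned, and a pseudo-inverse or regularized solve may be preferred.

```python
import numpy as np

OMEGA_S = 2 * np.pi * 100.0          # assumed signal bandwidth (rad/s)
Q = 1e-3                             # assumed Vth x Cm, in consistent units

def h(t):
    """Low-pass impulse response sin(Omega_s * t) / (Omega_s * t), h(0) = 1."""
    return np.sinc(OMEGA_S * t / np.pi)       # np.sinc(x) = sin(pi x)/(pi x)

def reconstruct(event_times, t_eval):
    t = np.asarray(event_times)
    s = 0.5 * (t[:-1] + t[1:])                # impulse timings s_j (Equation 5D)
    n = len(s)
    C = np.empty((n, n))
    for i in range(n):                        # c_ij = integral of h(u - s_j)
        u = np.linspace(t[i], t[i + 1], 200)  # over [t_i, t_{i+1}]  (Equation 3)
        for j in range(n):
            C[i, j] = np.trapz(h(u - s[j]), u)
    w = np.linalg.solve(C, np.full(n, Q))     # W = C^-1 Q          (Equation 4)
    return sum(wj * h(t_eval - sj) for wj, sj in zip(w, s))        # Equation 5A

# Example with assumed event times (s) recorded from one pixel:
events = np.array([0.0, 3.1e-3, 5.9e-3, 8.4e-3, 10.6e-3, 13.5e-3, 17.2e-3])
x_hat = reconstruct(events, np.linspace(0.0, 0.017, 500))
```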
[0060] Any number of pulses may be spatially and temporally combined in a reconstruction process to generate a thermal image using the equations above. Thus, the system may generate a thermal image based on a mathematical reconstruction of a sequence of pulses from each pixel of a thermal imaging sensor moved to various positions relative to the subject for imaging (e.g., a 3D printer or 3D-printed part). [0061] In other examples, the system may not use a mathematical reconstruction process, such as the example mathematical reconstruction process described above. Instead, the system may use a trained deep learning network to reconstruct an input signal x(t) from a sequence of pulses recorded by each pixel of a thermal imaging sensor moved to any number of poses.
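The publication does not describe a particular network architecture for the learned reconstruction mentioned in paragraph [0061]. Purely as a hypothetical sketch, a small model might map a per-pixel histogram of event counts over time bins to samples of the reconstructed signal:

```python
import torch
import torch.nn as nn

N_BINS, N_SAMPLES = 64, 256   # assumed input/output sizes, for illustration only

# Hypothetical architecture; nothing here is specified by the publication.
model = nn.Sequential(
    nn.Linear(N_BINS, 128),
    nn.ReLU(),
    nn.Linear(128, 128),
    nn.ReLU(),
    nn.Linear(128, N_SAMPLES),
)

# Training would minimize reconstruction error against reference measurements,
# e.g.: loss = nn.functional.mse_loss(model(event_histograms), reference_signals)
```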
[0062] Figure 5A illustrates an example block diagram of a thermal imaging system 500 that includes an event-based thermal camera subsystem 510, an event generator array 520, and an image generation subsystem 530. As described above, the event-based thermal camera subsystem 510 may receive infrared radiation (or other thermal radiation) and generate an electric current corresponding to the intensity of the infrared or other thermal radiation.
[0063] The event generator array 520 may encode thermal events as electrical pulses with changes in the spacings between electrical pulses corresponding to thermal variations. As described herein, the event generator array 520 may suppress noise events arising from the movement of the event-based thermal camera subsystem and/or associated event generator array. The image generation subsystem 530 may use the sequence of electrical pulses generated by the event generator array to generate a thermal image based on the encoded thermal events.
[0064] Figure 5B illustrates another example block diagram of a thermal imaging system 501 that further includes a motor controller 540 and a motion cancelation subsystem 550. The motor controller 540 may move the thermal imaging sensor between various poses. In each pose, pixels of the event-based thermal camera subsystem 510 may receive infrared radiation and generate electric currents corresponding to the intensity of the received infrared radiation. The event generator array 520 may convert the received infrared radiation of each pixel to electrical pulses with spacings that correspond to the relative intensity of the thermal radiation using, for example, the integrating circuit described in conjunction with Figure 2.
[0065] The motion cancelation subsystem 550 may suppress noise events associated with the movement of the thermal imaging sensor. In such examples, the image generation subsystem 530 generates a time-multiplexed spatially encoded thermal image based on the salient thermal events as the thermal imaging sensor is moved by the motor controller 540.
[0066] Figure 5C illustrates another example block diagram of a thermal imaging system 502 that further includes a panning controller 560 that facilitates self-calibration of the motion cancelation subsystem 550 to detect the noise events associated with the movement of the thermal imaging sensor 510.
[0067] Figure 6 illustrates a flow diagram of an example method 600 for generating a thermal image based on a time-multiplexed spatial encoding of time-encoded output signals of a panned event-based thermal imaging sensor. A motor controller may move, at 610, a multi-pixel thermal imaging sensor relative to a 3D printer. The multi-pixel thermal imaging sensor may be moved to various poses during a continuous time period. The thermal imaging sensor may convert, at 620, infrared radiation received by each pixel of the multi-pixel thermal imaging sensor to an electric current during the continuous time period.
[0068] An integrating event generation circuit may generate, at 630, a time-encoded output signal as electrical pulses with changes in the spacings between electrical pulses corresponding to thermal variations. The system may identify and/or distinguish between, at 640, thermal events as either noise events associated with the motion of the multi-pixel thermal imaging sensor or salient thermal events. Noise events may be suppressed, at 650, or otherwise ignored. Salient thermal events may be used for image generation, at 655. The system may generate, at 660, a thermal image that includes the salient thermal events based on a time-multiplexed spatial encoding of the time-encoded output signals as the multi-pixel thermal imaging sensor is moved.
[0069] In various examples, moving the multi-pixel thermal imaging sensor may comprise pivoting, panning, three-axis movement, and/or refocusing of the multi-pixel thermal imaging sensor relative to a 3D printer or 3D-printed object. Motion events associated with the movement of the thermal imaging sensor may be suppressed via a sensor-integrated motion suppression system, via motion estimation during post-processing, and/or via post-processing filtering based on motion information from an IMU.
[0070] Many of the examples described herein are provided in the context of 3D printers and 3D-printed objects. However, it is appreciated that any combination of the various systems and methods described herein may be used to generate thermal images of other systems, assemblies, or objects other than or in addition to 3D printers and 3D-printed objects.
[0071] Specific examples of the disclosure are described above and illustrated in the figures. It is, however, appreciated that many adaptations and modifications could be made to the specific configurations and components detailed above. In some cases, well-known features, structures, and/or operations are not shown or described in detail. Furthermore, the described features, structures, or operations may be combined in any suitable manner. It is also appreciated that the components of the examples, as generally described, and as described in conjunction with the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, all feasible permutations and combinations of examples are contemplated. Furthermore, it is appreciated that changes may be made to the details of the above-described examples without departing from the underlying principles thereof.
[0072] In the description above, various features are sometimes grouped together in a single example, figure, or description thereof for the purpose of streamlining the disclosure. This method of disclosure, however, is not to be interpreted as reflecting an intention that any claim now presented or presented in the future requires more features than those expressly recited in that claim. Rather, it is appreciated that inventive aspects lie in a combination of fewer than all features of any single foregoing disclosed example. The claims are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate example. This disclosure includes all permutations and combinations of the independent claims with their dependent claims.

Claims

What is claimed is:
1. A system, comprising:
an event-based thermal camera subsystem to receive thermal radiation and generate an electric current corresponding to the intensity of the thermal radiation;
an event generator array to encode thermal events as electrical pulses with changes in the spacings between electrical pulses corresponding to thermal variations; and
an image generation subsystem to generate a thermal image based on the encoded thermal events.
2. The system of claim 1, further comprising: a motor controller to move the thermal imaging sensor between various poses.
3. The system of claim 2, further comprising: a motion cancelation subsystem to suppress noise events associated with the movement of the thermal imaging sensor, and wherein the image generation subsystem generates a time-multiplexed spatially encoded thermal image based on the salient thermal events as the thermal imaging sensor is moved by the motor controller.
4. The system of claim 3, wherein the motion cancelation subsystem comprises an inertial measurement unit to detect motion and facilitate suppression of noise events associated with the movement of the thermal imaging sensor.
5. The system of claim 2, further comprising a panning controller to facilitate self-calibration of the motion cancelation subsystem to detect the noise events associated with the movement of the thermal imaging sensor.
6. A method, comprising:
moving a multi-pixel thermal imaging sensor relative to a three-dimensional (3D) printer to various poses during a continuous time period;
converting, via the thermal imaging sensor, thermal radiation received by each pixel of the multi-pixel thermal imaging sensor to an electric current during the continuous time period;
generating, via an integrating event generation circuit, a time-encoded output signal for each pixel comprising a series of electrical pulses spaced according to the intensity of the received thermal radiation;
distinguishing between salient thermal events and noise events associated with moving the multi-pixel thermal imaging sensor; and
generating a thermal image that includes the salient thermal events based on a time-multiplexed spatial encoding of the time-encoded output signals as the multi-pixel thermal imaging sensor is moved.
7. The method of claim 6, wherein moving the multi-pixel thermal imaging sensor comprises pivoting the multi-pixel thermal imaging sensor relative to the 3D printer.
8. The method of claim 6, wherein moving the thermal imaging sensor comprises panning the thermal imaging sensor relative to the 3D printer.
9. The method of claim 6, wherein distinguishing between salient thermal events and noise events comprises suppressing motion events associated with vibrations of a portion of the 3D printer during operation.
10. The method of claim 6, wherein distinguishing between salient thermal events and noise events comprises suppressing motion events associated with the movement of the thermal imaging sensor.
11. The method of claim 10, wherein the motion events associated with the movement of the thermal imaging sensor are suppressed via a sensor-integrated motion suppression system.
12. The method of claim 10, wherein the motion events associated with the movement of the thermal imaging sensor are suppressed via one of: motion estimation during post-processing, and post-processing filtering based on motion information from an inertial measurement unit (IMU).
13. The method of claim 6, wherein the thermal radiation comprises at least one of IR-A, IR-B, and IR-C electromagnetic radiation.
14. A thermal imaging system, comprising:
an event-based thermal camera system to detect thermal events based on thermal radiation received via a thermal sensor;
a controller to scan the event-based thermal camera system relative to a print bed of a three-dimensional (3D) printer; and
an image reconstruction subsystem to: distinguish between thermal motion noise due to movement of the event-based thermal camera system and salient thermal events, and generate a composite image of the print bed based on the salient thermal events.
15. The system of claim 14, wherein the event-based thermal camera system includes an event generator array to encode thermal events as electrical pulses with changes in the spacings between electrical pulses corresponding to thermal variations.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2020/040934 WO2022010456A1 (en) 2020-07-06 2020-07-06 Event-based thermal camera system

Publications (1)

Publication Number Publication Date
WO2022010456A1 true WO2022010456A1 (en) 2022-01-13

Family

ID=79552680

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/040934 WO2022010456A1 (en) 2020-07-06 2020-07-06 Event-based thermal camera system

Country Status (1)

Country Link
WO (1) WO2022010456A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018194737A1 (en) * 2017-04-18 2018-10-25 Facebook Technologies, Llc Event camera
US20190180113A1 (en) * 2017-02-03 2019-06-13 Raytheon Company Pixel-based event detection for tracking, hostile fire indication, glint suppression, and other applications
WO2019143346A1 (en) * 2018-01-19 2019-07-25 Hewlett-Packard Development Company, L.P. Three-dimension printing system and method
WO2020102110A1 (en) * 2018-11-13 2020-05-22 Magic Leap, Inc. Event-based ir camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20944326; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20944326; Country of ref document: EP; Kind code of ref document: A1)