WO2015054601A2 - Multi-sensor compressive imaging - Google Patents

Multi-sensor compressive imaging

Info

Publication number
WO2015054601A2
WO2015054601A2 · PCT/US2014/060080
Authority
WO
WIPO (PCT)
Prior art keywords
antenna
region
interest
adjustable
frequencies
Prior art date
Application number
PCT/US2014/060080
Other languages
English (en)
Other versions
WO2015054601A3 (fr)
Inventor
David Brady
Tom Driscoll
John Hunt
Daniel Marks
Alexander Mrozack
Matthew Reynolds
David R. Smith
Original Assignee
Duke University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Duke University
Publication of WO2015054601A2 (fr)
Publication of WO2015054601A3 (fr)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S13/865: Combination of radar systems with lidar systems
    • G01S13/867: Combination of radar systems with cameras
    • G01S13/887: Radar or analogous systems specially adapted for detection of concealed objects, e.g. contraband or weapons
    • G01S13/89: Radar or analogous systems specially adapted for mapping or imaging
    • G01S17/46: Indirect determination of position data (systems using reflection of electromagnetic waves other than radio waves)
    • H01Q: ANTENNAS, i.e. RADIO AERIALS
    • H01Q15/0086: Devices acting selectively as reflecting, diffracting or refracting devices, having materials with a synthesized negative refractive index, e.g. metamaterials or left-handed materials
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object

Definitions

  • FIG. 1 depicts imaging with a multi-sensor compressive imaging system.
  • FIG. 2A depicts a Schottky junction-tuned metamaterial element.
  • FIG. 2B depicts the tunable frequency response of a Schottky junction-tuned metamaterial element.
  • FIG. 3A depicts a MOSFET-tuned metamaterial element.
  • FIG. 3B depicts a tunable frequency response of a MOSFET-tuned metamaterial element.
  • FIG. 4 depicts a three-dimensional image combining visual and RF data.
  • FIG. 5 depicts a system block diagram.
  • Compressive imaging systems, such as those described in PCT Application No. PCT/US13/40444, provide an imaging tool suitable for applications including holography, microwave imaging, microwave/mmW imaging, human tracking, security imaging, and threat detection.
  • Embodiments may utilize metamaterial aperture antennas for illuminating a scene and/or measuring a scene, with various examples of metamaterial aperture antennas described in: the above-mentioned PCT application (published as PCT Publication No. WO/2014/025425); A. Bily et al., "Surface Scattering Antennas," U.S. Patent Publication No. 2012/0194399; and A. Bily et al., "Surface Scattering Antenna Improvements," U.S.
  • a compressive imaging system includes both an RF imaging component (such as a metamaterial surface antenna) and an auxiliary sensing component (such as an EO/IR sensor) to provide a multi-sensor integrated imaging system.
  • An illustrative approach is depicted in FIG. 1. The figure shows an illumination antenna 100 and a measurement antenna 110 addressing a field of view 120. This configuration is not intended to be limiting: in other approaches, the same antenna is utilized both for illumination and measurement; in yet other approaches, multiple antennas are utilized for illumination and/or multiple antennas are utilized for measurement.
  • the field of view is also addressed by an auxiliary sensor unit 150 that is operable to identify a region of interest within the field of view.
  • the sensor unit may include a structured light sensor unit.
  • a structured light sensor, such as a PRIMESENSE sensor or a MICROSOFT KINECT unit (which embeds a PRIMESENSE sensor), can obtain depth information about a scene by projecting a pattern of light (such as infrared light) on the scene and then observing how the pattern falls on the elements of the scene. See, for example, Z. Zalevsky et al., "Method and system for object reconstruction," U.S. Patent Publication No. 2013/0155195, herein incorporated by reference.
  • the depth information can be used for object recognition, especially to identify the location and/or posture of a human target within the field of view of the sensor unit. See, for example, J. Shotton et al., "Real-time human pose recognition in parts from single depth images,"
  • the sensor unit may include: other EO/IR sensors such as a LIDAR unit, an optical camera, a video camera, or a stereo camera; acoustic sensors such as an ultrasonic sonar unit; tactile sensors such as touch-sensitive floor coverings; other RF sensors; or combinations thereof. While the figure depicts a single sensor unit 150, other embodiments may deploy a plurality of sensor units, e.g. to cover the field of view from multiple vantages (such as front, back, side, top, and/or bottom) or to provide an extended field of view (such as along a corridor).
  • the sensor unit 150 may identify a region of interest by determining spatial information about a subject within the field of view 120.
  • the spatial information can include the position, orientation, or posture of a human subject, and the region of interest may be defined as a volume that encloses or partially encloses the human subject. In the illustrative depiction of FIG. 1, this volume is depicted as a box 160 that encloses a subject 161 within the field of view.
  • the volume is a human-shaped volume enclosing and matching the posture of the human subject.
  • the volume is determined using a depth map characterizing a surface region of the subject, e.g. as provided by a structured light sensor unit.
  • the volume may be defined as a curved slab-like volume that hugs the contours of the human subject; in other words, for a depth map that defines a two-dimensional manifold corresponding to the surface region, the region of interest is a curved slab corresponding to a three-dimensional neighborhood of the two-dimensional manifold.
  • the thickness of this curved slab may be selected as appropriate for the imaging application, ranging, for example, from about 1 centimeter to about 10 centimeters.
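The curved-slab construction above can be sketched numerically: a voxel belongs to the region of interest when it lies within half the slab thickness of the depth-mapped surface. The function name, grid parameters, and sensor values below are our own illustrative assumptions, not details from the patent:

```python
import numpy as np

def slab_mask(depth_map, z_grid, thickness=0.05):
    """Boolean mask of a curved slab hugging a depth-mapped surface.

    depth_map : (H, W) array of surface depths in meters (e.g. from a
                structured light sensor).
    z_grid    : (D,) array of candidate depth planes in meters.
    thickness : slab thickness in meters (the patent suggests roughly
                1 cm to 10 cm depending on the application).
    Returns an (H, W, D) mask selecting voxels within thickness/2 of
    the surface, i.e. the curved-slab region of interest.
    """
    dist = np.abs(z_grid[None, None, :] - depth_map[:, :, None])
    return dist <= thickness / 2.0

# toy example: a flat surface 1 m away, depth planes every 1 cm
depth = np.full((4, 4), 1.0)
z = np.arange(0.8, 1.2, 0.01)
mask = slab_mask(depth, z, thickness=0.05)
```

With a 5 cm slab and 1 cm depth planes, each pixel keeps the five planes nearest the surface.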
  • the illumination antenna 100 is depicted as a metamaterial aperture antenna.
  • the metamaterial aperture antenna is a reconfigurable antenna that includes a two-dimensional waveguide 101 with a plurality of waveguide feeds 102 spatially distributed across the extent of the waveguide.
  • An RF switch 103 is configured to direct RF energy from a radio unit 130 to any of the various feed locations.
  • the waveguide is coupled to an array of complementary metamaterial elements (not shown) having a diversity of resonance frequencies, e.g. as described in PCT Publication No. WO/2014/025425.
  • the RF switch is sequentially adjusted to direct RF energy to each of the various feed locations, and for each position of the RF switch, the radio unit 130 sweeps through a range or set of operating frequencies to utilize the frequency dispersion of the array of metamaterial elements.
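The switch-and-sweep procedure above amounts to enumerating measurement modes: each (feed position, operating frequency) pair produces a distinct radiation pattern, i.e. one scene measurement (one row of the measurement matrix). The feed count and frequency band below are illustrative assumptions, not values from the patent:

```python
import itertools

# Illustrative enumeration of measurement modes for a switched-feed
# metamaterial aperture: for each RF-switch position (feed), the radio
# sweeps a set of operating frequencies; every (feed, frequency) pair
# yields one distinct radiation pattern addressing the scene.
feeds = range(4)                                  # RF switch positions
freqs_ghz = [18.0 + 0.5 * k for k in range(13)]   # 18-24 GHz sweep

modes = list(itertools.product(feeds, freqs_ghz))
n_measurements = len(modes)   # measurements per acquisition
```

Restricting the sweep to a subset of frequencies (or feeds) is then just slicing this mode list, which is how illumination can be concentrated on a smaller region of interest.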
  • the measurement antenna 110 is a horn antenna (or similar medium gain antenna addressing the field of view).
  • the configuration is swapped: the illumination antenna is a horn antenna (or similar medium gain antenna addressing the field of view) and the measurement antenna is a metamaterial aperture antenna.
  • both the illumination antenna and the measurement antenna are metamaterial aperture antennas.
  • a single metamaterial aperture antenna provides both illumination and measurement.
  • the illumination and/or measurement of the scene are performed in the same way regardless of the actual region of interest: the entire field of view is illuminated, and the entire field of view is measured.
  • the illumination and/or measurement of the scene are tailored in accordance with the identifying of a region of interest by the auxiliary sensor unit.
  • the first region of interest 160 is illuminated with a first set of illumination field patterns 171; when the sensor unit 150 identifies a second region of interest 162 enclosing the position of a subject 163, the second region of interest 162 is illuminated with a second set of illumination field patterns 172.
  • the subject 163 may be a second subject within the field of view 120, or the same subject 161 at a later time (e.g. if the subject is moving through the field of view).
  • the illumination antenna is frequency dispersive, reconfigurable, or both.
  • An example of a frequency-dispersive antenna is a metamaterial aperture antenna having a plurality of metamaterial elements arranged on a surface and having a diversity of resonance frequencies, e.g. as described in PCT Publication No. WO/2014/025425.
  • a frequency-dispersive antenna may be operated by sweeping through a set of operating frequencies to illuminate the field of view with a corresponding set of radiation patterns. Alternatively, if it is desirable to concentrate the illumination on a smaller region of interest within the field of view, in some approaches a subset of the operating frequencies may be selected.
  • An example of a reconfigurable antenna is a metamaterial aperture antenna having a plurality of radiation patterns corresponding to a set of antenna configurations.
  • the reconfigurable metamaterial aperture antenna has a plurality of adjustable metamaterial elements with respective adjustable physical parameters (such as resonance frequencies and/or Q-factors) that are functions of one or more control inputs.
  • the control inputs may be control voltages for the plurality of scattering elements.
  • FIG. 2A depicts a complementary metamaterial element having an inner conducting region that is electrically isolated from an enclosing conducting region.
  • a pair of Schottky diodes span the gap between the inner conducting region and the enclosing conducting region; a voltage difference applied between the two conductors changes the depletion depths of the diodes, adjusting the resonance frequency and Q-factor of the resonator as shown in FIG. 2B.
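The bias tuning described above can be illustrated with a textbook abrupt-junction model: increasing reverse bias widens the depletion region, shrinking the junction capacitance C_j(V) = C_j0 / sqrt(1 + V/V_bi) and so raising the LC resonance f0 = 1/(2*pi*sqrt(L*C)). All component values below are illustrative assumptions, not parameters from the patent:

```python
import math

def junction_capacitance(v_bias, c_j0=1e-12, v_bi=0.7):
    """Abrupt-junction depletion capacitance (F) versus reverse bias (V)."""
    return c_j0 / math.sqrt(1.0 + v_bias / v_bi)

def resonance_ghz(v_bias, l_henry=1e-9, c_fixed=0.2e-12):
    """Resonance (GHz) of an LC element loaded by the biased junction."""
    c_total = c_fixed + junction_capacitance(v_bias)
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_total)) / 1e9

# increasing reverse bias lowers C_j, shifting the resonance upward
f_low = resonance_ghz(0.0)    # zero bias
f_high = resonance_ghz(5.0)   # 5 V reverse bias
```

The same model explains why an array of such elements, each with a different control voltage, presents a voltage-programmable set of resonance frequencies.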
  • A second example of an adjustable metamaterial element is depicted in FIG. 3A, which again depicts a complementary metamaterial element having an inner conducting region that is electrically isolated from an enclosing conducting region, this time with a pair of MOSFETs spanning the gap between the inner conducting region and the enclosing conducting region.
  • a voltage applied to the gates of the MOSFETs adjusts the Q-factor of the resonator by altering the conductivity of each MOSFET (the figure depicts full-wave simulation results for the signal transmitted through a microstrip patterned with a single MOSFET-tuned CELC - each curve corresponds to a different source-drain resistance, corresponding to gate voltage).
  • the metamaterial elements are adjustable by the inclusion of tunable lumped elements such as packaged varactor diodes or HEMT transistors.
  • the illumination antenna is both frequency-dispersive and reconfigurable.
  • One example is the illumination antenna of FIG. 1, which is operable over a set of frequencies and also reconfigurable by adjusting the RF switch.
  • Another example is a frequency-dispersive antenna that is mechanically steered (e.g. by mounting the antenna on a pivot or gimbal, or by directing the antenna energy towards a secondary reflector or refractor that is mechanically steered).
  • the reconfigurable antennas of the preceding paragraph are also frequency-dispersive, and may be operated in a set of antenna modes that include both a plurality of frequencies and a plurality of configurations.
  • a set of illumination antennas is deployed, and one or more of the illumination antennas is selected for the illumination depending on the region of interest.
  • a set of illumination antennas may be deployed along a corridor, and the illumination antenna adjacent to the position of a subject in the corridor is selected.
  • a reconstructed image is obtained using a compressive imaging algorithm, e.g. as described in PCT Publication No. WO/2014/025425.
  • the reconstruction is performed in the same way regardless of the actual region of interest: the minimization problem argmin_f ‖g − Hf‖² + λR(f) is solved for a reconstructed image f, with the measurement matrix H defined over the entire field of view.
  • the reconstruction is informed and/or optimized by information received from the auxiliary sensor unit.
  • the measurement matrix H may be truncated to exclude points outside of the region of interest (equivalently, the reconstructed image f is stipulated as zero outside of the region of interest).
  • this truncation involves a dimensional reduction of the minimization problem; for example, if the auxiliary sensor unit provides a depth map characterizing a surface region of the subject (e.g. as provided by a structured light sensor unit), the measurement matrix H and reconstructed image f may be defined not in a three-dimensional volume but on a two-dimensional manifold (embedded in three-dimensional space) corresponding to the depth map.
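The truncation described above amounts to dropping the columns of H (and the entries of f) that correspond to scene points outside the region of interest, shrinking the inversion before it is performed. A minimal sketch, with illustrative sizes of our own choosing:

```python
import numpy as np

def truncate_to_roi(H, roi_mask):
    """Keep only measurement-matrix columns inside the region of
    interest; scene points outside it are stipulated to be zero."""
    return H[:, roi_mask]

def embed(f_roi, roi_mask):
    """Re-embed a reduced reconstruction into the full scene grid."""
    f = np.zeros(roi_mask.size, dtype=f_roi.dtype)
    f[roi_mask] = f_roi
    return f

# toy scene: 100 voxels, of which the ROI (e.g. a depth-map slab)
# covers only 30 -- the inversion now runs on a 30-column matrix
rng = np.random.default_rng(1)
H = rng.standard_normal((50, 100))
roi = np.zeros(100, dtype=bool)
roi[10:40] = True
H_roi = truncate_to_roi(H, roi)
```

The reduced problem is both smaller and better conditioned, since measurements are no longer asked to explain scene content the auxiliary sensor has already ruled out.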
  • the auxiliary sensor unit can inform the reconstruction by providing boundary conditions for Green's functions of the measurement matrix H; in other words, the Green's functions are recalculated using, for example, the two-dimensional manifold discussed above as a boundary on the space in which the Green's functions are defined.
  • the reconstructed image f may be combined with a visual image of the region of interest to provide a multi-spectral representation of the region of interest.
  • the auxiliary sensor unit may provide both a depth map (e.g. as provided by a structured light sensor unit) and an optical image (e.g. as provided by an optical camera), as with the MICROSOFT KINECT sensor unit.
  • the depth map and optical image can be combined with a false-color representation of the reconstructed image f to create a three-dimensional image data object. This three-dimensional object can then be displayed to the user on a monitor. An example is presented in FIG. 4.
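One way to realize the fused data object above: back-project each depth pixel to a 3-D point and attach a color formed by alpha-blending the optical image with a false-color rendering of the RF reconstruction. The pinhole intrinsics, blend weight, and single-channel false-color scheme here are our own illustrative choices:

```python
import numpy as np

def fuse_point_cloud(depth, optical_rgb, rf_intensity,
                     fx=500.0, fy=500.0, alpha=0.5):
    """Build an (N, 6) array of [x, y, z, r, g, b] points from a depth
    map, an optical image, and a reconstructed RF intensity map.
    RF intensity is rendered into the red channel and alpha-blended
    over the optical image."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - w / 2) * depth / fx          # pinhole back-projection
    y = (v - h / 2) * depth / fy
    false_color = np.zeros((h, w, 3))
    false_color[..., 0] = rf_intensity    # red channel carries RF
    rgb = (1 - alpha) * optical_rgb + alpha * false_color
    return np.column_stack([x.ravel(), y.ravel(), depth.ravel(),
                            rgb.reshape(-1, 3)])

depth = np.ones((2, 3))                   # toy 2x3 scene, 1 m away
optical = np.zeros((2, 3, 3))
rf = np.full((2, 3), 0.8)
cloud = fuse_point_cloud(depth, optical, rf)
```

The resulting colored point cloud is the kind of object a touch-screen viewer can rotate and zoom, as described below.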
  • an interactive user interface, such as a touch screen, allows the user to interact with the three-dimensional image data object and/or select further regions of interest for additional imaging. For example, the user can operate the touch screen interface (or other user interface) to zoom or rotate the three-dimensional image.
  • the user can operate the touch screen interface (or other user interface) to identify another region of interest (for example, a smaller region of interest addressing a particular feature or portion of the larger region of interest) for subsequent RF imaging.
  • the user can identify this area of concern or ambiguity (e.g. by drawing a loop on the touch screen) and prompt the imaging system to re-image with a new region of interest corresponding to the identified area.
  • an illustrative embodiment is depicted as a system block diagram for a multi-sensor compressive imaging system.
  • the system includes an illumination antenna 100 coupled to a transmitter 101 and a measurement antenna 110 coupled to a receiver 111.
  • the system further includes an auxiliary sensor unit 150, which may include an EO/IR sensor (such as a MICROSOFT KINECT unit).
  • the transmitter 101, receiver 111, and sensor unit 150 are coupled to processing circuitry 500 configured to reconstruct an image of the region of interest using a compressive imaging algorithm.
  • the illumination antenna is a metamaterial aperture antenna.
  • the processing circuitry 500 includes control circuitry providing one or more control inputs 102 to the illumination antenna.
  • the processing circuitry 500 includes control circuitry providing one or more control inputs 112 to the measurement antenna.
  • the system optionally includes a monitor 510 coupled to the processing circuitry 500 for display of reconstructed images (optionally combined with a depth map and visual image to provide a projection of a hybrid, false-color three-dimensional image as discussed above).
  • the system optionally includes a user interface 520 (schematically depicted as a keyboard, but this schematic representation is not intended to be limiting) coupled to the processing circuitry to allow a user to manipulate displayed images and/or select new regions of interest, as discussed above.
  • the monitor 510 and user interface 520 are combined in the form of a touch-screen monitor.

Abstract

Multi-sensor compressive imaging systems may include an imaging component (for example an RF antenna, a microwave antenna, or a mmW metamaterial surface antenna) and an auxiliary sensing component (for example an EO/IR sensor). In some approaches, the auxiliary sensing component includes a structured light sensor configured to identify the location or position of an imaging target within a field of view of the imaging component. In some approaches, a reconstructed RF, microwave, or mmW image may be combined with a visual image of a region of interest to form a multi-spectral representation of the region of interest.
PCT/US2014/060080 2013-10-11 2014-10-10 Multi-sensor compressive imaging WO2015054601A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361890043P 2013-10-11 2013-10-11
US61/890,043 2013-10-11

Publications (2)

Publication Number Publication Date
WO2015054601A2 (fr) 2015-04-16
WO2015054601A3 WO2015054601A3 (fr) 2015-06-04

Family

ID=52813749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/060080 WO2015054601A2 (fr) 2013-10-11 2014-10-10 Multi-sensor compressive imaging

Country Status (1)

Country Link
WO (1) WO2015054601A2 (fr)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3226042A1 (fr) * 2016-03-30 2017-10-04 Samsung Electronics Co., Ltd Structured light generator and object recognition apparatus including the same
KR20170112915A (ko) * 2016-03-30 2017-10-12 Samsung Electronics Co., Ltd. Structured light generator and object recognition apparatus including the same
WO2017189691A3 (fr) * 2016-04-28 2017-11-30 Fluke Corporation Manipulation of 3-D RF imagery and on-wall marking of detected structure
US10209357B2 (en) 2016-04-28 2019-02-19 Fluke Corporation RF in-wall image registration using position indicating markers
US10302793B2 (en) 2016-08-04 2019-05-28 Fluke Corporation Blending and display of RF in wall imagery with data from other sensors
US10402993B2 (en) 2016-03-30 2019-09-03 Samsung Electronics Co., Ltd. Structured light generator and object recognition apparatus including the same
US10444344B2 (en) 2016-12-19 2019-10-15 Fluke Corporation Optical sensor-based position sensing of a radio frequency imaging device
US10564116B2 (en) 2016-04-28 2020-02-18 Fluke Corporation Optical image capture with position registration and RF in-wall composite image
US10571591B2 (en) 2016-04-28 2020-02-25 Fluke Corporation RF in-wall image registration using optically-sensed markers
US10585203B2 (en) 2016-04-28 2020-03-10 Fluke Corporation RF in-wall image visualization
CN110888121A (zh) * 2019-12-16 2020-03-17 上海智瞳通科技有限公司 Target detection method and apparatus, and target temperature detection method and apparatus
CN112421217A (zh) * 2020-11-19 2021-02-26 西安电子科技大学 1-bit digitally coded metamaterial antenna unit
CN113686809A (zh) * 2021-07-30 2021-11-23 北京航空航天大学青岛研究院 Pixel unit and forming method, display, and terahertz imaging system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5001286B2 (ja) * 2005-10-11 2012-08-15 Prime Sense Ltd. Method and system for object reconstruction
US7928900B2 (en) * 2006-12-15 2011-04-19 Alliant Techsystems Inc. Resolution antenna array using metamaterials
US8373608B2 (en) * 2007-12-05 2013-02-12 Honeywell International Inc. Reconfigurable antenna pattern verification
US8692708B2 (en) * 2010-03-30 2014-04-08 Sony Corporation Radiometric imaging device and corresponding method
US9450310B2 (en) * 2010-10-15 2016-09-20 The Invention Science Fund I Llc Surface scattering antennas

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10402993B2 (en) 2016-03-30 2019-09-03 Samsung Electronics Co., Ltd. Structured light generator and object recognition apparatus including the same
KR20170112915A (ko) * 2016-03-30 2017-10-12 Samsung Electronics Co., Ltd. Structured light generator and object recognition apparatus including the same
US10489924B2 (en) 2016-03-30 2019-11-26 Samsung Electronics Co., Ltd. Structured light generator and object recognition apparatus including the same
US11035548B2 (en) 2016-03-30 2021-06-15 Samsung Electronics Co., Ltd. Structured light generator and object recognition apparatus including the same
KR102629586B1 (ko) Structured light generator and object recognition apparatus including the same
EP3226042A1 (fr) * 2016-03-30 2017-10-04 Samsung Electronics Co., Ltd Structured light generator and object recognition apparatus including the same
US10209357B2 (en) 2016-04-28 2019-02-19 Fluke Corporation RF in-wall image registration using position indicating markers
US10254398B2 (en) 2016-04-28 2019-04-09 Fluke Corporation Manipulation of 3-D RF imagery and on-wall marking of detected structure
WO2017189691A3 (fr) * 2016-04-28 2017-11-30 Fluke Corporation Manipulation of 3-D RF imagery and on-wall marking of detected structure
US10564116B2 (en) 2016-04-28 2020-02-18 Fluke Corporation Optical image capture with position registration and RF in-wall composite image
US10571591B2 (en) 2016-04-28 2020-02-25 Fluke Corporation RF in-wall image registration using optically-sensed markers
US10585203B2 (en) 2016-04-28 2020-03-10 Fluke Corporation RF in-wall image visualization
US11635509B2 (en) 2016-04-28 2023-04-25 Fluke Corporation Manipulation of 3-D RF imagery and on-wall marking of detected structure
US10830884B2 (en) 2016-04-28 2020-11-10 Fluke Corporation Manipulation of 3-D RF imagery and on-wall marking of detected structure
US10302793B2 (en) 2016-08-04 2019-05-28 Fluke Corporation Blending and display of RF in wall imagery with data from other sensors
US10444344B2 (en) 2016-12-19 2019-10-15 Fluke Corporation Optical sensor-based position sensing of a radio frequency imaging device
CN110888121B (zh) * 2019-12-16 2020-11-24 上海智瞳通科技有限公司 Target detection method and apparatus, and target temperature detection method and apparatus
CN110888121A (zh) * 2019-12-16 2020-03-17 上海智瞳通科技有限公司 Target detection method and apparatus, and target temperature detection method and apparatus
CN112421217A (zh) * 2020-11-19 2021-02-26 西安电子科技大学 1-bit digitally coded metamaterial antenna unit
CN113686809A (zh) * 2021-07-30 2021-11-23 北京航空航天大学青岛研究院 Pixel unit and forming method, display, and terahertz imaging system

Also Published As

Publication number Publication date
WO2015054601A3 (fr) 2015-06-04

Similar Documents

Publication Publication Date Title
US10109080B2 (en) Multi-sensor compressive imaging
WO2015054601A2 (fr) 2013-10-11 2014-10-10 Multi-sensor compressive imaging
EP1798570A2 (fr) System and method for vertical microwave imaging
Hunt et al. Metamaterial microwave holographic imaging system
EP2204671B1 (fr) Camera-assisted sensor imaging system and multi-aspect imaging system
JP5695286B2 (ja) Microwave imaging system and method using a programmable transmission array
US20180292529A1 (en) Vehicle based radar upsampling
EP1662275B1 (fr) System and method for security inspection using microwave imaging
US20080079625A1 (en) System and method for stereoscopic anomaly detection using microwave imaging
WO2015075072A1 (fr) Monitoring apparatus with an optical camera and a radar sensor
JP2006267102A (ja) System and method for inspecting transportable items using microwave imaging
US20110102235A1 (en) Identification of potential threat materials using active electromagnetic waves
KR20150042746A (ko) Metamaterial devices and methods of using the same
US20070139249A1 (en) Handheld microwave imaging device
US20230314592A1 (en) Electronic Devices With Radar
US20210364629A1 (en) Improvements in or relating to threat classification
US11067687B2 (en) Multipath acoustic holography and virtual haptics
WO2019215454A1 (fr) Improvements in or relating to threat classification
KR20200136130A (ko) Hidden-camera detection apparatus based on radio and video signals
AU2015347259A1 (en) Polarization-based mapping and perception method and system
WO2016076936A2 (fr) Polarization-based mapping and perception method and system
AU2016216608B2 (en) A monitoring device and system
RU2269811C2 (ru) Device for obtaining microwave holograms and visualizing the reconstructed image
US20210149047A1 (en) Standoff detection system
Sešek et al. A THz tomography imaging system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14851707

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14851707

Country of ref document: EP

Kind code of ref document: A2