EP4348301A1 - Systems and methods for noninvasive detection of impermissible objects using decoupled analog and digital components - Google Patents

Systems and methods for noninvasive detection of impermissible objects using decoupled analog and digital components

Info

Publication number
EP4348301A1
Authority
EP
European Patent Office
Prior art keywords
article
housing
equipment
components
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22811818.8A
Other languages
German (de)
English (en)
Inventor
Hatch Graham
Ehsan Afshari
Karl TRIEBES
Ryan KEARNY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lassen Peak Inc
Original Assignee
Lassen Peak Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lassen Peak Inc filed Critical Lassen Peak Inc
Priority claimed from PCT/US2022/027178 (WO2022250862A1)
Publication of EP4348301A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/887 Radar or analogous systems specially adapted for detection of concealed objects, e.g. contraband or weapons
    • G01S13/89 Radar or analogous systems specially adapted for mapping or imaging
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/0209 Systems with very large relative bandwidth, i.e. larger than 10 %, e.g. baseband, pulse, carrier-free, ultrawideband
    • G01S13/06 Systems determining position data of a target
    • G01S13/08 Systems for measuring distance only
    • G01S13/32 Systems for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S13/34 Systems for measuring distance only using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
    • G01S13/343 Systems for measuring distance only using sawtooth modulation
    • G01S2013/0236 Special technical features
    • G01S2013/0245 Radar with phased array antenna
    • G01S2013/0254 Active array antenna

Definitions

  • consensual search is a search in which an individual either implicitly or explicitly gives consent for a search to be conducted as a condition for something else, like entry into a sporting venue, or prior to boarding an airplane.
  • a consensual search is not considered a detention as the individual is free to leave at any time or can refuse to answer questions.
  • Although law enforcement occasionally uses a consensual search when permission is granted by a subject who is not yet a suspect, the more common and pervasive use case of consensual searches is to prevent unwanted items such as guns or alcohol from being brought into buildings, schools, sporting or other events, airports, voting facilities, court rooms, and other venues.
  • Airports are an area of particular concern. Security at airports can include expensive equipment like millimeter wave scanners and backscatter x-ray scanners.
  • the millimeter wave scanner is a large, fixed device sized and configured to allow a passenger to stand inside, with feet apart and hands over their head, while the device creates a full-body scan that is reviewed by a TSA agent.
  • Backscatter x-ray scanners subject users to mutagenic x-rays and can produce revealing full-body images of passengers that are embarrassingly and unnecessarily obtrusive, and that need to be reviewed by a TSA agent.
  • Embodiments of the present invention involve breakthrough innovations to revolutionize how both Terry frisks and consensual searches are conducted.
  • Embodiments include portable, high-resolution imaging methods and devices that are capable of detecting objects hidden under, for example, people’s clothing, or within a bag, or elsewhere.
  • a user can safely conduct a weapons search without being in physical contact with the subject being searched.
  • the components that drive the apparatus can be physically divided while remaining electrically coupled.
  • the analog components of the radar scanner can be placed in one housing that can more easily be mounted on various pieces of equipment, and the elements responsible for digital processing can be carried elsewhere while being in communication with the analog components.
  • Embodiments of the invention include hardware implementations dividing components by grouping and placing certain components within distinct physical housings with appropriate form factors and appropriate physical and electrical connections to allow, for example, for the use of the scanner mounted on a helmet or a riot shield.
  • a first housing that includes within it a set of analog imaging components of a portable radar system with both a ranging resolution and lateral resolution sufficient to detect an object concealed on a person, and also a second housing that includes within it a set of digital processing components in communication with at least a subset of the set of analog imaging components, the digital processing components configured to receive imaging information for processing, the second housing being a different housing from the first housing.
  • the first housing is configured to attach to a user’s first article of portable equipment, while the second housing is configured to attach to the user’s second article of portable equipment in a way that is separate from the first housing.
  • FIG. 1 is a block diagram of a system for providing a noninvasive imaging and detection system, according to an embodiment of the invention.
  • FIG. 2 is a flow chart of a process for noninvasive concealed-object detection, according to an embodiment of the invention.
  • FIG. 3 is a flowchart of a method for creating a dataset from images taken by a non-invasive scanner, the dataset being appropriate for post processing and use in imaging and detection, according to an embodiment of the invention.
  • FIG. 4 is a flowchart of a method for processing a final image, according to an embodiment of the invention.
  • FIG. 5 is a block diagram of a schematic for a radar system on a chip (RSOC), according to an embodiment of the invention.
  • FIG. 6 is a block diagram of a system for providing a noninvasive imaging and detection system, in which the radar components are placed in a housing physically separate from the processing components.
  • FIG. 1 is a block diagram of a system for providing noninvasive imaging and detection.
  • a radar with a ranging resolution and lateral resolution sufficient to search and detect an object concealed on a person.
  • the following discussion is an example of one way of performing this search.
  • the system comprises a coherent radar system on a chip 101, in communication with a core processing system 102.
  • the core processing system 102 includes a processor 103 and custom logic 104.
  • the coherent radar system on a chip is configured to provide both ranging resolution and lateral resolution that is orders of magnitude greater than is found in the prior art.
  • ranging resolution, which refers to the quantifiable distance to an object, is directly related to the bandwidth (fmax - fmin), where the available bandwidth is typically 5% - 15% of the transmitted center frequency.
  • the ranging resolution may be used to distinguish distances in the sub-millimeter range.
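
For concreteness, the bandwidth-to-ranging-resolution relationship described above can be sketched numerically. The standard FMCW relation ΔR = c / (2B) is assumed here, and the center frequency and bandwidth fraction below are illustrative values, not figures from the patent:

```python
# Range resolution of a swept-frequency radar: delta_R = c / (2 * B).
# The patent gives the available bandwidth as 5-15% of the transmitted
# center frequency; a 0.5 THz center with a 10% sweep is assumed here.
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Smallest resolvable range difference between two targets, in meters."""
    return C / (2.0 * bandwidth_hz)

center_hz = 0.5e12               # 0.5 THz carrier (assumed)
bandwidth_hz = 0.10 * center_hz  # 10% of center frequency = 50 GHz

print(range_resolution(bandwidth_hz))  # ~3.0e-3 m, i.e. about 3 mm
```

By this relation, sub-millimeter ranging implies swept bandwidths above roughly 150 GHz, which the quoted 5% - 15% fractions make plausible at THz center frequencies.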
  • Lateral resolution relates to the quantifiable distance between samples of perpendicular cross section (e.g., side to side and top to bottom).
  • lateral resolution relates to feature resolution of a scan.
  • the transmitted signal is swept across the target (i.e., the target is scanned)
  • the resultant received signal is processed to show variations in reflectivity from the scanned target.
  • These variations can be processed by using standard techniques such as, but not limited to, a Fast Fourier Transform (FFT) to produce an image.
  • the feature size, or resolution, of the image is directly proportional to the wavelength of the emitted source, where a shorter wavelength provides increased resolution. Another way to describe this is to say lateral resolution is a function of both beam width and steering.
  • Beam width is a function of wavelength divided by antenna dimension. As the frequency of the beam increases, its wavelength decreases, and hence the beam width decreases. In addition, the more antenna elements found on the chip, the larger the dimension, and thus the tighter the beam width. The tighter the beam width, the higher the resolution of distinguishing cross-sectional differences. Thus, in the THz range where the chip operates, the device can provide sub-millimeter lateral resolution. Coherence is used to achieve high receiver sensitivity, and allows for recovery of the difference in frequency between transmit and source. This high receiver sensitivity obviates the need for transmitting a signal on the order of >1,000x, or 30 dB, higher in power, which would not allow for a single-chip implementation of the radar.
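
The wavelength-over-aperture beam-width relationship above can be made concrete with a short calculation. The frequency and aperture below are assumptions chosen for illustration, not dimensions from the patent:

```python
# Beam width ~ wavelength / antenna dimension (in radians), per the
# description above. Numbers are illustrative assumptions only.
C = 299_792_458.0  # speed of light, m/s

def beam_width_rad(freq_hz: float, aperture_m: float) -> float:
    """Approximate beam width of an aperture antenna, in radians."""
    wavelength = C / freq_hz
    return wavelength / aperture_m

# A 0.5 THz source (~0.6 mm wavelength) and a 10 mm on-chip array aperture:
bw = beam_width_rad(0.5e12, 0.010)
print(bw)  # ~0.06 rad (~3.4 degrees)
```

Doubling either the frequency or the aperture halves the beam width, which is the scaling argument the passage above makes for THz operation and many-element arrays.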
  • Radar System 101 includes a lens 120 that is configured to provide a consistent focal length and beam width over a large range of the radar’s scan angle.
  • lens 120 can be a Luneberg lens of the type or types described in U.S. Patent Application No. 63/161,323, the contents of which are hereby incorporated in their entirety.
  • core processing system 102 includes processor 103 and custom logic 104.
  • Processor 103 is configured to process instructions to render or display images, initiate a scan, process the results of a scan, alert the user, and provide the results of an object match, if any, to the user.
  • Processor 103 can be any of a variety and combination of processors, and can be distributed among various types and pieces of hardware found on the apparatus, or can include hardware distributed across a network.
  • Processor 103 can be an ARM (or other RISC-based) processor.
  • processors can be implemented, for example, as hardware modules such as embedded microprocessors, Application Specific Integrated Circuits (“ASICs”), and Programmable Logic Devices, including flash memory (“PLDs”). Some such processors can have multiple instruction-executing units or cores. Such processors can also be implemented as one or more software modules in programming languages such as Java, C++, C, assembly, a hardware description language, or any other suitable programming language.
  • a processor according to some embodiments includes media and program code (which also can be referred to as code) specially designed and constructed for the specific purpose or purposes.
  • Custom logic 104 can include one or more Field Programmable Gate Array(s) (FPGA) or any type of PLD for custom logic to support processing offload from Processor 103.
  • processing offload includes digital signal processing and digital beam forming.
  • In communication with coherent radar system 101 and core processing system 102 are the systems and communications circuits 105, comprising wireless communications circuits 106, memory 107, power source 108, and an external electrical connection 109.
  • the components may be housed within a single housing; in an embodiment, the components, including the coherent radar system on a chip 101 and the memory 107, may be stored in separate housings as a need arises to separate chip 101 from memory 107.
  • Wireless communications circuits 106 can include any practicable wireless communications circuits including, but not limited to, a wireless fidelity (“Wi-Fi”) or wireless local area network (“WLAN”) connection, a wireless wide area network (“WWAN”) connection, a Bluetooth connection, an LTE/5G connection, and/or a cellular connection.
  • Memory 107 can be used to store, in computer code, artificial intelligence (“AI”) instructions, AI algorithms, a catalog of images, device configuration, an allowable, calculated, or predetermined user workflow, conditions for alerting, device status, device and scanning configuration, and other metadata resulting from the scanning process.
  • Memory 107 can be a read-only memory (“ROM”); a random-access memory (“RAM”) such as, for example, a magnetic disk drive, and/or solid-state RAM such as static RAM (“SRAM”) or dynamic RAM (“DRAM”), and/or FLASH memory or a solid-state disk (“SSD”); or a magnetic disk; or any known type of memory.
  • a memory can be a combination of memories.
  • a memory can include a DRAM cache coupled to a magnetic disk drive and an SSD.
  • Memory 107 can also include processor-readable media such as magnetic storage media (hard disks, floppy disks, and magnetic tape); optical storage media such as Compact Discs/Digital Video Discs (“CD/DVDs”), Compact Disc-Read Only Memories (“CD-ROMs”), and holographic devices; magneto-optical storage media such as floptical disks; solid-state memory such as SSDs and FLASH memory; and ROM and RAM devices and chips.
  • Power source 108 can include any type of practicable battery, including but not limited to, Lithium-ion, Nickel Cadmium, Nickel-Metal Hydride, and alkaline. Power source 108 can comprise an external power source coupled to circuitry internal to the device. USB connection 109 can be used to put the apparatus in communication with a network, or can be used to provide an electrical connection to charge or power the apparatus.
  • the apparatus further includes User Controls 110.
  • User Controls 110 include user buttons 111 to manipulate the apparatus: to turn the device on and off, set the resolution, configure the device or select a preconfigured setting, initiate a scan, initiate a connection with the cloud-based service via one of the network interfaces (e.g., Wi-Fi, cellular, Bluetooth, or any other practicable interface), and control the camera functions.
  • Camera 112 is configured to capture optical images
  • a microphone and speaker 113 are configured to facilitate communication, including communication to third parties, or communication with the device through voice or audio commands, and for the device to provide sound to the user such as one or more alarms or notifications.
  • Display panel 114 can be an LCD or other type of display panel configured to display messages to the user, or to provide images representing the results of a scan.
  • the apparatus comprises major program-code components
  • Program-code components 116 can include, but are not limited to, micro-code or micro-instructions, machine instructions (such as produced by a compiler), and files containing higher-level instructions that are executed by a computer using an interpreter.
  • Program code can include hardware, software, firmware, and any practical way of coding. For example, an embodiment may be implemented using HTML, Java, C++, or other object-oriented programming language and development tools. Additional examples of program code include, but are not limited to, control signals, encrypted code, and compressed code.
  • Major program code can include, but is not limited to, a standard operating system (e.g., Linux), hardware drivers for software-managed hardware elements, machine-learning inferencing, image processing, image storage and retention, cloud-service interface, scanning process, user interface, device management, cryptographic functions, user access management, and device health.
  • FIG. 5 is a block diagram for a schematic of a radar system on a chip (RSOC) used in an apparatus, according to the present invention.
  • the RSOC includes all the elements described with regard to FIG. 5 on a single chip (with the exception of ADC 509, addressed below).
  • the RSOC transmits the high-frequency signals via TX antenna 504, and receives the reflected signal via RX antenna 505, to produce a baseband analog signal that is digitized by an external analog-to-digital converter (ADC 509) and processed by digital processing logic and a CPU to produce a visible image of the scanned target.
  • the RSOC consists of two major functions: 1) a transmitter that produces the radar signal and initiates the scan, and 2) a receiver that receives the reflected signal, recovers differential phase and frequency information, and provides that information to the digital processing system.
  • Transmitter 520 consists of four major functional components: Ramp Generator 501, Wide-Band Voltage-Controlled Oscillator (VCO) 502, Directional Coupler 503, and phased-array element array 504.
  • Ramp Generator 501 is configured to provide a voltage signal to Wide Band VCO 502, which controls the center frequency of the VCO, nominally centered between approximately 0.1 and 1 THz.
  • Ramp Generator 501 is configured to move the center frequency of Wide Band VCO 502 over a predetermined frequency range, creating a frequency-sweeping action to produce the radar scan.
  • Ramp Generator 501 can generally produce a sawtooth voltage waveform; however, other waveforms, such as ramp, sinusoid, flat, or combinations thereof, may be employed as well.
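
The sawtooth control voltage described above can be sketched as a simple function of time. The sweep period and voltage range below are illustrative assumptions, not values from the patent:

```python
# Sketch of the sawtooth control voltage a ramp generator might feed the
# VCO, sweeping the center frequency across the radar bandwidth.
def sawtooth(t: float, period: float, v_min: float, v_max: float) -> float:
    """Voltage at time t for a rising sawtooth between v_min and v_max."""
    frac = (t % period) / period       # position within the current sweep
    return v_min + frac * (v_max - v_min)

period = 1e-3   # 1 ms sweep (assumed)
samples = [sawtooth(i * 1e-4, period, 0.0, 5.0) for i in range(10)]
print(samples)  # rises linearly from 0.0 toward 5.0, then wraps to 0.0
```

Each linear rise corresponds to one frequency sweep of the VCO; the instantaneous wrap back to `v_min` is what distinguishes the sawtooth from a symmetric triangle ramp.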
  • the signal from Wide Band VCO 502 can be implemented to produce low phase noise, thus improving receiver sensitivity.
  • the signal from Wide Band VCO 502 can then be provided to Directional Coupler 503, which can create at least two coherently related identical versions of the input signal.
  • One of the two versions of the input signal is provided to the sub-harmonic mixer as a coherent reference, and the other version of the input signal is provided to the phased array element antenna.
  • Each element in the system acts as an antenna and employs a phase-locked oscillator coherently related to the signal from Wide Band VCO 502 to ensure a fixed phase relationship between adjacent transmitting elements, which can be used, for example, to attenuate unwanted sidelobes.
  • the high frequency energy produced by the elements is focused using an external radar lens (not shown), generally implemented as a hemispherical component of radar transmissive material, to scan the target and create the reflected high frequency energy to be received by Receiver 530.
  • Receiver 530 consists of five major functional elements: 1) Receive Antenna (RX Antenna) 504; 2) Sub-Harmonic Mixer 505; 3) Low-Noise Amplifier (LNA) 506; 4) Band-Pass Active Filter 507; and 5) Variable-Gain Amplifier (VGA) 508.
  • Receive Antenna 505 is configured to receive the reflected signal broadcast by the transmitter and reflected from the target.
  • RX Antenna 504 may be implemented as a dipole antenna, or by any other practicable antenna configuration.
  • the signal received at RX antenna is provided to the sub-harmonic mixer, which can then create sum and difference frequencies based on the reference signal provided by the transmitter.
  • LNA 506 is used to amplify the signal as required by Band Pass Active Filter 507.
  • Band Pass Active Filter 507 filters off undesirable harmonics created by Sub-Harmonic Mixer 505.
  • active refers to the use of active elements, such as linearly biased transistors, in conjunction with reactive and passive elements to provide the bandpass filter with minimized or reduced noise and phase distortion of the passed signal.
  • VGA 508 receives the signal from the band-pass filter, amplifies it, and provides the necessary impedance matching for external ADC 509.
  • In an embodiment, ADC 509 is implemented functionally on the RSOC; in another embodiment, ADC 509 is implemented external to the RSOC.
  • FIG. 2 is a flow chart of a method for using a non-invasive scanner for creating images useful for imaging and detection.
  • Prior to use, in an embodiment, the apparatus will have, and will be in, one of a set of operational modes and/or states, including a low-power or standby mode, a synching mode, and an off mode.
  • a user can generally tell, based on the apparatus’s display, whether the apparatus is in an operational mode or not.
  • the apparatus will be able to show the local user which state the apparatus is in via LEDs, a local LCD panel, or an audible warning. If the apparatus is in an off mode, then the apparatus is powered off and does not perform any scanning.
  • the apparatus can be in a state that requires user interaction to set up the apparatus in sync mode and connect it to an online network for backup and additional functionality such as uploading data and metadata.
  • the apparatus can be set to sync automatically through the online network.
  • the apparatus can send and receive operational control parameters such as a cryptographic device key for device or user login to the system; user-configuration data detailing, for example, who is using the apparatus and what organization or department the user belongs to; updates to the machine-learning inferencing engine; and relevant (e.g., user or departmental) policies and controls, including general policies on alert, event, and trigger actions.
  • the operational control parameters can include information detailing how full the device disk is, and whether upload is required.
  • the machine-learning inferencing engine is the process that performs the object pattern matching and subsequent identification.
  • it can be implemented in software and accelerated using an FPGA.
  • it can be implemented in hardware.
  • it can be implemented in any practicable combination of hardware and software.
  • the apparatus is operational and ready for use.
  • network access exists, along with a live connection to any related network services.
  • no network access exists.
  • the apparatus can include sufficient local storage and processing power for operating independent of a network.
  • the apparatus can further include a timer along with a device key to allow a user to use the apparatus as long as the timer has not timed out, thus ending the user session on the apparatus.
  • other modes that can be used by the apparatus include active-target-acquisition mode, and active-non-physical-search-in-process mode.
  • in active-target-acquisition mode, the apparatus will show or relate the field of view to the user with an active camera while preparing to go to state 5.
  • State 5 defines the system being in the active state of a non-physical search. In this state, the apparatus imaging system pipeline and real-time alerts and notifications are active.
  • the user initiates a non-physical search of a subject.
  • the initiation of the non-physical search can begin with a user setting up a subject between 5 and 10 feet away from the apparatus. The subject can then be asked to look at the user and/or the apparatus. The user can then point the apparatus toward the subject and turn on the scanning function of the device via a button, trigger, voice control, or other control switch.
  • the apparatus scans the subject.
  • the radar system on a chip generates a radar signal and sweeps a predetermined field of view, emitting a radar signal in the 0.1 to 1 THz range.
  • the apparatus employs a phased array antenna in conjunction with a voltage-controlled oscillator (VCO) to steer the emitted beam to transmit electromagnetic radiation and deterministically illuminate the subject of the scan, according to an embodiment.
  • the emitted signal interacts with the subject, and a certain amount of the electromagnetic radiation is reflected back and received by an antenna on the apparatus.
  • the received signal is coherently mixed with the transmitted signal allowing differential phase and amplitude information to be recovered.
  • the transmit signal is combined, or mixed, with the returning signal allowing for recovery of frequency and phase information in the receive signal.
  • the analog signal from the scan is converted to a digital format using one or more analog-to-digital converters (ADCs) to create a digital image that can be forwarded to the processing complex of the apparatus.
  • the process of scanning and creating an image can be repeated a predetermined number of times (programmed into the apparatus or selected by the user) creating multiple digital images.
  • the multiple images are sent to the processor, and in 205, the multiple images are combined in the processor to enhance resolution, creating a super image.
  • the steps of this super imaging process are detailed in Fig. 3, discussed below.
  • the feature resolution of the image is enhanced, thus improving the chances for object recognition in 206.
  • the image can be rendered on a screen on the device.
  • the image can be rendered on a smartphone or other mobile device. When rendered or displayed, the image can contain the original visual image of the target with representations of objects found.
  • the multiple images can also be combined to create a video stream.
  • the device can provide a three-dimensional rendering of the image.
  • different colors are used to indicate the threat level of the detected object.
  • a red outline displayed on the apparatus can indicate the presence and position of a gun.
  • a green outline can be used to indicate the presence of keys, or some other equally innocuous object.
  • an image of an identified object, or a representation thereof, can be superimposed on a representation of the scanned target.
  • the representation can be an outline of the scanned target, e.g., a generic outline of a human form, over which the image representing the identified object can be placed, providing the user with information about the positioning of the object on the subject’s body, in addition to detailing the threat level of the object.
  • the representation of the scanned target can take the form of a variety of zones displayed on a screen positioned on the apparatus, or on a mobile device in communication with the apparatus.
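
As a sketch of the color-coded threat display described above (red for a gun, green for innocuous objects), a hypothetical class-to-color lookup might look like the following; the object classes and the yellow fallback for unknown classes are assumptions, not part of the patent:

```python
# Hypothetical mapping from detected-object class to display outline color,
# following the red-for-weapon / green-for-innocuous scheme described above.
THREAT_COLORS = {
    "gun": "red",
    "knife": "red",
    "keys": "green",
    "phone": "green",
}

def outline_color(object_class: str) -> str:
    """Color used to outline a detected object; yellow for unknown classes."""
    return THREAT_COLORS.get(object_class, "yellow")

print(outline_color("gun"))     # red
print(outline_color("wallet"))  # yellow
```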
  • This processing can include all or some of the following: tagging images or videos with metadata, gathering and uploading metadata, generating a report, providing a digital signature or certificate, archiving, and uploading the data (both received and processed) and metadata.
  • images can be cryptographically tagged with various metadata and transmitted and stored on the device, or can be uploaded to a data repository (e.g., a cloud-based database or an online server) for further processing.
  • metadata can include (but are not limited to) time stamps, geolocation data, device data, customer specific information (user, associated visual images), networked or connected devices, voice recordings, and session information.
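
One way the cryptographic tagging of images with metadata might be sketched is an HMAC over the serialized metadata record. The algorithm choice (HMAC-SHA256), the field names, and the device key below are all assumptions; the patent does not specify a scheme:

```python
# Sketch of cryptographically tagging a scan record with its metadata.
# HMAC-SHA256 over canonically serialized metadata is one simple, assumed
# approach that binds the record to a device key.
import hashlib
import hmac
import json

def tag_metadata(metadata: dict, key: bytes) -> str:
    """Return a hex digest binding the metadata to a device key."""
    payload = json.dumps(metadata, sort_keys=True).encode("utf-8")
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

record = {
    "timestamp": "2022-05-01T12:00:00Z",   # hypothetical values
    "geolocation": [37.77, -122.42],
    "device_id": "scanner-0001",           # hypothetical identifier
}
digest = tag_metadata(record, key=b"device-secret")
print(len(digest))  # 64 hex characters for SHA-256
```

Any later change to the timestamp, geolocation, or other fields produces a different digest, so the tag also serves as a tamper check on the stored metadata.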
  • a web-based service can be implemented using public cloud infrastructure and services such as those provided by (but not limited to) AWS, Azure, and GCP.
  • Fig. 3 is a flowchart of a method for creating a dataset of images to be used for imaging and detection, according to an embodiment.
  • one or more images are taken.
  • the images are sent to a processor for processing.
  • the image or images received at the processor are increased in size by a predetermined amount, creating a set of larger images, at 303.
  • the images are increased in size to achieve finer blending of the image stack, in order to extract the high-frequency data that is embedded in the low-frequency data hidden in the aliasing.
  • At 304 at least a subset of images in the set of larger images are aligned, according to an embodiment.
  • the layers are averaged with linear opacities 1, .5, .25, .125, and so on, allowing the images, in an embodiment, to be blended evenly while making use of the aliasing.
  • the image stack, i.e., the plurality of images being combined, is sharpened using a predetermined radius.
  • the final super image is resized.
  • the output can be resized to any desirable size using any practicable resampling method that provides an appropriate image.
  • the super image is used to create the final image (seen in 206 from Fig. 2). Once the super image is created, the image is further processed, as detailed in Fig. 4, discussed below.
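The Fig. 3 pipeline described above (upscale, align, blend with halving opacities, sharpen, resize) could be sketched as follows. This is a toy illustration, not the patented method: the nearest-neighbor upscaling, the unsharp-mask sharpening, the 8-bit clipping range, and block-average downsampling are all stand-in assumptions, and alignment is assumed to have been done upstream.

```python
import numpy as np

def build_super_image(frames, scale=4, sharpen_radius=1):
    """Toy sketch of the Fig. 3 steps on a stack of pre-aligned 2-D arrays."""
    # 1. Upscale each frame by a predetermined factor (nearest-neighbor
    #    replication keeps the aliasing that the blending step exploits).
    big = [np.kron(f, np.ones((scale, scale))) for f in frames]
    # 2. Alignment is assumed done upstream; frames here are pre-registered.
    # 3. Blend with linear opacities 1, .5, .25, ...: compositing each new
    #    layer at 50% opacity yields exactly that weighting of the stack.
    acc = big[0].astype(float)
    for layer in big[1:]:
        acc = 0.5 * acc + 0.5 * layer
    # 4. Sharpen with a simple unsharp mask of the given radius.
    blur = acc.copy()
    for _ in range(sharpen_radius):
        blur = (np.roll(blur, 1, 0) + np.roll(blur, -1, 0)
                + np.roll(blur, 1, 1) + np.roll(blur, -1, 1)) / 4
    sharp = np.clip(acc + (acc - blur), 0, 255)  # assumes 8-bit value range
    # 5. Resize back down by block averaging (any practicable resampler works).
    h, w = sharp.shape[0] // scale, sharp.shape[1] // scale
    return sharp.reshape(h, scale, w, scale).mean(axis=(1, 3))
```

In practice each numbered step would use a production-grade resampler and registration routine; the sketch only shows how the stages chain together.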
  • Fig. 4 is a flow chart of a method for processing the existing data to create a final image.
  • an optical image is created and mapped to the super image creating a filtered image.
  • the apparatus uses a separate camera to create an optical image used as a base image configured to be mapped to the super image, according to an embodiment.
  • the separate camera is a digital camera using a CCD sensor, or a CMOS sensor, or any practicable sensor.
  • the filtered images are encrypted, while the unfiltered image data is discarded.
  • the encryption can be performed using SSL or TLS secure encryption, or any practicable encryption.
  • the apparatus stores some or all of the filtered image locally.
  • the apparatus stores some or all of the filtered image in a backend cloud service where it can be archived or undergo additional processing, or both.
  • the super image is analyzed to determine whether any objects of note are present on the subject, and if so, the super image is normalized for processing.
  • normalizing the super image means preprocessing it into a format, or with information, appropriate to feed an artificial intelligence system. This preprocessing can include (but is not limited to) scaling to a fixed width and height, conversion of the bit depth, and shifting and/or rotation of the image.
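The normalization step above could be sketched as follows; the fixed 64×64 size, the nearest-neighbor resize, and the min-max bit-depth mapping are illustrative choices, not requirements of the patent.

```python
import numpy as np

def normalize_for_ai(image, width=64, height=64, bit_depth=8):
    """Preprocess a super image into the fixed-size, fixed-depth array an
    AI engine might expect (dimensions here are illustrative)."""
    img = np.asarray(image, dtype=float)
    # Scale to a fixed width and height via nearest-neighbor index mapping.
    rows = np.arange(height) * img.shape[0] // height
    cols = np.arange(width) * img.shape[1] // width
    resized = img[np.ix_(rows, cols)]
    # Convert bit depth: map the observed value range onto [0, 2**bit_depth - 1].
    lo, hi = resized.min(), resized.max()
    span = (hi - lo) or 1.0  # avoid division by zero on flat images
    return ((resized - lo) / span * (2 ** bit_depth - 1)).astype(np.uint16)
```

Shifting and rotation, also mentioned above, would slot in before the resize using the same array operations.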
  • the processing can be performed by an artificial intelligence (AI) system.
  • AI artificial intelligence
  • the resultant image is transferred to an AI engine for pattern matching against known threats, which then calculates the likelihood that the input data represents a threat.
  • the apparatus performs an image search to match detected shapes against a prebuilt local image threat library, or a mathematical model representing such images, and makes a threat determination using parameters such as shape type, size, type of weapon, confidence level, contrast, and other parameters.
  • Entries in the threat library can include some or all of the following: guns, knives, bombs and bomb vests, clubs, truncheons, bottles, and other objects of interest. In an embodiment, once a preliminary determination has been made that a weapon is suspected, the apparatus will focus in on the suspected weapon(s), providing better image resolution to improve detection confidence. In an embodiment, privacy filtering processing is applied, thus ensuring all locally stored body images are obfuscated as part of the image processing described in Fig. 3.
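A threat-library lookup using parameters like those listed above could be sketched as follows; the entries, the shape labels, and the contrast-as-confidence rule are toy assumptions standing in for the learned models the patent contemplates.

```python
# Hypothetical threat-library entries; a real library would hold images or
# mathematical models rather than these toy shape/size parameters.
THREAT_LIBRARY = [
    {"name": "handgun", "shape": "L",       "min_cm": 10, "max_cm": 30, "threat": "high"},
    {"name": "knife",   "shape": "blade",   "min_cm": 5,  "max_cm": 25, "threat": "high"},
    {"name": "keys",    "shape": "cluster", "min_cm": 2,  "max_cm": 10, "threat": "none"},
]

def match_threat(shape, size_cm, contrast, min_confidence=0.5):
    """Return (entry, confidence) for the best library match, or (None, 0.0)."""
    best, best_conf = None, 0.0
    for entry in THREAT_LIBRARY:
        if shape != entry["shape"]:
            continue
        if not (entry["min_cm"] <= size_cm <= entry["max_cm"]):
            continue
        # Confidence here is a stand-in: weight the detection contrast directly.
        conf = min(1.0, contrast)
        if conf > best_conf:
            best, best_conf = entry, conf
    if best_conf < min_confidence:
        return None, 0.0
    return best, best_conf
```

The returned threat level could then drive the colored outlines described earlier (e.g., red for "high", green for "none").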
  • FIG. 6 is a block diagram of a system for providing a noninvasive imaging and detection system, in which the radar components are placed in a housing physically separate from the processing components. Such configurations can be used to attach or integrate the device with conventional safety equipment in a way that is ergonomic and functional.
  • analog image-capture components are placed within radar system housing 601, while the core digital processing system for the overall apparatus is placed within digital processing housing 602 distinct from housing 601.
  • the two housings are in communication with each other via wire 603 (denoted by a dashed line in Fig. 6). Communication can also take place wirelessly using Wi-Fi, Bluetooth, or any practicable wireless protocol.
  • Wi-Fi Wireless Fidelity
  • the radar system housing includes camera 604, microphone (mic) & speaker 605, a high-resolution radar such as the 300 GHz Coherent Radar System on a Chip 606, and the Lens 607.
  • Digital Processing Housing 602 includes Processor 608 and Custom Logic 609.
  • Processor 608 can be partially located within the Radar System Housing 601 and partially located within the Processor Housing 602.
  • the radar system housed in Radar System Housing 601 is configured to, when in operation, generate analog information that is locally (within Radar System Housing 601) converted to a digital format through Analog to Digital Converter (ADC) circuitry 615, and then formatted for transmission from Radar System Housing 601 to Digital Processing Housing 602.
  • ADC Analog to Digital Converter
  • Such formatting can include (but is not limited to) serializing or multiplexing output data streams that are configured to be sent wirelessly or via wire connection (depending on the formatting) to the components within Digital Processing Housing 602 as digital data.
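The serializing of digitized radar output for transmission between the two housings could be sketched as a simple framed format; the magic bytes, header layout, and CRC trailer here are illustrative assumptions, not a format specified by the patent.

```python
import struct
import zlib

# Hypothetical frame: 4-byte magic, channel id, sample count, then the
# little-endian 16-bit samples, then a CRC-32 trailer for integrity.
HEADER = struct.Struct("<4sHI")

def serialize_samples(channel, samples):
    """Pack digitized ADC samples into a checksummed frame for the wire."""
    payload = struct.pack("<%dH" % len(samples), *samples)
    header = HEADER.pack(b"RDR0", channel, len(samples))
    return header + payload + struct.pack("<I", zlib.crc32(header + payload))

def deserialize_samples(frame):
    """Validate and unpack a frame produced by serialize_samples."""
    magic, channel, count = HEADER.unpack_from(frame)
    body = frame[:-4]
    crc = struct.unpack_from("<I", frame, len(frame) - 4)[0]
    if magic != b"RDR0" or zlib.crc32(body) != crc:
        raise ValueError("corrupt frame")
    samples = struct.unpack_from("<%dH" % count, frame, HEADER.size)
    return channel, list(samples)
```

Frames like these could be carried over the wire 603 or a wireless link equally well, which is the point of converting to digital form inside the radar housing.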
  • Radar System Housing 601 may also include a digitally controlled Ramp Generator 610 configured to receive a digital input from the components in Digital Processing Housing 602 to control an analog ramp generator output that is used to drive a transmission source within Radar System Housing 601.
  • a signal to control Ramp Generator 610 is transmitted from the Digital Processing Board contained within Digital Processing Housing 602 via a cable or wireless communication path 603.
  • the Ramp Generator 610 is configured to drive Voltage Controlled Oscillator 611, which is used to create radar waves via Radar System 606, to be transmitted to a target through Lens 607.
  • the received waves are processed through Mixer/Processing Circuitry 612, and the output from this Mixer/Processing Circuitry 612 is fed to Multiplexer 613, which is configured to send the multiplexed signal to Serializer 614, and ultimately to Analog to Digital Converter (ADC) 615.
  • ADC Analog to Digital Converter
  • the radar system housing can be made small enough to be attached in a forward-facing configuration to a user’s protective helmet, such as a riot helmet or a military helmet, while the processor housing can be placed on the back of the helmet, or attached to some other piece of equipment or an article of clothing of the user.
  • the helmet will have a visor that includes a heads-up display on which can be displayed a view of the camera, or of the radar scanning field, or both.
  • the radar system housing can be configured to be attached in a forward-facing configuration to a protective shield such as a riot shield, a two-way radio, a flashlight, a conducted energy weapon (CEW) such as a TASER®, or a firearm, while the processor housing can be configured to be attached to the back of the shield for protection, or to be worn by the user on the user’s clothing or body armor, or placed remotely in a secure environment (e.g., a command vehicle, operations center, or with remote personnel) to be viewed via a handheld device, tablet, laptop, or other display unit.
  • the shield can include a heads-up display on which can be displayed a view of the camera, or of the radar scanning field, or both.
  • analog image-capture components are placed within a body-worn camera, while the processor and other digital components are housed elsewhere.
  • the body-worn camera can substitute for the camera associated with the radar.
  • the radar components can be attached to or integrated with an unmanned aerial vehicle (colloquially known as a drone or UAV).
  • the radar components can be set forward facing on the front or underside of the UAV, and the processing can be performed by a remote processing unit in communication with the UAV via wireless communication.
  • the remote processing unit can be placed on the UAV in communication with the radar components, or (as with any embodiment) the components of the remote processing unit can be divided up between the UAV and a remote site.
  • the radar components can be attached to or integrated with a ground-based robot.
  • the radar components can be set forward facing on the robot, or on a revolving turret, and the processing can be performed by a remote processing unit in communication with the radar components.
  • the processing unit may be placed in a rear or shielded section of the robot for safety, or can be placed in a remote location such as a command center or with a person controlling or otherwise monitoring the robot.
  • the radar-scanning components may be placed in a housing that is sized and shaped to be mounted in a forward-facing configuration on a carrier’s helmet, while the processing electronics may be placed in a housing that is sized and shaped to be placed behind the helmet, or on the carrier’s body, or on some other body part.
  • the radar system can be mounted on a different article of portable equipment of a user, such as a user’s protective shield, an article of clothing, an article of body armor, a two-way radio, a conducted energy weapon, or a firearm.
  • the digital components can be mounted on the same article of portable equipment, or can be mounted on a different article of portable equipment.
  • the housing containing the digital processing components can be stored remotely (e.g., in a mobile command center) and the digital components can be in communication with the analog components via a wireless communication connection.
  • the policies and control of the apparatus, the chip, and the general system can be configured and controlled by a hierarchical set of domains, allowing different domains to grant configuration control to subordinate domains.
  • the policy and configuration control can be separated from the users of the device to ensure compliance, operational procedures, and, in general, simplicity of use.
  • the policy and configuration control can be performed and input by a local user.
  • the policy and configuration control can be performed and input using an AI system.
  • alerts can be visual (e.g., providing an outline of an object on a screen).
  • alerts can be audible (e.g., emitted by a device speaker or through an earpiece).
  • alerts can trigger or prompt a user for additional actions of a remote device (e.g., via a call to an API), or other user-defined actions.
  • an event that triggers a display or alert of an unwanted object can be combined with, and work with, other events using, for example, Boolean logic to form complex triggers.
  • Examples of triggers can include: more than two unidentified objects were found that were larger than a predetermined size.
  • Events can include but are not limited to: an object is identified via machine learning with a predetermined probability; a person is identified via facial recognition within a predetermined probability; an object of size greater than a predetermined size is found but not identified; an object of size smaller than a predetermined size is found but not identified; a search took place at a certain time of day, or within a certain range of times; whether a contactless Terry Frisk is required; and any other event that can trigger an action.
  • alerts and controls can include: logging an event locally or in the cloud; logging an event in the cloud in either real time or in a batch upload; alerting a user with local audio, vibration, light or local display; alerting a user via a headset, earbuds, glasses, or any other remote device; texting to one or more mobile phone numbers or sending an alert to a mobile app; emailing an alert to one or more email addresses, providing a suggestion to a user on what a next step is for them to take, in addition to the alert itself; communicating to other contactless search devices as a remotely generated event; and calling a custom remote API, which can prompt some other action such as unlocking or locking a door, turning lights on or off, or any other customer-defined action.
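The Boolean composition of events into complex triggers described above could be sketched as follows; the event fields, thresholds, and the particular AND combination are illustrative assumptions (the "more than two large unidentified objects" example mirrors the trigger mentioned earlier).

```python
# Toy event records; field names are assumptions for the sketch.
def large_unidentified(events, min_size_cm=15, min_count=3):
    """True if more than two unidentified objects exceed the size threshold."""
    hits = [e for e in events
            if e["type"] == "object"
            and not e["identified"]
            and e["size_cm"] > min_size_cm]
    return len(hits) >= min_count

def after_hours(events, start=22, end=6):
    """True if any scan event occurred in the late-night window."""
    return any(e["type"] == "scan" and (e["hour"] >= start or e["hour"] < end)
               for e in events)

def should_alert(events):
    # Complex trigger: (many large unknown objects) AND (scan after hours).
    return large_unidentified(events) and after_hours(events)
```

A trigger like `should_alert` could then fan out to any of the alert actions listed above: local audio, a text message, a cloud log entry, or a call to a remote API.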
  • the term computer program or computer code includes software, firmware, middleware, and any code in any computer language in any configuration, including any set of instructions or data intended for, and ultimately understandable by, a computing device.
  • the order of elements described in each figure is given by way of example only. In an embodiment, the order of elements performed can be changed in any practicable way.
  • the processes in FIGS. 2 - 4 can be implemented as software modules.
  • the processes in FIGS. 2 - 4 or any portion or combination thereof can be implemented as hardware modules.
  • the processes in FIGS. 2 - 7, or any portion or combination thereof, can be implemented as a combination of hardware modules, software modules, firmware modules, or any form of program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A target-scanning system for detecting concealed objects, comprising a set of analog imaging components of a portable radar system having both range resolution and lateral resolution sufficient to detect an object concealed on a person, the analog imaging components being contained in a first housing and in communication with digital processing components contained in a second housing, the digital processing components being configured to receive imaging information from the analog components for processing. Each housing is configured to be attached to an article of a user's equipment.
EP22811818.8A 2021-05-24 2022-05-01 Systèmes et procédés de détection non invasive d'objets interdits à l'aide de composants analogiques et numériques découplés Pending EP4348301A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163192540P 2021-05-24 2021-05-24
PCT/US2022/027178 WO2022250862A1 (fr) 2021-05-24 2022-05-01 Systèmes et procédés de détection non invasive d'objets interdits à l'aide de composants analogiques et numériques découplés

Publications (1)

Publication Number Publication Date
EP4348301A1 true EP4348301A1 (fr) 2024-04-10

Family

ID=90124911

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22811818.8A Pending EP4348301A1 (fr) 2021-05-24 2022-05-01 Systèmes et procédés de détection non invasive d'objets interdits à l'aide de composants analogiques et numériques découplés

Country Status (1)

Country Link
EP (1) EP4348301A1 (fr)

Similar Documents

Publication Publication Date Title
US20230408680A1 (en) Systems and Methods for Noninvasive Detection of Impermissible Objects
US10542222B2 (en) Multiview body camera system with environmental sensors and alert features
US20200389624A1 (en) Mobile based security system and method
US20230401946A1 (en) Methods and apparatus for a public area defense system
Luo et al. Edgebox: Live edge video analytics for near real-time event detection
US11982734B2 (en) Systems and methods for multi-unit collaboration for noninvasive detection of concealed impermissible objects
US11656334B2 (en) System and method for detecting object patterns using ultra-wideband (UWB) radar
US20220260705A1 (en) Systems and Methods for Noninvasive Detection of Impermissible Objects Using Decoupled Analog and Digital Components
US20140298701A1 (en) Gun System and Gun Control Management System Prohibit Gun Violence in Reactive and Proactive
WO2022250862A1 (fr) Systèmes et procédés de détection non invasive d'objets interdits à l'aide de composants analogiques et numériques découplés
EP4348301A1 (fr) Systèmes et procédés de détection non invasive d'objets interdits à l'aide de composants analogiques et numériques découplés
US20220187827A1 (en) Systems and Methods for Noninvasive Aerial Detection of Impermissible Objects
US12000924B2 (en) Systems and methods for noninvasive detection of impermissible objects
US20220214447A1 (en) Systems and Methods for Noninvasive Detection of Impermissible Objects
US20230204757A1 (en) Systems and Methods for Noninvasive Detection of Impermissible Objects Using Personal Equipment
EP4154033A1 (fr) Systèmes et procédés de détection non invasive d'objets interdits
WO2023172437A1 (fr) Systèmes et procédés de détection non invasive d'objets interdits à l'aide d'un équipement personnel
CN116685876A (zh) 用于对不允许的物体的非侵入式检测的系统和方法
WO2023113867A1 (fr) Systèmes et procédés d'utilisation d'une arme à impulsions en conjonction avec une détection non invasive d'objets interdits
AU2016216608B2 (en) A monitoring device and system
WO2020107006A1 (fr) Procédés et appareil pour un système de défense de zone publique
US20230325660A1 (en) Method for detection of an object
Alsaedi et al. Survy of Methods and Techniques for Metal Detection
Al-Room et al. Drone Forensics: A Case Study of Digital Forensic Investigations
Frascà et al. Technologies for IMINT and SIGINT

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231010

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR