WO2022254347A1 - Target monitoring and alert system and method - Google Patents

Target monitoring and alert system and method

Info

Publication number
WO2022254347A1
Authority
WO
WIPO (PCT)
Prior art keywords: data, target, filter, alert, operable
Application number: PCT/IB2022/055109
Other languages: English (en)
Inventors: Tsachi Rosenhouse, Alon Keren, Shay Moshe, Amit Dvash, Ido Klemer, Michael Orlovsky, Ronen Tur, Rotem Barda
Original Assignee: Vayyar Imaging Ltd.
Priority date: the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.
Application filed by Vayyar Imaging Ltd.
Priority to EP22815460.5A (published as EP4351416A1)
Priority to US18/037,127 (published as US20240021062A1)
Priority to CN202280039351.6A (published as CN117412707A)
Publication of WO2022254347A1
Priority to US18/387,473 (published as US20240085554A1)


Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
              • A61B 5/0507 Measuring using microwaves or terahertz waves
            • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
              • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
                • A61B 5/1116 Determining posture transitions
                • A61B 5/1117 Fall detection
                • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • G PHYSICS
      • G01 MEASURING; TESTING
        • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
          • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
            • G01S 13/66 Radar-tracking systems; Analogous systems
              • G01S 13/72 Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
                • G01S 13/723 Two-dimensional radar tracking by using numerical data
          • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
            • G01S 7/003 Transmission of data between radar, sonar or lidar systems and remote stations
            • G01S 7/02 Details of systems according to group G01S 13/00
              • G01S 7/41 Details using analysis of echo signal for target characterisation; Target signature; Target cross-section
                • G01S 7/415 Identification of targets based on measurements of movement associated with the target
                • G01S 7/417 Target characterisation involving the use of neural networks
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N 3/00 Computing arrangements based on biological models
            • G06N 3/02 Neural networks
              • G06N 3/04 Architecture, e.g. interconnection topology
                • G06N 3/044 Recurrent networks, e.g. Hopfield networks
                  • G06N 3/0442 Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
              • G06N 3/08 Learning methods
                • G06N 3/09 Supervised learning

Definitions

  • the disclosure herein relates to systems and methods for identification and tracking of targets in a monitored region.
  • the disclosure relates to the use of radar chips to identify subjects within a monitored region and to alert third parties if a fall event is detected.
  • Target monitoring systems are used in various scenarios. For example, fall detection is an important application, especially for senior citizens who live alone in homes and apartments and are isolated from people who could help them in an emergency. For such people, a fall, injury, or life-threatening medical condition can go undetected by family or support staff for an extended period of time.
  • Some wearable and handheld devices are available which comprise emergency call buttons; however, these need to be manually activated to alert others when assistance is needed. An elderly person who has fallen may not be in a position to activate the emergency button and call someone for help.
  • target monitoring systems may distinguish between individuals within a group or a crowd, which may be important for a variety of reasons. For example, when multiple subjects within a common target zone are being individually monitored, say for ongoing health tracking or for fall-risk, it is often required to identify each individual within the monitored region. There is therefore a need for a method enabling individuals to be distinguished within a crowd.
  • target monitoring may be required for security reasons.
  • People entering certain premises may require proper authenticated identification.
  • reliable identification of such people has been a major concern.
  • Other electronic methods for people identification are based on identification of various parameters such as biometric based fingerprinting, eye pupil matching, face recognition, voice matching, scanning images- photograph matching, and so on.
  • biometric fingerprinting requires the finger or thumb to be placed on the biometric machine
  • photograph matching requires the person's face or body to be in close proximity to the camera or capturing machine.
  • remote asynchronous identification through video involves registering a streamed recording, validating it through an agent, the client further showing an identity document which is analyzed by artificial intelligence, etc.
  • the existing conventional methods involve a series of steps which depend upon physical identity documents that are easy to forge.
  • none of the conventional technologies provide people identification from a distance without requiring any input from the subject person. Remote identification is very much required as such identification can be helpful in various applications like locating terrorists, missing persons, children, old people, and pets, identifying strangers, and so on.
  • a target monitoring and alert system comprising a radar unit, a processor unit and a communication module.
  • the radar unit may include at least one transmitter antenna connected to an oscillator and configured to transmit electromagnetic waves into a monitored region, and at least one receiver antenna configured to receive electromagnetic waves reflected by objects within the monitored region and operable to generate raw data.
  • the processor unit may include a moving body isolation processor, and the communication module configured and operable to communicate alerts to third parties.
  • the processor unit further comprises: a frame buffer memory unit for storing frame data; a data filter configured to receive the raw data, and operable to process the raw data to remove data relating to reflections from static objects thereby generating filtered data; a tracker module configured to receive the filtered data from the data filter and operable to process the filtered data to identify moving targets and to track the location of the moving targets over time thereby generating target data; and an alert-threshold generator operable to generate an alert-threshold.
  • a neural network may be configured to receive from the tracker module target data inputs selected from height profiles, signal-to-noise ratio and radial distance to object and operable to generate a fall likelihood score.
  • a fall identification module may be configured to receive the fall likelihood score from the neural network and to generate a fall alert if the likelihood score is above an alert-threshold value.
  • the processor unit further comprises a person identification module including a position characteristic extraction module and a motion characteristic extraction module. Accordingly, the processor unit may be operable to generate a probabilistic identification of a target by applying stored Artificial Intelligence (AI) algorithms to the position and motion characteristics of the target. The processor unit may be further operable to generate an identification profile of the person.
  • the alert-threshold generator may be configured to receive communication from a fall alert mitigation manager, which may be configured and operable to receive input from a telemetric system and to use a sensitivity map to generate the alert threshold value.
  • the sensitivity map may comprise a binary file having a stack of two-dimensional arrays, for example a stack of ten two-dimensional arrays each having 20 rows and 20 columns.
  • the data filter may include a temporal filter unit through which received data may be passed to produce filtered output; the temporal filter unit may be operable to select a frame capture rate, to collect raw data from a first frame, to wait for a time delay, to collect raw data from a second frame, and to subtract the first frame data from the second frame data.
  • the temporal filter comprises at least a moving target indication module, which may be operable to select a filter time constant, to apply an infinite impulse response filter during the filter time constant, to apply a low pass filter, and to remove background from the raw data.
  • the temporal filter may comprise at least an adaptive moving target indication module, which may be operable to select an initial filter time constant, to apply an infinite impulse response filter with the initial filter time constant, to apply a low pass filter, to subtract the result from the next frame, to detect changes in image data, and to update the filter time constant accordingly.
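By way of illustration only, a minimal sketch of such an adaptive moving target indication filter is given below in Python/NumPy, using a first-order IIR (exponential) low-pass as the background estimator. The class name, the adaptation rule and all constants are assumptions made for this sketch, not details taken from the disclosure.

```python
import numpy as np

class AdaptiveMTI:
    """Sketch of an adaptive moving-target-indication filter: a first-order
    IIR (exponential) low-pass estimates the static background, which is
    subtracted from each new frame; the filter time constant is then updated
    according to how quickly the image data is changing."""

    def __init__(self, time_constant: float = 10.0):
        self.time_constant = time_constant  # in frames (assumed units)
        self.background = None

    def process(self, frame: np.ndarray) -> np.ndarray:
        if self.background is None:
            self.background = frame.copy()
            return np.zeros_like(frame)
        alpha = 1.0 / self.time_constant        # IIR low-pass coefficient
        # Subtract the current background estimate from the incoming frame.
        filtered = frame - self.background
        # Update the background estimate with the IIR low-pass filter.
        self.background = (1.0 - alpha) * self.background + alpha * frame
        # Adapt the time constant: fast scene changes shorten it, a quiet
        # scene lengthens it (one possible rule among many).
        change = float(np.mean(np.abs(filtered)))
        self.time_constant = float(np.clip(
            self.time_constant * (1.0 - 0.1 * np.tanh(change - 1.0)), 2.0, 100.0))
        return filtered
```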
  • the temporal filter may comprise at least an extended moving target indication module, which may be operable to select a filter time constant, to apply an infinite impulse response filter during the filter time constant, to apply a low pass filter, to subtract a mean value of several previous frames from the current frame and to remove artifacts from the filtered image.
  • the temporal filter comprises at least a local adaptive moving target indication module, which may be operable to select an initial filter time constant, to apply an infinite impulse response filter with the initial filter time constant, to apply a low pass filter, to subtract the result from the next frame, to detect changes in image data, to segment the frame into subsets of voxels according to the local rate of change of image data, to set a local filter time constant for each subset of voxels as suits the local rate of change of image data, to apply the infinite impulse response filter to each subset of voxels during an associated local filter time constant, and to subtract local background from each subset of voxels in a next frame of image data.
  • the temporal filter may comprise at least a low motion signal-to-noise ratio enhancement module, which may be operable to apply a low signal-to-noise ratio temporal filter, to average energy values of the Moving Target Indication (MTI) images over several frames and to detect changes in the averaged data.
  • the temporal filter may include at least a motion filter bank.
  • the temporal filter includes at least an afterimage removal module.
  • the afterimage removal module is operable to capture a default background image, to set the default background image to be the value for the background, to set a background threshold, to capture raw data for a first frame, to subtract the background from the raw data to generate candidate filtered data, and to calculate a difference between the candidate filtered data and the last recorded frame image. If the difference is above the threshold, the module subtracts the default background from the raw data to generate new filtered data and records the new filtered data as the next frame image; if the difference is below the threshold, the module records the candidate filtered data as the next frame image. In either case, the module updates the background to the new frame image and captures raw data for the next frame.
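The two branches above may be read as one loop with a threshold test. The following Python/NumPy sketch follows the text as written (including the rule that the background is updated to the new frame image); variable names and the mean-absolute difference measure are assumptions.

```python
import numpy as np

class AfterimageRemover:
    """Sketch of the afterimage-removal logic: when background subtraction
    leaves a large residual relative to the last recorded frame image (an
    afterimage), fall back to subtracting the default background instead."""

    def __init__(self, default_background: np.ndarray, threshold: float):
        self.default_background = default_background.copy()
        self.background = default_background.copy()
        self.threshold = threshold
        self.last_frame = np.zeros_like(default_background)

    def process(self, raw: np.ndarray) -> np.ndarray:
        candidate = raw - self.background
        difference = float(np.mean(np.abs(candidate - self.last_frame)))
        if difference > self.threshold:
            # Likely afterimage: re-filter against the default background.
            new_filtered = raw - self.default_background
        else:
            new_filtered = candidate
        self.last_frame = new_filtered   # record as the next frame image
        self.background = new_filtered   # update background, per the text
        return new_filtered
```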
  • a method for monitoring targets within a monitored region, comprising: providing a radar unit comprising at least one transmitter antenna connected to an oscillator, and at least one receiver antenna; providing a processor unit including a moving body isolation processor; providing a communication module configured and operable to communicate alerts to third parties; the radar unit generating raw data by transmitting electromagnetic waves into the monitored region, and receiving electromagnetic waves reflected by objects within the monitored region; storing frame data in a frame buffer memory unit; generating filtered data by receiving raw data and removing data relating to reflections from static objects; generating target data by identifying moving targets in the filtered data and tracking the location of the moving targets over time; generating an alert-threshold; training a neural network to receive target data inputs selected from height profiles, signal-to-noise ratio and radial distance to object and to generate a fall likelihood score; and generating a fall alert if the likelihood score is above an alert-threshold value.
  • a system for remotely and non-intrusively performing identification of a person includes a radar-based person identification device, a processing unit, a database and a communicator.
  • the radar-based person identification device may include an array of transmitters and receivers which are configured to transmit a beam of electromagnetic radiations towards a monitored region and receive the electromagnetic waves reflected by objects within the monitored region, respectively.
  • the device may also include a pre-processing unit for analyzing and processing the received electromagnetic waves.
  • the pre-processing unit may include a plurality of characteristic extraction modules for the person(s) under identification.
  • the pre-processing unit may include modules such as a position characteristic extraction module and a motion characteristic extraction module.
  • the processing unit may generate a probabilistic identification of the person by applying the stored Artificial Intelligence (AI) algorithms to the position and motion characteristics of the person. The probabilistic identification may then be used to generate an identification profile of the person.
  • identification profiles of a number of persons may be stored in the database.
  • the communicator may then transmit the identification reports to the concerned departments through a communication network.
  • Fig. 1 is a schematic representation of a possible fall detection and alert system
  • Fig. 2 is a schematic block diagram indicating data flow within a fall detection system
  • Fig. 3 is a flow chart representing actions of a fall detection method
  • Fig. 4 is a flow chart representing possible actions for removing static objects from image data
  • Fig. 5 is a flow chart representing possible actions for generating and tracking targets within data collected from the monitored region
  • Fig. 6 is a flow chart representing possible actions for detecting fall events within the monitored region
  • Fig. 7A is an example of an unfiltered frame in spherical coordinates of data collected from the monitored region
  • Fig. 7B is an example of a filtered frame in spherical coordinates of data from which static objects have been removed;
  • Fig. 7C represents the filtered data in spherical coordinates indicating locations of local maxima
  • Fig. 7D represents the filtered data in spherical coordinates indicating the location of the strongest local maximum peak
  • Fig. 7E represents the filtered data transformed into Cartesian coordinates
  • Figs. 8A and 8B are images indicating the expected and measured locations of a tracked peak in vertical (x-z) and horizontal (x-y) image sections, respectively;
  • Figs. 9A-H show a series of frames tracking a target which briefly disappears from view before returning; and
  • Figs. 10A-H show a series of frames tracking a target which passes through an excluded region
  • Fig. 11 shows a possible three dimensional energy profile for a target divided into an upper region, a middle region and a lower region;
  • Fig. 12A shows a three dimensional energy profile characteristic of a standing target
  • Fig. 12B shows a three dimensional energy profile characteristic of a non-lying target
  • Fig. 12C shows a three dimensional energy profile characteristic of a fallen target
  • Fig. 12D shows a three dimensional energy profile characteristic of a persistent fallen target.
  • Fig. 13A is a schematic flowchart illustrating an exemplary method for populating a database with time dependent energy profiles according to an aspect of the invention
  • Fig. 13B is a schematic flowchart illustrating an exemplary method for anomaly detection and alert generation according to an aspect of the invention
  • Fig. 14 shows a set of standard energy profiles for a target area
  • Fig. 15 shows a set of time dependent energy profiles for target segments of a target area
  • Figs. 16A, 17A and 18A illustrate KL Divergence values over all time windows in case of normal behaviour in exemplary embodiments of the invention
  • Figs. 16B, 17B and 18B illustrate KL Divergence values over all time windows in case of actual falls in exemplary embodiments of the invention
  • Fig. 18C is a block diagram of a training system for generating a fall likelihood score using supervised learning
  • Fig. 18D is a graph indicating changes over time of false positive and false negative records
  • Fig. 19A is a block diagram schematically representing selected components of a fall alert generator;
  • Figs. 19B and 19C schematically indicate a sensitivity map which may be used by the fall validation module;
  • Figs. 19D, 19E, 19F and 19G are various examples of sensitivity maps
  • Fig. 20 is a graph illustrating how the height profile input might change over time during a possible fall event
  • Fig. 21 is a block diagram illustrating how various elements of the system may relate to each other;
  • Fig. 22 illustrates a schematic representation of a system for remote identification of people using radar- based person identification device
  • Fig. 23 illustrates a schematic representation of a box created around a target person for extracting the position characteristics according to an aspect of the invention
  • Fig. 24 illustrates different locations of persons for identification through the radar-based person identification device according to an aspect of the invention
  • Figs. 25A-25E illustrate different postures of the person(s) for identification through the radar-based person identification device according to an aspect of the invention
  • Fig. 26 illustrates different orientations of the person(s) for identification through the radar-based person identification device according to an aspect of the invention
  • Fig. 27 illustrates a flowchart showing a method for identifying person(s) through the radar-based person identification device according to an aspect of the invention
  • Fig. 28 is a schematic block diagram representing selected components of a possible moving body isolation system incorporated into a radar scanning system
  • Fig. 29A is a flowchart representing selected steps of a method for removing static objects from image data
  • Fig. 29B is a flowchart representing selected steps of a method for moving target indication filtering of image data
  • Fig. 30A is a flowchart representing selected steps of a method for extended moving target indication filtering of image data
  • Fig. 30B is a flowchart representing selected steps of a method for adaptive moving target indication filtering of image data
  • Fig. 30C is a flowchart representing selected steps of a method for segmented frame moving target indication filtering of image data
  • Fig. 31A is a flowchart representing selected steps of a method for low-motion target enhancement
  • Fig. 31B is a flowchart representing selected steps of a possible method for filtering image data including low-motion;
  • Figs. 32A-C present plots over time of magnitude and phase of the reconstructed signal at three indicated voxels within a target region;
  • Figs. 33A-C present plots of the first and second eigenvalues of the reconstructed signal at the three different voxels indicated in Figs 32A-C respectively;
  • Figs. 34A-C illustrate different signal features associated with the three voxel points indicated in Figs. 32A-C;
  • Fig. 35 is a flowchart representing selected steps of a method for removing afterimages generated by background removal.
  • aspects of the present disclosure relate to fall detection systems and methods.
  • the disclosure relates to the use of radar chips to scan a monitored region such as an enclosed room.
  • the data obtained by the scanning radar chip may be processed to identify targets within the monitored region.
  • the identified targets may be tracked and profiled to indicate their posture such that fall detection rules may be applied and fall events detected.
  • Certain image processing solutions are available which generate fall alerts using reflections in the target area from fallen objects.
  • these image processing solutions do not differentiate between the fall of the subject person and other objects present in the region.
  • the reflected energy from a toilet bowl containing water is similar to that of a fallen person. Consequently, false alerts are generated with the fall of objects present in the room.
  • Data obtained by the scanning radar chip may be processed to generate current energy profiles within the monitored region.
  • the current energy profiles may be compared with time dependent energy profiles to detect anomalies in the fall events and to filter fall alerts.
  • aspects of the present disclosure relate to systems and methods for isolating moving objects in image data.
  • the disclosure relates to filtering systems for distinguishing data pertaining to stationary and slow-moving objects within image data obtained by a radar chip scanning a monitored region.
  • the raw data obtained by the scanning radar chip may be passed to a moving body isolation processor which stores each frame of raw data in a buffer memory and applies a temporal filter to identify trackable objects moving within the monitored region.
  • the system may further enhance the signal to noise ratio of the data and distinguish noise from slowly oscillating targets.
  • Still further aspects of the present disclosure relate to systems and methods for remote identification of person(s) using a radar-based person identification device.
  • the disclosure relates to the use of radar chips for extracting a plurality of parameters and analyzing the parameters for generating the identification report.
  • the identification report may be sent to concerned authorities.
  • one or more tasks as described herein may be performed by a data processor, such as a computing platform or distributed computing system for executing a plurality of instructions.
  • the data processor includes or accesses a volatile memory for storing instructions, data or the like.
  • the data processor may access a non-volatile storage, for example, a magnetic hard disk, flash-drive, removable media or the like, for storing instructions and/or data.
  • the fall detection system 100 includes a radar unit 104, a processor unit 126 and a communication module 134.
  • the radar unit 104 includes an array of transmitters 106 and receivers 110.
  • the transmitter may include an oscillator 108 connected to at least one transmitter antenna TX or an array of transmitter antennas.
  • the transmitter may be configured to produce a beam of electromagnetic radiation, such as microwave radiation or the like, directed towards a monitored region 105 such as an enclosed room or the like.
  • the receiver may include at least one receiving antenna RX or an array of receiver antennas 110 configured and operable to receive electromagnetic waves reflected by objects 102 within the monitored region 105.
  • the processor unit, 126 which may include modules such as a data filter 123, a tracker module 125, a gait classification module 127 and a fall identification module 129, may be configured to receive data from the radar unit 104 and be operable to generate fall alerts based upon the received data.
  • a preprocessor 112 may be provided to process the raw data before transferring the data to the processor unit 126, as described herein.
  • the communication module 134 is configured and operable to communicate the fall alert to third parties 138.
  • the communication module 134 may be in communication with a computer network 136 such as the internet via which it may communicate alerts to third parties 138 for example via telephones, computers, wearable devices or the like.
  • the system may further include a radar-based passive gait speed monitor 127 for use in the subject monitoring station, as schematically represented.
  • the gait speed monitor 127 may be operable to generate a value for the gait speed of a subject passing along an extended target zone 105.
  • the gait speed monitor includes at least one radar scanning arrangement and a processor unit.
  • the radar scanning arrangement 104 is configured to monitor the movement of a subject 102 over an extended range.
  • the extended range 105 is of dimensions suitable for the measurement of speed of sustained gait along a path of say 4-8 meters. Thus, by way of example, it may be preferred to locate a scanning arrangement to cover movement in a target zone of say 5-6 meters squared.
  • the radar typically includes at least one array of radio frequency transmitter antennas and at least one array of radio frequency receiver antennas.
  • the radio frequency transmitter antennas are connected to an oscillator (radio frequency signal source) and are configured and operable to transmit electromagnetic waves towards the target region.
  • the radio frequency receiver antennas are configured to receive electromagnetic waves reflected back from objects within the target region.
  • the processor unit 126 which may include modules such as a data filter 123, a tracker module 125 and a gait classification module 127, may therefore be configured to receive data from the radar unit and be operable to process the target data by applying gait classification rules and further operable to calculate a gait speed of the subject.
  • Raw data is generated by the radar module 104 which typically includes amplitude values for energy reflected at specific angles and ranges.
  • the raw data 12 may be represented as images in spherical coordinates such as shown in Fig. 7A for example.
  • the preprocessor unit 112 may receive the raw data 12 from the radar module 104.
  • the preprocessor unit 112 includes a profile generator 114, a voxel selector 116 and an output 118.
  • the data filter 123 receives the raw data 12 directly from the radar module 104 or alternatively may receive pre-processed data 14 from the preprocessor unit 112.
  • the data filter 123 may include a temporal filter operable to process the raw data 12 in order to remove all data relating to reflections from static objects.
  • the filter 123 may thereby generate a filtered image 16 such as shown in Fig. 7B which includes only data pertaining to moving objects within the monitored region with background removed.
  • the data filter 123 may include a memory unit, and a microprocessor. Accordingly, the data filter 123 may store in the memory unit both a first set of raw data from a first frame and a second set of raw data from a second frame following a time interval. The microprocessor may be operable to subtract the first frame data from the second frame data thereby generating the filtered frame data. Other methods for filtering data will occur to those skilled in the art.
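A minimal Python/NumPy sketch of this two-frame subtraction is shown below; the array shapes, the (range, azimuth, elevation) grid and the example values are assumptions for illustration only.

```python
import numpy as np

def frame_subtraction_filter(first_frame: np.ndarray,
                             second_frame: np.ndarray) -> np.ndarray:
    """Subtract an earlier frame from a later one so that reflections from
    static objects (identical in both frames) cancel, leaving only data
    pertaining to moving targets."""
    return second_frame - first_frame

# Hypothetical usage: two frames of reflection amplitudes on an assumed
# (range, azimuth, elevation) grid captured one time delay apart.
rng = np.random.default_rng(0)
static_background = rng.normal(size=(64, 32, 32))
frame_1 = static_background + 0.01 * rng.normal(size=(64, 32, 32))
frame_2 = static_background + 0.01 * rng.normal(size=(64, 32, 32))
frame_2[30, 16, 10] += 5.0   # a moving target appears in the second frame

filtered = frame_subtraction_filter(frame_1, frame_2)
print(np.unravel_index(np.abs(filtered).argmax(), filtered.shape))  # (30, 16, 10)
```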
  • the filtered image data 16 may be transferred to a tracker module 125 operable to process the filtered image data 16 in order to identify moving targets within the data and to track the location of the identified moving targets over time thereby generating target data 24.
  • the tracker module 125 may include a detector 1252, an associator 1254 and a tracker 1256 and is operable to generate data 24 relating to targets within the monitored region.
  • the detector 1252 receives the filtered image data 16 from the temporal filter 123 and processes the filtered image data 16 to detect local maxima peaks 18 within its energy distribution.
  • Fig. 7C shows an example of a filtered data image 16 indicating locations of local maxima peaks.
  • the peaks data 18 may be transferred to the associator 1254.
  • the associator 1254 is operable to store the peak data 18 for each frame in a memory element and to associate each peak with a target object and further generating a single measurement location for each target.
  • Fig. 7D represents the filtered data indicating the energy distribution and the location of the measurement in spherical coordinates. Typically the spherical coordinates may be converted into Cartesian coordinates such as shown in Fig. 7E.
  • the tracker 125 may be configured to receive target data, or track data, from each frame and be operable to populate a target database, or track database with a location value and a speed value for each target or track in each frame, thereby generating tracking data which may be used to calculate predicted locations 22 for each track in each frame.
  • Figs. 8A and 8B are images indicating the expected and measured track locations in vertical (x-z) and horizontal (x-y) image sections, respectively;
  • the associator 1254 may be further operable to receive tracking data from a target tracker 1256. Accordingly when a measurement 20 coincides with the expected location of an existing track, the measurement may be associated with that existing target. Alternatively, where the location of the measurement does not coincide with any tracked target, the measurement may be associated with a new track.
  • Track data 24 may be transferred to a gait classification module 127 and/or a fall identification module 129 operable to process the target data 24 by applying fall detection rules and to generate fall alert outputs 26 where required.
  • the fall identification module 129 includes a posture detector and a fall detector.
  • the posture detector may be configured to store target data in a memory unit, to generate an energy profile for each target, and to apply posture selection rules thereby selecting a posture for each track.
  • the posture detector may be further operable to store a posture history for each target in the memory unit. The fall detector may then access the posture history from the memory unit and generate a fall alert if at least one track is identified as fallen.
  • the method may include: providing a radar unit 1302 such as described herein, providing at least one processor unit configured to receive raw data from the radar unit and operable to generate fall alerts based upon the received data, and providing a communication module configured and operated to communicate a fall alert to third parties.
  • providing the processor may include providing a temporal filter 1304, providing a tracker module 1306 and providing a fall identification module 1308 such as described above.
  • the method may further include: the radar scanning the target region 1310, for example by transmitting electromagnetic waves into a monitored region and receiving electromagnetic waves reflected from objects in the monitored region; transferring multiple frames of raw data to the processor unit 1312; removing static objects from the frames of raw data 1314; transferring filtered data to the tracker module 1316, identifying moving targets in filtered data 1318; transferring target data to the fall identification module 1320; tracking the moving targets over time; assigning posture to the targets 1322; storing a posture history in a memory unit 1324; applying fall detection rules 1326; and generating a fall alert 1330 if a fall is detected 1328.
  • a temporal filter may be applied to select a frame capture rate 1402, to collect raw data from a first frame 1404; to wait for a time delay, perhaps determined by frame capture rate 1406; to collect raw data from a second frame 1408; and to subtract first frame data from the second frame data 1410.
  • a filtered image may be produced from which static background is removed and in which only moving target data remain.
  • the method may include detecting local maxima within each frame of filtered data 1510 and associating each local maximum with a target object.
  • the step of identifying moving targets in filtered data may include: setting a peak detection threshold 1512; detecting local maxima within each frame of filtered data 1514; defining a stain region, or point reflection spread region, for each of the local maxima 1518; selecting peaks by selecting only local maxima having an amplitude above the peak detection threshold 1516 and which do not lie within the stain region of a larger local maximum 1520.
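The following Python/NumPy sketch illustrates this peak-selection step under stated assumptions: local maxima are found over a one-voxel neighbourhood, and the stain region is approximated by a fixed Chebyshev radius around each stronger peak.

```python
import numpy as np

def detect_peaks(frame: np.ndarray, threshold: float, stain_radius: int):
    """Find local maxima above a detection threshold and discard any maximum
    lying within the 'stain' (point reflection spread) region of a stronger
    one. Neighbourhood size and radius semantics are illustrative."""
    # Candidate local maxima above the threshold.
    candidates = []
    for idx in zip(*np.nonzero(frame > threshold)):
        neighbourhood = frame[tuple(slice(max(i - 1, 0), i + 2) for i in idx)]
        if frame[idx] >= neighbourhood.max():
            candidates.append(idx)
    candidates.sort(key=lambda i: frame[i], reverse=True)  # strongest first

    peaks = []
    for idx in candidates:
        inside_stain = any(
            all(abs(a - b) <= stain_radius for a, b in zip(idx, kept))
            for kept in peaks
        )
        if not inside_stain:   # keep only peaks outside every stain region
            peaks.append(idx)
    return peaks
```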
  • Peak data may be obtained from the detector 1532 and tracking data may be obtained from the tracker 1534. Accordingly, each selected peak may be associated with a target object 1536. Optionally multiple peaks may be associated with a common target 1538.
  • the peak may be associated with that existing target.
  • the location of the peak does not coincide with any tracked target the peak may be associated with a new target.
  • the moving targets may be tracked over time 1550 by recording in a tracking memory or database a location value for each target in each frame; recording a speed value for each target in each frame 1552; predicting an expected value for a target in each frame 1554; sampling the next values for each target 1556, sending tracking data to associator 1556 and comparing the expected value for each target with the measured value for each target.
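A toy constant-velocity version of this predict-and-associate loop is sketched below in Python; a real tracker would also model measurement uncertainty (e.g. with a Kalman filter), and the class names and gate size here are arbitrary assumptions.

```python
import numpy as np

class Track:
    """Minimal constant-velocity track: records a location value and a speed
    value per frame and predicts an expected location for the next frame."""

    def __init__(self, position):
        self.position = np.asarray(position, dtype=float)
        self.velocity = np.zeros_like(self.position)

    def predict(self) -> np.ndarray:
        # Expected location in the next frame from the recorded speed value.
        return self.position + self.velocity

    def update(self, measurement) -> None:
        measurement = np.asarray(measurement, dtype=float)
        self.velocity = measurement - self.position  # per-frame displacement
        self.position = measurement


def associate(tracks, measurements, gate: float = 0.5):
    """Assign each measurement to the nearest track whose predicted location
    lies within the gate; unmatched measurements start new tracks."""
    for m in measurements:
        distances = [np.linalg.norm(t.predict() - m) for t in tracks]
        if distances and min(distances) < gate:
            tracks[int(np.argmin(distances))].update(m)
        else:
            tracks.append(Track(m))
    return tracks
```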
  • Figs. 9A-H show a series of frames of filtered data.
  • the series of frames indicate a moving target within the monitored region which is tracked over time.
  • the tracked target is marked in each frame by a small circle indicating the target's tracked location.
  • in some frames, however, the target's location is not indicated.
  • Such a scenario may occur for example, when the moving object within the monitored region, which is represented by the target in the data, moves behind a stationary object.
  • the data filter would typically remove the stationary object from the frame, thereby rendering the moving object invisible in the filtered data.
  • the associated target is not removed from the tracking database. Rather the missing target is retained and its expected location is calculated for subsequent frames such that when the object peak returns to view such as in Fig. 9H, the peak is again associated with the original target.
  • Figs. 10A-H show a series of frames of filtered data.
  • the series of frames indicate a moving target which passes through an excluded region within the monitored region, which is marked by a dashed rectangle in each frame. It may be useful to exclude certain regions from the data when, for example, a persistently moving object interferes with the data.
  • a persistently moving object may be for example a swaying pendulum, a rippling curtain or the like.
  • a phase of assigning posture to the targets 1610 may include: obtaining target data 1612; generating an energy profile for each target 1614; applying posture selection rules 1616 and, additionally or alternatively, applying a machine learning algorithm such as a neural network 1617; selecting a current posture 1618; recording the current posture 1620; and saving the current posture in a posture history 1622.
  • a fall detection phase 1630 may include obtaining the posture history of all targets 1632; applying fall decision rules 1634 and providing an alert 1640 only if a fall is detected in one target 1636 and no other target has been assigned a standing posture 1638.
  • generating an energy profile for each target includes assigning a first value for amplitude of reflected energy from an upper region of the target; assigning a second value for amplitude of reflected energy from a middle region of the target; and assigning a third value for amplitude of reflected energy from a lower region of the target.
  • Characteristic energy profiles may be defined for various postures for example a fallen or lying posture may be identified when the third value for the amplitude is higher than both the first value and the second value such as illustrated in Figs. 12C and 12D. Such a posture may generate a fall alert.
  • a standing posture may be identified for example when the first value, second value and third values have similar amplitudes such as shown in Fig. 12A.
  • a posture may be simply classified as not lying where the third value for the amplitude is not higher than both the first value and the second value such as shown in Fig. 12B.
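Taken together, these rules suggest a simple classifier over the three region amplitudes, sketched below in Python; the similarity tolerance used for the standing posture is an assumption not stated in the text.

```python
def classify_posture(upper: float, middle: float, lower: float) -> str:
    """Sketch of the posture-selection rules: compare reflected-energy
    amplitudes from the upper, middle and lower regions of a target's
    three dimensional energy profile."""
    if lower > upper and lower > middle:
        return "lying"      # candidate fall posture (cf. Figs. 12C, 12D)
    amplitudes = (upper, middle, lower)
    if max(amplitudes) - min(amplitudes) < 0.2 * max(amplitudes):
        return "standing"   # roughly similar amplitudes (cf. Fig. 12A)
    return "not lying"      # default classification (cf. Fig. 12B)
```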
  • the system 100 may further be operable to detect anomalies so as to more accurately detect falls and to generate alerts.
  • the radar unit 104 also includes a pre-processor unit 112 which processes the data received from the receiver 110.
  • the pre-processor unit 112 includes a profile generator 114 configured to generate energy profiles for a target area.
  • the profile generator 114 generates a set of standard energy profiles 122 and time dependent energy profiles 124 for various segments of the target area. Where appropriate, such energy profiles 122 may be generated in advance and preloaded into the unit, as required.
  • the set of standard energy profiles 122 and time dependent energy profiles 124 are stored in the database 120.
  • the pre-processor unit 112 also includes a segment selector 116 configured to select a target segment of interest in the monitored region 102 by selecting radiations received within a given azimuth range (of the angles measured along the horizontal) at a given depth range measured by the time taken by reflections to arrive at the receiving antennas 110.
  • the profile generator 114 also generates a current energy profile for each target segment of the monitored region 102 selected by the segment selector 116.
  • An output unit 118 sends the standard energy profiles 122 and time dependent energy profiles 124 to the database 120 and the current energy profile of each target segment to the processing unit 126 for anomaly detection and filtering alerts.
  • the output unit 118 is also configured to send the raw data received by the receiver 110 to the processing unit 126.
  • the output unit 118 also sends the selected target segments of interest to the processing unit 126 for anomaly detection.
  • the processing unit 126 includes a fall detection module 128 which may be configured to receive data from the output unit 118 and operable to generate fall alerts based upon the fall detection rules.
  • the anomalous fall alerts are filtered by an anomaly detection module 130 which may be configured to receive the current energy profile for a selected target segment from the output unit 118 and the set of standard energy profiles 122 and time dependent energy profiles 124 from the database 120. For the selected target segment, the current energy profile is compared with the corresponding time dependent energy profile and anomalous fall alerts are filtered out.
  • An alert generator 132 then generates fall alerts and sends them to the communication devices (not shown) of the intended recipients.
  • the fall alerts may be communicated through a communication network to the recipients on their smartphones, computers, laptops, wearable devices like smart-watch, electronic bands, wearable collar, etc.
  • the communication networks include a Bluetooth network, a Wired LAN, a Wireless LAN, a WiFi Network, a Zigbee Network, a Z-Wave Network or an Ethernet Network.
  • the alert generator 132 may produce alerts in the form of a text message, an image, a short video message, vibration signals, a buzzer, a beeper, a bell, a bleeper, a chirper and combinations thereof.
  • the audio/vibration means provided above for generating alerts are exemplary in nature and should not limit the scope of the invention.
  • Fig. 13A illustrates an exemplary method for populating a database with time dependent energy profiles.
  • the time dependent energy profile for each section of the target area shows the relative likelihood of each of the set of energy profiles being selected at a given time of day.
  • the process starts at step 202 at which a set of standard energy profiles 122 are generated and stored in the database 120.
  • the set of standard energy profiles 122 characterize the expected energy distribution associated with a subject in different poses (standing, sitting, lying, walking, bending over, etc.).
  • a set of 32 standard energy profiles of an exemplary subject are shown in Fig. 14. These standard energy profiles are generated from a large sample of data collected over an extended period of time.
  • the target area is segmented into a number of target segments by the segment selector 116.
  • a learning period for collecting time dependent data is defined at step 206. In an exemplary embodiment, a learning period of 48 hours is defined with time intervals of 1 hour.
  • activity of each target segment is recorded. The activity is recorded through the reflections received from the target segments by the receiver 110 of the radar unit 104.
  • the profile generator 114 selects a closest match for the target segment from the set of standard energy profiles and generates time dependent energy profiles 124 for each segment at step 212.
  • the time dependent energy profiles 124 are stored in the database 120
  • at step 214 it is determined if all time intervals of the learning period have been completed. It is noted that the system may continue gathering profiles in an ongoing manner during operation even after the learning period is over. Where required, older data may be overwritten or purged. In this manner the previous 48 hours may always be divided into a number of time intervals, such as 24 or 12 time intervals as required.
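One way to realise this learning loop is sketched below in Python/NumPy: for every target segment and time interval, count how often each standard profile is the closest match and normalise the counts into a distribution. The record format and the uniform fallback for empty intervals are assumptions.

```python
import numpy as np

def learn_time_dependent_profiles(observations, n_profiles=32, n_intervals=48):
    """Build per-segment, per-interval likelihood distributions over the
    standard energy profiles. `observations` is assumed to be an iterable
    of (segment_id, interval_index, matched_profile_index) records."""
    counts = {}
    for segment, interval, profile in observations:
        hist = counts.setdefault(segment, np.zeros((n_intervals, n_profiles)))
        hist[interval, profile] += 1
    # Normalise rows into probability distributions (the time dependent
    # energy profiles); intervals with no observations stay uniform.
    profiles = {}
    for segment, hist in counts.items():
        totals = hist.sum(axis=1, keepdims=True)
        profiles[segment] = np.where(
            totals > 0, hist / np.maximum(totals, 1), 1.0 / n_profiles)
    return profiles
```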
  • Fig. 15 shows an exemplary set of time dependent energy profiles 124 for various target segments of a target area.
  • the term “Super Voxel” herein refers to a “target segment” of the target area, with ‘X’ and ‘Y’ coordinates defining the particular target segment.
  • Fig. 13B is a schematic flowchart illustrating an exemplary method for anomaly detection in fall alerts and alert generation.
  • a fall is detected in the target region 102 based on the fall detection rules
  • data corresponding to target region 102 is recorded by the receiver 110 of the radar unit 104.
  • a current energy profile is generated by the profile generator 114 and sent to the processing unit 126 by the output unit 118 at step 304.
  • the current energy profile is compared with the recorded time dependent energy profile 124 stored in the database 120. Based on the comparison, it is determined if an anomaly is detected in the fall detection at step 308.
  • an alert is generated and provided to the intended recipients through various means at step 310.
  • the fall alert is filtered out and the process repeats from step 304. The process completes at step 312.
  • a metric M_i is defined by the KL Divergence as: M_i = D_KL(P_D ‖ P_T) = Σ_x P_D(x) log(P_D(x) / P_T(x)), where P_T refers to the time dependent energy profile distribution of a target segment; and P_D refers to the current energy profile distribution of the target segment.
  • a threshold T is defined such that if M_i ≤ T there is no anomaly in the fall detection. Consequently, a fall alert is generated and sent to the intended recipients. Otherwise, if M_i > T, an anomaly is detected in the fall detection, the fall alert is filtered out and no alert is generated.
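A minimal Python/NumPy sketch of the metric and threshold rule follows; the direction of the divergence and the smoothing constant are assumptions made for this sketch.

```python
import numpy as np

def kl_divergence(p_current, p_time_dependent, eps: float = 1e-9) -> float:
    """Kullback-Leibler divergence between the current energy profile
    distribution and the recorded time dependent distribution for the
    same target segment."""
    p = np.asarray(p_current, dtype=float) + eps
    q = np.asarray(p_time_dependent, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def should_alert(p_current, p_time_dependent, threshold: float) -> bool:
    """Apply the threshold rule: at or below the threshold no anomaly is
    detected and a fall alert is passed through; above it, the alert is
    filtered out."""
    return kl_divergence(p_current, p_time_dependent) <= threshold
```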
  • an anomaly score may also be provided according to the confidence score based on the quality of information in the database and its diversity.
  • a filter mechanism may be provided to perform a decision function based upon parameters such as the anomaly score and the like to further select appropriate alert generation.
  • Figs. 16A, 17A and 18A illustrate KL Divergence values over all time windows in case of normal behavior in exemplary embodiments of the invention.
  • Figs. 16B, 17B and 18B illustrate KL Divergence values over all time windows in case of actual falls in exemplary embodiments of the invention.
  • circled points in Figs. 16A and 17A represent anomalies detected which do not correspond to actual falls. Such anomalies would not typically result in an alert being generated as they would not be accompanied by a fall detection event.
  • circled points in Figs.16B and 17B represent anomalies detected which correspond to actual falls. Such anomalies would typically be accompanied by a fall detection event and would therefore generate a fall alert.
  • Figs. 16A and 16B represent divergence values recorded before the learning period was completed.
  • Figs. 17A and 17B represent divergence values recorded after a learning period has been completed. Consequently, more events are recorded as anomalous in Fig. 16A than in 17A although both these represent normal behavior.
  • Fig. 18A shows KL divergence values where no actual falls occur.
  • Further features of the system include the capability to retain a long-term memory for rare events, such as the operation of a washing machine or the like, which may otherwise be considered anomalies if only a 48 hour slice of memory is considered.
  • the system may classify zones within the target regions based upon the time dependent profiles. For example a zone may be identified to be a bed, if, say, a lying posture is detected over a long time mainly during night hours, or a toilet if, say, sitting and/or standing profiles are detected for characteristic short periods and so on. Such a classification system may form a basis for advanced room learning.
  • the communication module 134 may further be configured to communicate with an event detection module, optionally via the computer network.
  • the event detection may include a machine learning system such as a neural network 140 operable to generate a fall likelihood score.
  • the neural network may be provided by the processor with inputs such as height profiles, signal-to-noise ratio and radial distance to target and the like, as well as combinations thereof.
  • a training system 400 for generating a fall likelihood score using supervised learning.
  • a training system 400 is presented by way of illustration and may be used during set up.
  • Various models may be used such as neural networks, non-linear models, network regression models, networks of sigmoid function neurons and the like.
  • a neural network is described herein; however, other models and training systems will occur to those skilled in the art.
  • a Long Short Term Memory recurrent neural network architecture may be particularly suitable for real time evaluation of fall events as it is relatively easy to implement if it is configured to monitor transitions between height profiles for example. It will of course be appreciated that other architectures such as CNN may be preferred, as appropriate.
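Purely as an illustration of such an architecture (not the disclosed network), a small LSTM that consumes per-frame feature vectors, say a 20-bin height profile plus signal-to-noise ratio and radial distance, and emits a fall likelihood score might look like this in PyTorch; all layer sizes are arbitrary assumptions.

```python
import torch
import torch.nn as nn

class FallScoreLSTM(nn.Module):
    """Illustrative LSTM that watches a sequence of per-frame feature
    vectors and emits a fall likelihood score in [0, 1]."""

    def __init__(self, n_features: int = 22, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, n_features); score from the last time step,
        # so the network can monitor transitions between height profiles.
        output, _ = self.lstm(frames)
        return torch.sigmoid(self.head(output[:, -1, :])).squeeze(-1)

model = FallScoreLSTM()
sequence = torch.randn(1, 50, 22)   # 50 frames of 22 features each
print(model(sequence))              # fall likelihood score
```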
  • the training system 400 includes a neural network 420, a real patient record 440 and an error generator 460.
  • Recorded events may be monitored during a training phase, for example within a test environment in which known fall events occur, such that the actual event status 442 is known, for example whether a fall has occurred or a subject is lying down, or the like.
  • the neural network generates a predicted event status 422.
  • the error generator 460 compares the actual event status 442 and the predicted event status 422, producing a cost function which is fed back to the neural network, which optimizes the various neuron parameters so as to minimize the cost function, possibly using iterative techniques or heuristic techniques.
  • a cost function may be generated by a controller summing the squares of the errors for each input, although other cost functions may be preferred as suit requirements.
  • Minimization algorithms may include, but are not limited to heuristic methods such as Memetic algorithms, Differential evolution, Evolutionary algorithms, Dynamic relaxation, Genetic algorithms, Hill climbing with random restart, Nelder-Mead simplicial heuristic: A popular heuristic for approximate minimization (without calling gradients), Particle swarm optimization, Gravitational search algorithm, Artificial bee colony optimization, Simulated annealing, Stochastic tunneling, Tabu search, Reactive Search Optimization (RSO) or the like.
  • minimization may include iterative methods such as Newton's method, Sequential quadratic programming, interior point methods, Coordinate descent methods, Conjugate gradient methods, Gradient descent, Subgradient methods, Bundle method of descent, Ellipsoid methods, Reduced gradient method, Quasi-Newton methods, Simultaneous perturbation stochastic approximation (SPSA) method for stochastic optimization, Interpolation methods and the like.
  • the recorded events provide real subject parameters 444 to the neural network, such that the neural network is optimized to produce a predicted event status 422 as close as possible to the actual event status 442 of the real patient record for any given set of subject parameters.
  • the neural network 420 is able to generate a fall likelihood score according to the monitored parameters such as height profile, signal-to-noise ratio, distance to the subject or the like. It is further noted that, where required, other input parameters may be provided, such as body volume, body mass, gait speed, breathing rate, heart rate, heart rate variability, activities of daily living, body temperature, blood pressure and the like, to suit requirements.
  • the fall likelihood score may be represented by a percentage value indicating the degree of confidence that a fall event has occurred.
  • Fig. 18D is a graph indicating how the rate of false positive and false negative status records (Loss Value) changes over time.
  • the decreasing rate indicates that the system is able to learn to successfully identify false events, both on the training data and during blind tests with validation data.
  • the Machine Learning event detection module may allow a single network to validate events in multiple situations, for example fall from standing, fall from wheelchair, a subject rising after a fall, a subject falling from bed, a subject getting out of bed and the like.
  • the fall alert generator includes an event detection module, an alert mitigator and a fall validation module.
  • the event detection module is configured to receive input from a radar-based monitor and to generate a fall likelihood score.
  • the alert mitigator is configured to receive input from a telemetric system and to generate an alert threshold using sensitivity maps representing the monitored region.
  • the alert threshold may represent a dynamic value for the minimum certainty required before an alert is generated.
  • the fall validation module is configured to compare the fall likelihood score from the event detection module with the alert threshold from the alert mitigator. If the percentage value of the fall likelihood score is higher than the alert threshold, then a fall alert may be generated, as sketched below.
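  • A minimal sketch of this comparison, assuming percentage-valued scores; the function name and signature are illustrative assumptions:

```python
def validate_fall(fall_likelihood: float, alert_threshold: float) -> bool:
    """Return True when a fall alert should be generated.

    Both values are percentages in [0, 100]; the dynamic threshold is
    supplied by the alert mitigator for the relevant map location.
    """
    return fall_likelihood > alert_threshold

# Example: an 85% likelihood against a 70% local threshold triggers an alert
assert validate_fall(85.0, 70.0)
```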
  • Each sensitivity map may be a binary file comprising a stack of two-dimensional arrays, for example a stack of ten 20 by 20 matrix layers.
  • each 20 by 20 matrix layer may bin X values by dividing the range from a minimum value X_MIN to a maximum value X_MAX into equal intervals X_INT, such that:
  • a position to region mapping function may provide a map index as:
  • MapIndX = (X_fall - X_min) / X_int
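  • A minimal sketch of this position-to-region mapping, assuming integer truncation and clamping to the 20-bin array bounds (both added assumptions):

```python
def map_index(x_fall: float, x_min: float, x_int: float, n_bins: int = 20) -> int:
    """MapIndX = (X_fall - X_min) / X_int, truncated and clamped to the map."""
    idx = int((x_fall - x_min) // x_int)
    return max(0, min(n_bins - 1, idx))

# Example: a 20-bin axis spanning 0 to 4 metres in 0.2 m intervals
bin_x = map_index(x_fall=1.35, x_min=0.0, x_int=0.2)  # -> 6
```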
  • Fall probability maps may allow the alert threshold to be adapted according to the position of the alert within the room as well as the historical data for that room.
  • various sensitivity maps are illustrated in:
  • Fig. 19D which indicates an example of a sensitivity map for a transition from lying to bending
  • Fig. 19E which indicates an example of a sensitivity map for a transition from lying to standing
  • Fig. 19F which indicates an example of a sensitivity map for no transitions following a fall
  • Fig. 19G which indicates an example of a sensitivity map for a transition from standing to lying characteristic of a fall.
  • Fig. 20 indicates a possible height profile input parameter which may be input into the event detection module.
  • the intensity of the reflected energy from each of 20 height strata is indicated for consecutive frames at a given coordinate.
  • initially, a lot of energy is reflected from height strata 9 to 20; during the subsequent frames most of the energy is reflected at height strata below 6.
  • Other parameters may be used as inputs such as signal to noise ratio of the frame as well as the radial distance to the reflected signal.
  • fall detection may be effected by preparing data for a fall classifier; such data is typically gathered by a radar-based monitor such as described herein.
  • the data may include height profiles of targets within the monitored region, as well as the distance of targets from the detector, which are input to a fall classifier which may determine a current posture for the subject. The current posture may be input to a posture decision function which provides a posture decision output confirming the posture status of the subject.
  • the posture status may be input into a posture transition function for determining whether a posture transition event has occurred. Posture transition events may serve as inputs to a Long Short Term Memory recurrent neural network for example.
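  • A minimal PyTorch sketch of an LSTM scoring a sequence of per-frame features such as those named above; the feature count (20 height strata plus SNR and radial distance), layer sizes and class name are assumptions for this sketch, not details from the disclosure:

```python
import torch
import torch.nn as nn

class FallEventLSTM(nn.Module):
    """Scores a sequence of per-frame features with a fall likelihood."""

    def __init__(self, n_features: int = 22, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)                         # (batch, frames, hidden)
        return torch.sigmoid(self.head(out[:, -1]))   # likelihood in [0, 1]

# Example: score one 50-frame sequence of 22 features
model = FallEventLSTM()
score = model(torch.randn(1, 50, 22))
```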
  • a human presence monitor may further mitigate fall alerts by detecting if a human is present at the time of a posture transition.
  • Human presence may be determined according to a sensitivity map provided by a machine learning module configured to characterize the monitored region. It is noted that this may require the addition of a further sensitivity layer where required.
  • the obtained inputs may be provided to a Fall Event Manager unit which further mitigates fall alert generation using the sensitivity map. Accordingly, a suspected fall may be validated or invalidated as appropriate.
  • Fig. 22 is a schematic representation of a system 2100 for remote identification of person(s).
  • the system 2100 includes a radar-based person identification device 2104, a processing unit 2118, a database 2120 and a communicator 2122.
  • the radar-based person identification device 2104 includes an array of transmitters 2106 and an array of receivers 2110.
  • the array of transmitters 2106 may include an oscillator 2108 connected to at least one transmitter antenna or an array of transmitter antennas. Accordingly, the transmitters 2106 may be configured to produce a beam of electromagnetic radiation, such as microwave radiation or the like, directed towards a monitored region 2102 such as an enclosed room or an open area.
  • the receiver 2110 may include an array of receiver antennas configured and operable to receive electromagnetic waves reflected by objects within the monitored region 2102.
  • the monitored region 2102 is shown to include two persons 2102a and 2102b standing in different postures. However, the monitored region 2102 may include a smaller area covering only one person or a larger area covering more people whose physical parameters are measured, without limiting the scope of the invention.
  • the person identification device 2104 monitors the persons 2102a and 2102b without any physical contact or attachments.
  • the person identification device 2104 may be appropriately positioned at a distance of a few feet from the monitored region 2102 to effectively monitor the persons 2102a and 2102b.
  • the person identification device 2104 may be positioned at any location of various premises. Premises may include, but are not limited to, a residential building, a lift area, an entrance of the premises, a school, a hospital, a guest visiting area, a reception, an office, a mall, and so on.
  • the person identification device 2104 can be located anywhere inside or outside the geographic boundary of the monitored region, such as on walls or ceilings of the premises.
  • the information received by the receiver 2110 of the person identification device 2104 includes the position, shape and motion parameters of the persons 2102a and 2102b.
  • the parameters which may be monitored may include, but are not limited to, height, weight, body volume, body shape, body structure, orientation, various postures of standing, sitting, lying down, style of walking, running, velocity, acceleration, etc.
  • the electromagnetic signals received by the receiver 2110 are sent to a pre-processing unit 2112 of the person identification device 2104.
  • the pre-processing unit 2112 comprises a position characteristic extraction module 2114 and a shape extraction module 2115 which extract characteristics of different positions of the persons 2102a and 2102b. Different persons can stand in different positions such as erect, slanting, free style, cross-armed, facing towards the person identification device 2104, facing away from the person identification device 2104, and so on, as shown in Fig. 26.
  • the position characteristic extraction module 2114 filters out the non-desired signals received from other objects (not shown) present in the monitored region 2102.
  • the position characteristic extraction module 2114 can also extract position characteristics of the persons considering their distance from the person identification device 2104.
  • the position characteristic extraction module 2114 extracts the characteristics of the position considering the angle of orientation of the person from the device 2104.
  • the position characteristic extraction module 2114 generates a boundary or a box 2200 around the target person as shown in Fig. 23.
  • the creation of a box around the person helps to gauge the physical features of the person, including the height and physical width of the person in a particular position of standing or sitting.
  • the box can be partitioned into various small-sized slices along the three axes to determine the physical characteristics of the person more accurately, as sketched below. The accuracy of determining the physical characteristics increases in proportion to the number of slices of the box.
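  • A minimal sketch of partitioning the bounding box into sub-blocks; the voxel-array representation and function name are assumptions for this sketch:

```python
import numpy as np

def slice_box(voxels: np.ndarray, slices_per_axis: int) -> list:
    """Split a 3-D bounding-box array into equal sub-blocks along each axis."""
    blocks = []
    for xs in np.array_split(voxels, slices_per_axis, axis=0):
        for ys in np.array_split(xs, slices_per_axis, axis=1):
            blocks.extend(np.array_split(ys, slices_per_axis, axis=2))
    return blocks  # slices_per_axis ** 3 sub-blocks; more slices, finer detail

# Example: a 20x20x20 box cut into 4x4x4 = 64 sub-blocks
blocks = slice_box(np.random.rand(20, 20, 20), slices_per_axis=4)
```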
  • the position characteristic extraction module 2114 generates 19 position profile features of the target person.
  • the pre-processing unit 2112 also comprises a motion characteristic extraction module 2116 which is configured to extract the motion characteristics of the persons 2102a and 2102b from the received electromagnetic signals.
  • the motion characteristics may include, but are not limited to, rate of acceleration and velocity, trajectory of the motion, whether the person's back is erect or bent while walking, whether the person looks forward while walking, and so on.
  • the motion characteristic extraction module 2116 generates 11 dynamic profile features of the target person.
  • characteristic movements which may not be directly related to position may also be detected and extracted, such as head movements, shaking hands, or, during locomotion, step size, step rate, symmetry of steps, limping, a left or right leading leg and the like, as well as combinations thereof.
  • Such movements may also provide data for the dynamic profile features of the target person.
  • the preprocessing unit 2112 may be further configured and operable to extract characteristic shape features of a target person.
  • Figures 25A-25E present for purely illustrative purposes various exemplary positions of the persons which can be identified using the person identification device 2104 in accordance with an embodiment of the invention.
  • the person identification device is configured to identify a person based on any type of posture, for example, sitting and talking in a chair (Fig. 25A), walking (Fig. 25B), sitting on a chair (Fig. 25C), lying down (Fig. 25D), sitting in different styles on a sofa (free style or cross-legged) (Fig. 25E), and so on. It will be appreciated that many other positions may also be adopted by persons.
  • the position, shape and motion characteristics of the persons 2102a and 2102b generated by the modules 2114, 2115 and 2116, respectively, are sent to the processing unit 2118.
  • the processing unit 2118 is configured to generate position and motion vectors along the three axes, as well as shape vectors, based on the extracted characteristics of the persons 2102a and 2102b.
  • the processing unit 2118 is also configured to generate physiological profiles of the persons based on parameters including, but not limited to, gender, weight, age, body shape, height, and so on.
  • the processing unit 2118 generates a probabilistic identification of the person by applying the stored Artificial Intelligence (AI) algorithms to the position and motion characteristics and the physiological parameters of the person. The probabilistic identification is then used to generate an identification profile of the person.
  • the processing unit 2118 is configured for manual or automated training through machine learning to enhance the stored probabilistic algorithms for person identification. It may use methods from neural networks, statistics, operations research and physics to find hidden insights in the received data of position and motion characteristics without being explicitly programmed for it.
  • the processing unit 2118 may be trained “on the fly” to build probabilistic models based on training data. The generated probabilistic models may be trained, tested and validated at regular intervals to improve system performance.
  • the physiological profiles, the extracted position, shape and motion characteristics and identification profiles of the various persons may be stored in the database 2120.
  • the collated profiles and identification reports of individual persons or a group thereof may be sent to third parties 2126a, 2126b and 2126c.
  • concerned authorities interested in identifying particular individuals may include, but are not limited to, a school, a police station, a municipality department, parents, a concerned government department or office, a server, a client device, and so on.
  • the profiles and identification reports are sent from the database 2120 through the communicator 2122 which transmits the information through a communication network 2124.
  • the communication network 2124 may include a Bluetooth network, a Wired LAN, a Wireless LAN, a WiFi Network, a Zigbee Network, a Z-Wave Network or an Ethernet Network.
  • the profiles and identification reports of a robber, a terrorist, a missing person, a child or a pet, if identified in any location, may be sent to the police station.
  • the profiles and identification reports may also be sent to a communication device of an owner of premises, such as a house when the owner is away, identifying and informing the owner of visitors to the house during his absence.
  • Another application of the system may be to track a person’s health condition from the way he moves around the house and the shape of his body, as well as from changing habits, such as using the bathroom more often or for longer times at an unusual time of day.
  • the ability to identify a person is important when there is more than one resident.
  • the use of low-resolution radar images maintains privacy since it is impossible to generate a high resolution image of the person, yet it enables identification.
  • Fig. 27 is a schematic flowchart illustrating an exemplary method for the personal identification device according to an aspect of the invention.
  • the process starts at step 2602 and electromagnetic waves (EM) are transmitted by the transmitter 2106 of the person identification device 2104 towards the monitored region 2102 at step 2604.
  • the EM waves reflected from the monitored region 2102 are received by the receiver 2110 at step 2606.
  • the received EM signals are transferred to the position characteristic extraction module 2114 and motion characteristic extraction module 2116 of the pre-processing unit 2112 at step 2608.
  • the position characteristic extraction module 2114 and the motion characteristic extraction module 2116 filter out the non-desired data and extract the shape, position and motion characteristics of the target person, respectively, at step 2612.
  • the extracted position and motion characteristics are transferred to the processing unit 2118 at step 2614.
  • the processing unit 2118 generates a probabilistic identification of the person by applying the stored Artificial Intelligence (AI) algorithms to the position, shape and motion characteristics and the generated physiological parameters of the person.
  • the probabilistic identification is used to generate an identification profile of the person at step 2616.
  • the physiological and identification profiles of the person are stored in the database 2120.
  • the physiological and identification profiles of the person may be sent to one or more third parties to inform them of the identified person(s). Additionally, or alternatively where appropriate, the identity of the identified person may be sent to a third party rather than the corresponding identification profile.
  • the process is completed at step 2622.
  • the fall detection system 3100 includes a radar unit 3104, a processor unit 3120 and a communication module 3130.
  • the radar unit 3104 includes an array of transmitters 3106 and receivers 3110.
  • the transmitter may include an oscillator 3108 connected to at least one transmitter antenna TX or an array of transmitter antennas 3106. Accordingly, the transmitter may be configured to produce a beam of electromagnetic radiation, such as microwave radiation or the like, directed towards a monitored region 3105 such as an enclosed room or the like.
  • the receiver may include at least one receiving antenna RX or an array of receiver antennas 3110 configured and operable to receive electromagnetic waves reflected by objects 3102 within the monitored region 3105.
  • the processor unit 3120 may include various modules such as a frame buffer memory unit 3122 and a temporal filter unit 3124.
  • the temporal filter unit may itself include various data filtering modules through which received data may be passed to produce a filtered output. Examples of data filtering modules include moving target indication (MTI) modules 3125a, adaptive MTI modules 3125b, local adaptive MTI modules 3125c, low motion signal-to-noise ratio enhancement modules 3125d, motion filter banks 3125e and phantom afterimage removal modules 3125f. Other filter modules may occur to those skilled in the art.
  • the communication module 3134 is configured and operable to communicate the output images to third parties 3138.
  • the communication module 3134 may be in communication with a computer network 3136 such as the internet via which it may communicate alerts to third parties 3138 for example via telephones, computers, wearable devices or the like.
  • Temporal filters may be used to distinguish objects of interest from background objects, as they may be used to highlight reflections from moving objects over reflections from stationary objects such as walls and furniture, or vibrating and swinging objects such as fans, washing machines, plants, curtains and the like. It is further noted that temporal filters may also be used to suppress other slowly changing phenomena such as systematic sensor noise and antenna cross-talk.
  • a temporal filter may be applied to select a frame capture rate 3202, to collect raw data from a first frame 3204; to wait for a time delay, perhaps determined by frame capture rate 3206; to collect raw data from a second frame 3208; and to subtract first frame data from the second frame data 3210.
  • a filtered image may be produced from which the static background is removed and only moving target data remain, as sketched below.
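  • A minimal sketch of the two-frame subtraction, assuming complex-valued reconstructed frames represented as numpy arrays:

```python
import numpy as np

def temporal_frame_filter(first_frame: np.ndarray,
                          second_frame: np.ndarray) -> np.ndarray:
    """Subtract first-frame data from second-frame data.

    Reflections from walls and furniture are identical in both frames and
    cancel; only moving-target data remain in the difference.
    """
    return second_frame - first_frame

# Example: two 16x16 complex frames captured one frame interval apart
diff = temporal_frame_filter(np.zeros((16, 16), complex),
                             np.ones((16, 16), complex))
```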
  • the temporal filter may be further improved by applying a Moving Target Indication (MTI) filter as illustrated in Fig 29B.
  • An MTI may be applied to data signals before they are transferred to the image reconstruction block or directly to the image data.
  • MTI may estimate background data, for example using an infinite impulse response (IIR) low-pass filter (LPF). This background data is subtracted from the image data to isolate reflections from moving objects. It is noted that such a process may be achieved by subtracting the mean value of several previous frames from the current frame.
  • the mean may be calculated by an IIR or an FIR low-pass filter such as the above described LPF implementation.
  • the MTI IIR filter time constant, or the duration over which the average is taken by the IIR response, is generally fixed to best suit requirements, either short to better fit dynamic targets or long to fit still or slow targets.
  • the MTI method 3220 may include steps such as selecting a filter time constant 3222, applying an IIR filter over the duration of the selected time constant 3224, applying a low pass filter 3226, and removing the background from the raw data 3228.
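  • A minimal sketch of this MTI scheme, assuming a single-pole IIR low-pass background estimate; `alpha` stands in for the filter time constant (closer to 1.0 corresponds to a longer averaging window):

```python
import numpy as np

class IirMtiFilter:
    """Subtracts an IIR low-pass background estimate from each frame."""

    def __init__(self, alpha: float = 0.95):
        self.alpha = alpha
        self.background = None

    def __call__(self, frame: np.ndarray) -> np.ndarray:
        if self.background is None:
            self.background = frame.astype(complex)
            return np.zeros_like(self.background)
        filtered = frame - self.background  # isolate moving targets
        # Update the running mean of previous frames
        self.background = self.alpha * self.background + (1 - self.alpha) * frame
        return filtered

# Example: stream frames through the filter
mti = IirMtiFilter(alpha=0.9)
outputs = [mti(f) for f in np.random.randn(5, 16, 16)]
```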
  • MTI may generate artifacts such as afterimages, or phantoms, when objects are suddenly removed from the background. For example, when a chair is moved, a person moves in their sleep, a wall is briefly occluded, or the like, subsequent background subtraction may cause such events to leave shadows in the image at the previously occupied location. Since the signals are complex-valued, it is not possible to distinguish between a real object and its negative shadow.
  • obscured stationary objects in the background may appear to be dynamic when they suddenly appear when uncovered by a moving object in the foreground.
  • slow changes of interest may be repressed, for example the reflections from people sitting or lying still may change little over time and thus their effects may be attenuated by background subtraction.
  • FIG. 30A illustrates an extended Moving Target Indication filter method 3240, including selecting a filter time constant 3242, applying an IIR filter over the duration of the selected time constant 3244, applying a low pass filter 3246, removing the background from the raw image by subtracting the mean value of several previous frames from the current frame 3248, and further removing artifacts, such as shadows and phantom afterimages, from the filtered image 3250.
  • the filter may further increase sensitivity to low-motion targets without false detection of static or even vibrating objects.
  • a method for artifact reduction may include an adaptive MTI unit operable to adjust the IIR filter time constant for the image data according to detected changes in the data 3260. Accordingly, a short time constant may be selected for large changes, thereby reducing dynamic artifacts. Longer time constants may be selected for small changes so as to increase sensitivity to low-motion targets, because each instantaneous image differs more from the average of many frames than from a few recent frames.
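  • A minimal sketch of the adaptive choice of time constant; the two-level thresholding scheme and the parameter values are assumptions for this sketch:

```python
import numpy as np

def adaptive_alpha(change: float, low: float = 0.8, high: float = 0.99,
                   change_threshold: float = 1.0) -> float:
    """Short time constant (small alpha) for large changes, long for small."""
    return low if change > change_threshold else high

# Example: drive the choice from the mean absolute raw-minus-background residue
residual = np.abs(np.random.randn(16, 16))
alpha = adaptive_alpha(float(residual.mean()))
```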
  • a method for segmented frame MTI is illustrated wherein a localized time constant may be selected for each subset of voxels 3280.
  • the method includes the steps of selecting an initial filter time constant 3282, applying an IIR filter over the duration of the initial filter time constant 3284, applying a low pass filter 3286 and subtracting the result from the next frame 3288. Changes in the image data are detected 3290 and the rate of those changes is determined for each subset of voxels.
  • the filter may further segment the frame into subsets of voxels according to the local rate of change of image data 3292.
  • a local filter time constant may be set for each subset of voxels as suits the local rate of change of image data 3294.
  • the IIR filter is applied to each subset of voxels over the duration of the associated local filter time constant 3296.
  • the local background may be subtracted from each subset of voxels in the next frame of image data 3298.
  • a subset may include only one voxel and the local time constant may be selected for only one voxel.
  • a time constant may be selected for each voxel separately or for each region of the image as required. Accordingly, the time constant may be optimized for multiple phenomena occurring simultaneously in the same set of image data.
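  • A minimal sketch of per-voxel time constants, assuming the same two-level scheme applied voxel-wise:

```python
import numpy as np

def per_voxel_alpha(change_rate: np.ndarray, low: float = 0.8,
                    high: float = 0.99, threshold: float = 1.0) -> np.ndarray:
    """Short time constant for fast-changing voxels, long for slow ones."""
    return np.where(change_rate > threshold, low, high)

# Example: per-voxel background update using the local coefficients
change_rate = np.abs(np.random.randn(16, 16))
alpha = per_voxel_alpha(change_rate)
background = np.zeros((16, 16))
frame = np.random.randn(16, 16)
background = alpha * background + (1 - alpha) * frame
```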
  • one method 3320 includes applying an MTI filter 3322.
  • the energy values, such as magnitude, sigmoid of magnitude, or other such energy function, of the Moving Target Indication (MTI) images may be averaged over several frames 3324 and changes may be detected in the averaged data 3326.
  • the signal and noise combined typically have a higher average value than the noise alone. Accordingly, the signal-to-noise ratio of the average is greater than one, which may enable the detection of low-motion and low SNR targets such as a breathing person lying still.
  • the improved signal-to-noise ratio may further enhance the signal reflected from stationary vibrating reflecting objects in the background as well as from foreground moving objects of interest. Accordingly, isolation functions may be applied to distinguish the micro-motion of vibrating objects, with amplitudes smaller than, say, one millimetre, from the low motion of a breathing person, at an amplitude of around one centimetre.
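  • A minimal sketch of averaging MTI image energy over several frames; using magnitude as the energy function and a ten-frame window are assumptions for this sketch:

```python
import numpy as np

def averaged_mti_energy(mti_frames: np.ndarray, window: int = 10) -> np.ndarray:
    """Moving average of per-voxel energy along the frame axis."""
    energy = np.abs(mti_frames)   # (frames, X, Y); sigmoid of magnitude also works
    kernel = np.ones(window) / window
    return np.apply_along_axis(
        lambda v: np.convolve(v, kernel, mode="valid"), 0, energy)

# Example: 100 complex MTI frames give 91 averaged energy maps
frames = np.random.randn(100, 8, 8) + 1j * np.random.randn(100, 8, 8)
avg = averaged_mti_energy(frames)
```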
  • a combination of filters, a motion feature filter bank 3340, may be applied to extract various temporal features.
  • the motion feature filter bank may distinguish between different motion types based on phase changes and statistics. Examples of methods of application of such filters may include applying a Mean(Abs(raw-image)) filter 3344, applying an Abs(Mean(raw-image)) filter 3346, and applying a real-imaginary parts covariance matrix eigenvalues filter 3348. It will be appreciated that the filters may be applied in any order or combination as required, as sketched below.
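  • A minimal sketch of the three filters for one voxel's complex time series; the function and key names are assumptions for this sketch:

```python
import numpy as np

def motion_features(voxel_series: np.ndarray) -> dict:
    """Filter-bank features distinguishing motion types for one voxel.

    Mean(Abs) responds to any reflected energy, Abs(Mean) only to a static
    phasor, and the real/imaginary covariance eigenvalues describe the shape
    of the phase trajectory (noise blob, breathing circle, swinging arc).
    """
    cov = np.cov(np.vstack([voxel_series.real, voxel_series.imag]))
    return {
        "mean_abs": float(np.mean(np.abs(voxel_series))),
        "abs_mean": float(np.abs(np.mean(voxel_series))),
        "cov_eigvals": np.sort(np.linalg.eigvalsh(cov)),
    }

# Example: a breathing-like voxel traces an arc in the complex plane
t = np.linspace(0, 2 * np.pi, 200)
features = motion_features(np.exp(1j * 0.5 * np.sin(t)))
```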
  • Figs. 32A-C present plots over time of magnitude and phase of the signal reconstructed at three indicated voxels within a target region.
  • the graphs of Figs. 33A-C present plots of the x eigenvalues and y eigenvalues of energy received from the three different voxels indicated in Figs 32A-C respectively.
  • the plot of Fig. 33A showing an apparent random phase distribution about a central point is typical of background noise.
  • the plot of Fig. 33B showing a generally circular phase distribution about a central point is typical of a breathing subject not otherwise moving, indeed this phase distribution was obtained from a voxel reflecting energy from a lying subject.
  • the plot of Fig. 33C showing a phase distribution around an arc with a large radius is typical of a slowly oscillating object, in this case a swinging lamp.
  • Figs. 34A-C illustrate still further plots associated with the three voxel points indicated in Figs. 32A-C.
  • a further method for removing artifacts from the temporally filtered image may be to reset the background data to a default value when a large change occurs.
  • the method may include capturing a default background image 3402, possibly during set-up, upon significant change of the target region, or at a regular interval such as daily, hourly or otherwise periodically as appropriate. This image is set to be the default value for the background 3404.
  • a background reset threshold is set 3406 which determines the largest change of data between frames that is to be considered reasonable for noise only.
  • the background is subtracted from the raw data 3410, but the resulting candidate filtered data is not necessarily recorded as the frame image.
  • the difference between the candidate filtered data and the last recorded frame image is calculated 3412 and compared to the threshold value 3414. It is noted that, where appropriate, the background may be reset for each voxel separately depending upon the raw-background difference as described in Fig. 30C.
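  • A minimal sketch of this background-reset scheme, assuming a scalar mean-absolute-difference change metric (the disclosure leaves the metric open):

```python
import numpy as np

class ResettingBackground:
    """Background subtraction that resets to a default on large changes."""

    def __init__(self, default_background: np.ndarray, reset_threshold: float):
        self.default = default_background.copy()
        self.background = default_background.copy()
        self.last_recorded = None
        self.threshold = reset_threshold

    def __call__(self, raw: np.ndarray) -> np.ndarray:
        candidate = raw - self.background
        if self.last_recorded is not None:
            change = float(np.mean(np.abs(candidate - self.last_recorded)))
            if change > self.threshold:          # larger than noise allows
                self.background = self.default.copy()
                candidate = raw - self.background
        self.last_recorded = candidate           # record the frame image
        return candidate

# Example: daily default background, threshold tuned to sensor noise
bg = ResettingBackground(np.zeros((16, 16)), reset_threshold=3.0)
frame_image = bg(np.random.randn(16, 16))
```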
  • composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.
  • a compound or “at least one compound” may include a plurality of compounds, including mixtures thereof.
  • a range such as from 1 to 6 should be considered to have specifically disclosed sub-ranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6 as well as non-integral intermediate values. This applies regardless of the breadth of the range.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Physiology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Target monitoring and alert systems are disclosed for identifying and tracking targets in radar data. Raw frame data is filtered to remove data relating to reflections from static objects. Moving targets are identified in the filtered data and their location is tracked over time, generating target data. Neural networks may process the target data, compute a fall likelihood score and generate a fall alert if this score is above an alert threshold. A person identification module may extract the position and motion characteristics of each target from the data and generate a probabilistic identification associating the target with a person.
PCT/IB2022/055109 2019-12-23 2022-06-01 Target monitoring and alert system and method WO2022254347A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP22815460.5A EP4351416A1 (fr) Target monitoring and alert system and method
US18/037,127 US20240021062A1 (en) 2019-12-23 2022-06-01 Target monitoring and alert system and method
CN202280039351.6A CN117412707A (zh) 2021-06-01 2022-06-01 目标监视和告警系统及方法
US18/387,473 US20240085554A1 (en) 2019-12-23 2023-11-07 System and method for generating mitigated fall alerts

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US202163195189P 2021-06-01 2021-06-01
US63/195,189 2021-06-01
US202163196240P 2021-06-03 2021-06-03
US63/196,240 2021-06-03
US202163210601P 2021-06-15 2021-06-15
US63/210,601 2021-06-15
US202163211828P 2021-06-17 2021-06-17
US63/211,828 2021-06-17

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US17/632,522 Continuation-In-Part US11660023B2 (en) 2019-12-23 2020-12-23 Fall detection systems and methods
PCT/IB2020/062383 Continuation-In-Part WO2021130690A1 (fr) 2019-12-23 2020-12-23 Systèmes et procédés de détection de chute

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US18/037,127 A-371-Of-International US20240021062A1 (en) 2019-12-23 2022-06-01 Target monitoring and alert system and method
US18/387,473 Continuation-In-Part US20240085554A1 (en) 2019-12-23 2023-11-07 System and method for generating mitigated fall alerts

Publications (1)

Publication Number Publication Date
WO2022254347A1 true WO2022254347A1 (fr) 2022-12-08

Family

ID=84322581

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/055109 WO2022254347A1 (fr) Target monitoring and alert system and method

Country Status (2)

Country Link
EP (1) EP4351416A1 (fr)
WO (1) WO2022254347A1 (fr)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070139517A1 (en) * 2005-12-16 2007-06-21 Jenkins Michael V Temporal Video Filtering
US20130083246A1 (en) * 2011-09-30 2013-04-04 Apple Inc. Scene adaptive temporal filtering
US20150309579A1 (en) * 2014-04-28 2015-10-29 Microsoft Corporation Low-latency gesture detection
US20160238737A1 (en) * 2015-02-13 2016-08-18 Delta Five, Llc Automated Insect Monitoring System
US20180189930A1 (en) * 2016-12-30 2018-07-05 Toshiba Medical Systems Corporation Apparatus and method for reducing artifacts in mri images
US20180335380A1 (en) * 2017-05-16 2018-11-22 Fluke Corporation Optical gas imaging systems and methods
US20200209378A1 (en) * 2018-12-31 2020-07-02 Celeno Communications (Israel) Ltd. Coherent Wi-Fi Radar Using Wireless Access Point
WO2021050966A1 (fr) * 2019-09-13 2021-03-18 Resmed Sensor Technologies Limited Systèmes et méthodes de détection d'un mouvement

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MISHRA ET AL.: "Human Motion Detection and Video Surveillance Using MATLAB", INTERNATIONAL JOURNAL OF SCIENTIFIC ENGINEERING AND RESEARCH (IJSER), vol. 3, no. 7, July 2015 (2015-07-01), XP093012244, Retrieved from the Internet <URL:https://tarjomefa.com/wp-content/uploads/2018/06/9189-English-TarjomeFa.pdf> [retrieved on 20220801] *

Also Published As

Publication number Publication date
EP4351416A1 (fr) 2024-04-17

Similar Documents

Publication Publication Date Title
AU2020411056B2 (en) Fall detection systems and methods
EP3367883B1 (fr) Surveillance des activités quotidiennes d&#39;une personne
Deep et al. A survey on anomalous behavior detection for elderly care using dense-sensing networks
Alvarez et al. Behavior analysis through multimodal sensing for care of Parkinson’s and Alzheimer’s patients
US9597016B2 (en) Activity analysis, fall detection and risk assessment systems and methods
JP2023171650A (ja) Systems and methods for identification of persons and/or identification and quantification of pain, fatigue, mood and intent with protection of privacy
US8884813B2 (en) Surveillance of stress conditions of persons using micro-impulse radar
Rastogi et al. A systematic review on machine learning for fall detection system
Fan et al. Robust unobtrusive fall detection using infrared array sensors
Yao et al. Fall detection system using millimeter-wave radar based on neural network and information fusion
Krupitzer et al. Beyond position-awareness—Extending a self-adaptive fall detection system
CN109255360A (zh) 一种目标分类方法、装置及系统
CN108171181A (zh) 一种适用于家居内的人体摔倒检测方法
US10098580B2 (en) Hypermotor activity detection system and method therefrom
US20240021062A1 (en) Target monitoring and alert system and method
US20240085554A1 (en) System and method for generating mitigated fall alerts
WO2022254347A1 (fr) Target monitoring and alert system and method
Pogorelc et al. Discovery of gait anomalies from motion sensor data
Liu et al. Human behavior sensing: challenges and approaches
Berrang-Ford et al. Sleeping sickness in southeastern Uganda: a spatio-temporal analysis of disease risk, 1970–2003
Wambura et al. Deep and confident image analysis for disease detection
CN117412707A (zh) 目标监视和告警系统及方法
Chen et al. Subtle Motion Detection Using Wi-Fi for Hand Rest Tremor in Parkinson's Disease
EP4099897A1 (fr) System, method and computer program product for remote measurement of vital signs
Phillips II Walk detection using pulse-Doppler radar

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22815460

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280039351.6

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2022815460

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022815460

Country of ref document: EP

Effective date: 20240102