CN115372963B - Fall-down behavior multi-level detection method and equipment based on millimeter wave radar signals - Google Patents


Info

Publication number: CN115372963B
Application number: CN202211299570.6A
Authority: CN (China)
Prior art keywords: target, data, height, fall, radar
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN115372963A
Inventors: 张闻宇, 王志, 陈兆希, 王泽涛, 贺飞翔, 丁玉国
Current assignee: Beijing Qinglei Technology Co ltd
Original assignee: Beijing Qinglei Technology Co ltd
Application filed by Beijing Qinglei Technology Co ltd

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88: Radar or analogous systems specially adapted for specific applications
    • G01S 13/886: Radar or analogous systems specially adapted for specific applications, for alarm systems
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02: Details of systems according to group G01S 13/00
    • G01S 7/41: Details using analysis of echo signal for target characterisation; Target signature; Target cross-section

Abstract

The invention provides a multi-level fall behavior detection method and device based on millimeter wave radar signals. The method comprises: obtaining multi-scale range profile data, range-Doppler map data, and angle data of the target relative to the radar from the millimeter wave radar signal; using the multi-scale range profile data to determine whether a target exists in the scene and whether it is moving, and to calculate the range of the target relative to the radar; determining the target's direction of motion from the range-Doppler map data; calculating the target's height above the ground from the range and angle data; and determining the fall type of the target from the motion direction data and the height data.

Description

Fall-down behavior multi-level detection method and equipment based on millimeter wave radar signals
Technical Field
The invention relates to the field of detecting abnormal human body actions, and in particular to a multi-level fall behavior detection method and device based on millimeter wave radar signals.
Background
In recent years, population aging has intensified and the number of elderly people living alone keeps growing. Declining bodily functions leave the elderly with slower reactions, slower movement, and reduced balance, which raises the probability of accidents such as falls. Falls are among the leading causes of death and accidental injury in the elderly and seriously threaten their safety. Timely and accurate detection of whether an elderly person has fallen, together with identification of the specific fall type, is therefore of great significance for providing targeted medical assistance, improving recovery rates, and safeguarding lives.
Patent document CN114442079A discloses a method for detecting a fall of a target object: range-Doppler information is obtained from millimeter wave radar signals and used to determine human body velocity; once a particular velocity is detected, respiratory frequency, body height, and an envelope graph are derived from the range-Doppler information; finally this information is combined to decide whether the target has fallen.
Such a scheme can detect conventional fall actions, but in practice falls are varied: some fall processes are relatively slow, and the body does not necessarily reach a high velocity; in other cases the person does not lie completely on the ground but keeps the upper body upright, similar to sitting on the floor. These fall actions are common, yet existing radar-based detection methods often cannot handle them and may miss falls or raise many false alarms.
Disclosure of Invention
In view of this, the present invention provides a method for detecting a fall behavior in multiple levels based on millimeter wave radar signals, including:
obtaining multi-scale range profile data, range-doppler plot data and angle data of the target relative to the radar based on the millimeter wave radar signal;
determining whether a target exists in the scene and whether the target moves or not by using the multi-scale range profile data, and calculating the range data of the target relative to the radar;
determining motion direction data of the target using the range-doppler plot data;
calculating height data of the target relative to the ground by using the distance data and the angle data;
determining a fall type of the object from the movement direction data and the height data.
Optionally, the method further comprises:
determining movement velocity data of the target by using the range-doppler plot data, wherein the movement velocity data is used for determining the falling type of the target;
determining the targeted fall type further comprises:
judging whether a suspected fast fall of the target has occurred, according to the movement speed data, the motion direction data, and the height data;
when a suspected fast fall is judged, continuing to monitor whether the target exists in the scene, and while the target exists, determining from the height data whether the target has actually fallen fast.
Optionally, determining whether the target is suspected to fall quickly according to the moving speed, the moving direction and the height data, further comprising:
judging whether the movement speed reaches a speed threshold, whether the motion direction data over several frames points away from the radar, and whether the height data over those frames is below a first height threshold;
and when the movement speed reaches the speed threshold, the motion direction data over several frames points away from the radar, and the height data over those frames is below the first height threshold, determining that a suspected fast fall of the target has occurred.
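The three-condition check above can be sketched as a simple rule. The threshold values, frame count, and direction labels below are illustrative assumptions, not values taken from the patent:

```python
def suspect_fast_fall(speed, directions, heights,
                      speed_thresh=2.0, height_thresh=0.5, n_frames=5):
    """Rule-based check for a suspected fast fall (illustrative sketch).

    directions: per-frame motion direction labels ('away', 'toward', 'stay')
    heights:    per-frame target heights above ground, in metres
    All numeric values here are illustrative assumptions.
    """
    if len(directions) < n_frames or len(heights) < n_frames:
        return False
    recent_dirs = list(directions)[-n_frames:]
    recent_hts = list(heights)[-n_frames:]
    return (speed >= speed_thresh
            and all(d == 'away' for d in recent_dirs)
            and all(h < height_thresh for h in recent_hts))
```

Only when all three conditions hold simultaneously is the suspected-fast-fall state entered; the subsequent confirmation stage then monitors presence and height over the following frames.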
Optionally, determining whether the target does fall quickly further comprises:
monitoring, over several frames after the suspected fast fall, whether the target-present state is maintained in the scene and whether the height data stays below a second height threshold;
and when the target-present state is maintained in the scene and the height data stays below the second height threshold, determining that the target has actually fallen fast.
Optionally, determining a fall type of the object further comprises:
judging whether the target has had a suspected common fall or a suspected slow fall, according to the motion direction data and the height data;
when a suspected common or slow fall is judged, continuing to monitor whether the target exists in the scene, and while the target exists, determining from the height data whether a common or slow fall has actually occurred.
Optionally, determining whether the target is suspected to have a common fall or a slow fall according to the movement direction data and the height data, further comprising:
judging whether the motion direction data over several frames points away from the radar and whether the height data over those frames is below the first height threshold, and determining the direction-keeping time during which motion away from the radar is maintained and the height-keeping time during which the height data stays below the first height threshold;
judging whether the direction-keeping time reaches a first time threshold and whether the height-keeping time reaches a second time threshold;
and when the direction-keeping time reaches the first time threshold and the height-keeping time reaches the second time threshold, judging that the target has had a suspected common or slow fall, wherein the first and second time thresholds used for judging a suspected common fall are smaller than those used for judging a suspected slow fall.
Optionally, determining whether the target does fall normally or slowly further comprises:
monitoring, over several frames after the suspected common or slow fall, whether the target-present state is maintained in the scene and whether the height data stays below the second height threshold;
and when the target-present state is continuously maintained in the detection scene and the height data stays below the second height threshold, judging that the target has actually had a common or slow fall.
Optionally, determining a fall type of the object further comprises:
judging whether the target has had a suspected semi-lying fall, according to the motion direction data, the height data, and the degree of matching between the multi-scale range profile data and a feature kernel;
when a suspected semi-lying fall is judged, continuing to monitor whether the target exists in the scene, and while the target exists, determining from the height data whether a semi-lying fall has actually occurred.
Optionally, the determining whether the suspected semi-lying fall occurs to the target further includes:
judging whether the motion direction data over several frames points away from the radar and whether the height data over those frames lies within a first preset height interval, calculating the degree of matching between the fine-scale high-resolution range profile and the feature kernel, and determining the direction-keeping time during which motion away from the radar is maintained and the height-keeping time during which the height data stays within the first preset height interval;
judging whether the moving direction keeping time reaches a first time threshold value, whether the height keeping time reaches a second time threshold value and whether the matching degree is greater than a matching threshold value;
and when the moving direction keeping time reaches a first time threshold, the height keeping time reaches a second time threshold, and the matching degree is greater than a matching threshold, judging that the target is suspected to fall in a half lying state.
Optionally, determining whether the target does indeed fall in semi-reclining further comprises:
monitoring, over several frames after the suspected semi-lying fall, whether the target-present state is maintained and whether the height data stays within a second preset height interval;
and when the target-present state is maintained and the height data stays within the second preset height interval, judging that the target has actually had a semi-lying fall.
Optionally, the multi-scale range image data comprises coarse range image data;
determining whether a target is present within the scene using the multi-scale range image data, further comprising:
determining whether a target exists in a scene at the last moment;
if a target existed in the scene at the previous moment, the target is judged to exist in the scene when the entropy of the coarse-scale high-resolution range profile data is above an entropy threshold and the divergence between the coarse-scale high-resolution range profile data and a template is above a divergence threshold; otherwise the target is judged not to exist. The template is the mean of the coarse-scale high-resolution range profile data extracted over several adjacent frames;
and if the target does not exist in the scene at the last moment, judging whether the target exists in the scene according to the motion direction data.
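A minimal sketch of the presence test for the case where a target existed at the previous moment. The patent does not name the divergence measure, so KL divergence is used here as one plausible choice, and both threshold values are illustrative assumptions:

```python
import numpy as np

def target_present(coarse_hrrp, template, entropy_thresh=0.5, div_thresh=0.1):
    """Presence test sketch: entropy of the coarse-scale HRRP plus its
    divergence from a running template (mean of recent frames).
    KL divergence and the threshold values are illustrative assumptions."""
    p = coarse_hrrp / coarse_hrrp.sum()
    q = template / template.sum()
    entropy = -np.sum(p * np.log(p + 1e-12))
    divergence = np.sum(p * np.log((p + 1e-12) / (q + 1e-12)))
    return bool(entropy > entropy_thresh and divergence > div_thresh)
```

The intuition is that a present, live target keeps the profile both spread out (high entropy) and different from the recent average (high divergence); an empty scene matches the template closely.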
Optionally, determining whether an object within the scene is moving using the multi-scale range image data, further comprising:
judging whether the correlation between the fine-scale high-resolution range profile and a template is above a correlation threshold, wherein the template is the mean of the fine-scale high-resolution range profile data extracted over several adjacent frames;
and if the correlation is above the correlation threshold, judging that a moving target exists; otherwise, judging that no moving target exists.
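The correlation test can be sketched as follows. A normalized Pearson-style correlation is used here; the exact correlation measure is not specified in the text and is an assumption:

```python
import numpy as np

def moving_target(fine_hrrp, template, corr_thresh=0.9):
    """Motion test sketch: correlation between the current fine-scale HRRP
    and the template (mean of recent frames). A normalized Pearson
    correlation and the threshold value are illustrative assumptions."""
    a = fine_hrrp - fine_hrrp.mean()
    b = template - template.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-12
    return bool(np.dot(a, b) / denom > corr_thresh)
```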
Optionally, the method further comprises:
judging whether the duration of the moving-target state is below a time threshold, and whether the difference between the mean correlation over that duration and the correlation threshold is below a first adjustment threshold;
increasing the correlation threshold when the duration is below the time threshold and the difference is below the first adjustment threshold;
in the no-moving-target state, computing the mean and maximum of the correlation, and judging whether the difference between the maximum and the mean is above a second adjustment threshold;
decreasing the correlation threshold when that difference is above the second adjustment threshold.
Optionally, determining motion direction data of the target by using the range-doppler map data, further comprises:
processing the range-Doppler map data with an order-statistics constant false alarm rate (OS-CFAR) detector in a sliding-window manner to determine a detection threshold, traversing the point cloud within the sliding window, and extracting the energy and coordinates of the points exceeding the detection threshold;
and determining the motion direction of the target as one of approaching the radar, moving away from the radar, or staying in place, according to the energy and coordinates of the point cloud.
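The OS-CFAR thresholding step can be illustrated in one dimension (for example, a single row of the range-Doppler map). The guard and training window sizes, the order k, and the scaling factor alpha below are illustrative assumptions:

```python
import numpy as np

def os_cfar_1d(x, guard=2, train=8, k=6, alpha=3.0):
    """1-D OS-CFAR sketch over a power profile (illustrative parameters).

    For each cell under test, the detection threshold is alpha times the
    k-th smallest value among the training cells on both sides of the
    guard cells. Returns indices and values of detected cells."""
    n = len(x)
    det_idx, det_val = [], []
    for i in range(n):
        left = x[max(0, i - guard - train): max(0, i - guard)]
        right = x[i + guard + 1: i + guard + 1 + train]
        ref = np.concatenate([left, right])
        if len(ref) < k:
            continue  # not enough reference cells near the array edge
        thresh = alpha * np.sort(ref)[k - 1]
        if x[i] > thresh:
            det_idx.append(i)
            det_val.append(x[i])
    return det_idx, det_val
```

Using an order statistic rather than the mean of the reference cells makes the threshold robust when a second strong reflector falls inside the training window.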
Optionally, determining the movement speed data of the target using the range-Doppler map data further comprises:
traversing the point cloud in the range-Doppler map data and extracting the position and energy value of the strongest point;
judging whether the energy value of the strongest point reaches a Doppler energy detection threshold;
when the energy value of the strongest point reaches the Doppler energy detection threshold and the Doppler represented by its position is positive, calculating the movement speed data of the target from the Doppler value of that point; otherwise, the movement speed of the target is judged to be zero.
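The peak-based speed extraction above can be sketched as follows. The mapping from Doppler bin to velocity (`doppler_velocities`) is assumed given by the radar parameters and is not taken from the patent:

```python
import numpy as np

def peak_velocity(rd_map, doppler_velocities, energy_thresh):
    """Sketch: speed from the strongest range-Doppler cell.

    rd_map: 2-D array (range x Doppler) of echo energy
    doppler_velocities: velocity (m/s) represented by each Doppler column;
    the linear bin-to-velocity mapping is an assumption.
    Returns the speed if the peak passes the energy threshold and its
    Doppler is positive, otherwise zero."""
    r, d = np.unravel_index(np.argmax(rd_map), rd_map.shape)
    if rd_map[r, d] >= energy_thresh and doppler_velocities[d] > 0:
        return float(doppler_velocities[d])
    return 0.0
```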
Optionally, calculating range data of the target relative to the radar, further comprises:
extracting a first distance of the target relative to the radar based on the fine-scale range profile data;
and calibrating the first distance by using the coarse-scale range profile data to obtain a second distance of the target relative to the radar.
Optionally, the angle data of the target relative to the radar includes an angle in a horizontal direction and an angle in a vertical direction;
calculating height data of the target relative to the ground, further comprising:
and calculating the height of the target relative to the ground from the height of the radar above the ground, the distance of the target relative to the radar, and the angles of the target relative to the radar in the horizontal and vertical directions.
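One plausible geometry for this calculation, assuming a ceiling-mounted, downward-facing radar whose two antenna pairs measure direction sines relative to boresight; the exact setup is defined by Figs. 8 and 9, so this sketch is an assumption rather than the patent's formula:

```python
import math

def target_height(radar_height, slant_range, az, el):
    """Height sketch for a ceiling-mounted, downward-facing radar.

    Assumes the two antenna pairs give direction sines u = sin(az) and
    v = sin(el) relative to boresight, so the vertical drop from the
    radar is slant_range * sqrt(1 - u^2 - v^2). This geometry is an
    assumption; the patent's figures define the actual setup."""
    u, v = math.sin(az), math.sin(el)
    w2 = max(0.0, 1.0 - u * u - v * v)
    return radar_height - slant_range * math.sqrt(w2)
```

For a target directly below the radar (both angles zero), the height is simply the radar height minus the slant range.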
Correspondingly, the invention also provides a multi-level fall behavior detection device based on millimeter wave radar signals, comprising a processor and a memory coupled to the processor; the memory stores instructions executable by the processor which, when executed, cause the processor to perform the above fall behavior multi-level detection method based on millimeter wave radar signals.
According to the fall behavior multi-level detection method and device provided by embodiments of the invention, multi-scale range profile data, range-Doppler map data, and angle data of the target relative to the radar are obtained from the millimeter wave radar echo signal; from these, whether a target exists in the scene and is moving, its direction of motion, and its height above the ground are determined; finally, the multi-dimensional data are used to decide whether a fall has occurred in the scene and to distinguish the fall type. The multi-dimensional data ensure high detection accuracy, and classifying fall behaviors by their specific characteristics facilitates targeted follow-up treatment.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic diagram of an antenna arrangement of a millimeter wave radar device according to an embodiment of the present invention;
fig. 2 is a flow chart of identifying a fast fall action in an embodiment of the invention;
fig. 3 is a flow chart of identifying a normal fall in an embodiment of the invention;
fig. 4 is a flow chart of the method for identifying a semi-reclining fall action according to the embodiment of the present invention;
FIG. 5 is a flow chart of a method of determining whether a target is present in a scene in an embodiment of the present invention;
FIG. 6 is a flow chart of a method of determining whether an object within a scene is moving in an embodiment of the present invention;
FIG. 7 is a flow chart of a method of determining a velocity of movement of a target in an embodiment of the present invention;
FIG. 8 is a top view of a position relationship between a target and a radar in an embodiment of the present invention;
FIG. 9 is a side view of the positional relationship between a target and a radar in an embodiment of the present invention;
fig. 10 is a schematic diagram of data flow used in a fall behavior multi-level detection method according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The embodiment of the invention provides a multi-level falling behavior detection method based on millimeter wave radar signals, which can be executed by radar equipment provided with a microprocessor and a memory, or electronic equipment such as a mobile terminal, a computer, a server and the like.
The millimeter wave radar device is installed at a suitable position on the ceiling of the usage scene and collects data from the scene. The radar transmits frequency modulated continuous wave (FMCW) signals; the FMCW signal within one period is a Chirp signal, the modulation waveform is a sawtooth, and the Chirp period is T_c. N consecutively transmitted Chirp signals form one frame, with frame period T_f. The echo signal received by the radar is mixed with the transmitted signal to obtain a beat (difference frequency) signal, which is then high-pass filtered, low-noise amplified, and ADC sampled to obtain a digitized echo signal.
In this method, multi-scale range profile data is acquired from the millimeter wave radar signal (the digitized echo signal above). The range profile here is a High Resolution Range Profile (HRRP): the vector sum of the complex sub-echoes of the target's scattering points, obtained by projecting a wideband radar signal onto the radar line of sight. It provides the distribution of the target's scattering points along the range direction, so the HRRP carries important structural features of the target.
The received radar echo signal is processed to extract the range-dimension complex signal of the scene. Performing moving-target extraction on this signal at different time scales yields high-resolution range profiles of multiple scales.
In one embodiment, the multi-scale range profile data includes a fine-scale and a coarse-scale high-resolution range profile. Specifically: each Chirp signal undergoes DC removal, windowing, and fast Fourier transform (FFT) to obtain a first range-dimension complex signal. Removing the slow-time DC from the N first range-dimension complex signals in each frame yields a second range-dimension complex signal; averaging the N first range-dimension complex signals in each frame along the slow-time dimension yields a third range-dimension complex signal. Taking the absolute value of the second range-dimension complex signal in each frame and averaging along the slow-time dimension gives the fine-scale high-resolution range profile. Taking the third range-dimension complex signals of K adjacent frames, then applying DC removal, absolute value, and slow-time averaging, gives the coarse-scale high-resolution range profile.
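The fine-scale and coarse-scale extraction steps above can be sketched as follows. The window choice (Hann) is an assumption, and `frames` is a hypothetical array of digitized beat signals:

```python
import numpy as np

def hrrp_scales(frames, K=8):
    """Sketch of the fine/coarse HRRP extraction described above.

    frames: complex array (n_frames, N_chirps, n_samples) of beat signals.
    Windowing uses a Hann window; the window choice and K are assumptions.
    Returns (fine, coarse): fine is per-frame, coarse uses K adjacent frames."""
    n_frames, N, n_samp = frames.shape
    win = np.hanning(n_samp)
    # per-chirp DC removal, windowing, range FFT -> first range-dim signal
    first = np.fft.fft((frames - frames.mean(axis=-1, keepdims=True)) * win, axis=-1)
    # slow-time DC removal within each frame -> second range-dim signal
    second = first - first.mean(axis=1, keepdims=True)
    # slow-time mean within each frame -> third range-dim signal
    third = first.mean(axis=1)
    # fine scale: |second| averaged over slow time, per frame
    fine = np.abs(second).mean(axis=1)
    # coarse scale: K adjacent third signals, DC-removed, |.|, averaged
    blk = third[-K:]
    coarse = np.abs(blk - blk.mean(axis=0, keepdims=True)).mean(axis=0)
    return fine, coarse
```

The slow-time DC removal in the fine branch suppresses static clutter within a frame, while the coarse branch suppresses clutter across K frames, emphasizing slower changes.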
The method also obtains range-Doppler map data from the millimeter wave radar signal (the digitized echo signal above). Specifically, the range-Doppler map is extracted by windowing the second range-dimension complex signal along the slow-time dimension, applying an FFT, and taking the absolute value.
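This window-FFT-magnitude step can be sketched for one frame; the Hann window and the `fftshift` used to center zero Doppler are assumptions of this sketch:

```python
import numpy as np

def range_doppler(second, win=None):
    """Sketch: range-Doppler map from one frame's second range-dimension
    complex signal (shape N_chirps x n_range_bins): window along slow
    time, FFT along slow time, absolute value. Hann window and the
    fftshift (to center zero Doppler) are illustrative choices."""
    N = second.shape[0]
    if win is None:
        win = np.hanning(N)
    rd = np.fft.fftshift(np.fft.fft(second * win[:, None], axis=0), axes=0)
    return np.abs(rd)  # rows: Doppler bins, columns: range bins
```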
It is also necessary in the present method to obtain angle data of the target with respect to the radar based on the millimeter wave radar signal (the digitized echo signal described above). The millimeter wave radar device comprises a plurality of receiving antennas which are respectively arranged in different directions, and the angles of targets in a scene relative to the radar can be calculated through echo signals obtained by the plurality of receiving antennas.
As shown in fig. 1, in one embodiment the millimeter wave radar device has three receiving antennas, denoted Rx1, Rx2 and Rx3, and one transmitting antenna, denoted Tx, arranged as shown in the figure. The centers of the Rx1 and Rx3 antennas are separated by a distance d in the vertical direction, and the centers of the Rx2 and Rx3 antennas are separated by the same distance d in the horizontal direction, where d is specified in the figure in terms of the carrier wavelength λ of the FMCW signal.
Conjugate multiplication is carried out on the second range-dimension complex signal extracted from the signal received by the Rx1 antenna and that extracted from the signal received by the Rx3 antenna, giving the first coherent signal in the vertical direction; conjugate multiplication of the second range-dimension complex signal from the Rx2 antenna with that from the Rx3 antenna gives the first coherent signal in the horizontal direction. Averaging the first coherent signal in the vertical direction along the slow-time dimension gives the second coherent signal in the vertical direction; averaging the first coherent signal in the horizontal direction along the slow-time dimension gives the second coherent signal in the horizontal direction. Taking the phase angle of the second coherent signal in each direction extracts the angle of the target relative to the radar in the vertical and horizontal directions. Specifically, the phase angle may be calculated as

    phi_i(r) = arctan( Im[s_i(r)] / Re[s_i(r)] )

where Im[s_i(r)] is the imaginary part of the second coherent signal at range gate r, Re[s_i(r)] is its real part, phi_i(r) is the angle of the target at range gate r relative to the radar, the subscript i indicates the direction (i = 13 for the vertical Rx1-Rx3 pair, i = 23 for the horizontal Rx2-Rx3 pair), and arctan is the arctangent function.
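A sketch of the phase-comparison step for one antenna pair, following the formula above; `np.arctan2` is used here to resolve the quadrant of the arctangent:

```python
import numpy as np

def pair_angle(sig_a, sig_b):
    """Phase-comparison sketch for one antenna pair.

    Conjugate-multiply the second range-dimension complex signals of the
    pair (first coherent signal), average along slow time (second
    coherent signal), then take the phase angle at each range gate as the
    arctangent of imaginary over real part.
    sig_a, sig_b: complex arrays of shape (N_chirps, n_range_bins)."""
    coherent = (sig_a * np.conj(sig_b)).mean(axis=0)
    return np.arctan2(coherent.imag, coherent.real)
```

Converting this phase to a geometric angle additionally requires the antenna spacing d and wavelength λ from fig. 1, which the sketch leaves out.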
As shown in fig. 10, the fall behavior multi-level detection method according to the embodiment of the present invention may be divided into four parts, where the first part is to collect millimeter wave radar signals;
the second part is data preprocessing, through which the multi-scale range profile data, the range-Doppler map data, and the angle data of the target relative to the radar are obtained. The multi-scale range profile data comprises the coarse-scale and fine-scale high-resolution range profiles; the angle data comprises the angles of the target relative to the radar in the horizontal and vertical directions;
and the third part is to obtain various information by utilizing the data obtained by preprocessing, wherein the various information comprises whether a target exists in the scene, whether a moving target exists, the distance of the target relative to the radar, the moving direction of the target, the height of the target relative to the ground and the like. Specifically, multi-scale range profile data may be utilized to determine whether targets are present and moving within the scene, and to calculate range data for the targets relative to the radar; determining motion direction data of the target using the range-doppler plot data; height data of the target relative to the ground is calculated using the distance data and the angle data. More specific calculation methods are various, and the present application may use a conventional calculation method, and may also use a special method, which is specifically described in the following embodiments;
the fourth part is to perform comprehensive judgment according to the information to determine whether the target falls, and can distinguish types of falls, such as ordinary falls, semi-lying falls, slow falls, fast falls, and the like, and the fall types are defined as follows:
common fall behavior: the fall process is relatively smooth; the whole process from standing to fallen completes within a moderate time (for example, 0.5 to 2 seconds), and after the fall the body lies entirely on the ground;
semi-lying fall behavior: the time from standing to fallen is unrestricted; after the fall, the lower half of the body is on the ground while the upper half remains upright and does not touch the ground, as if sitting on the floor;
slow fall behavior: the fall process is slow; the whole process from standing to fallen takes a long time (for example, 2 to 10 seconds), and after the fall the body lies entirely on the ground;
fast fall behavior: the fall process is very rapid; the whole process from standing to fallen completes in a short time (for example, within 0.5 seconds), and after the fall the body lies entirely on the ground.
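The duration-based categories can be summarized as a simple illustrative mapping. The semi-lying type is excluded because it is defined by posture rather than duration, and the boundary values come from the examples above, which are illustrations rather than fixed limits:

```python
def fall_type_from_duration(seconds):
    """Illustrative mapping from the stand-to-ground duration to the
    duration-based fall categories defined above. Boundary values are
    taken from the examples in the text and are not fixed limits."""
    if seconds <= 0.5:
        return "fast"
    if seconds <= 2.0:
        return "common"
    if seconds <= 10.0:
        return "slow"
    return "no fall"
```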
After the device executing the method detects a fall and its type, it can send prompt information, including the specific fall type, to a user terminal, so that the user (for example, family members or medical staff of the monitored person) learns the specific situation of the monitored person and targeted medical assistance can be arranged.
According to the fall behavior multi-level detection method provided by the embodiment of the invention, multi-scale range profile data, range-doppler plot data and angle data of a target relative to the radar are obtained from the echo signal of a millimeter wave radar; from these, the method further determines whether a target exists in the scene and whether it moves, the moving direction of the target, the height of the target relative to the ground, and the like; finally, whether the target falls in the scene is determined from the multi-dimensional data, and the fall type is distinguished. The multi-dimensional data ensures higher detection accuracy, and classifying the fall behavior according to its specific characteristics facilitates targeted follow-up treatment.
In a first embodiment, to identify a rapid fall action, range-doppler plot data will also be used to determine velocity data of the movement of the target. The process of determining whether a rapid fall has occurred in an object includes: judging whether the target is suspected to fall down quickly according to the movement speed data, the movement direction data and the height data; when the suspected rapid falling of the target is judged, whether the target exists in the scene is continuously monitored, and when the target exists in the scene, whether the target actually falls rapidly is determined according to the height data.
As shown in fig. 2, the process of identifying a fast fall action further comprises the steps of:
s11, judging whether the movement speed reaches a speed threshold value, whether the movement direction data in a plurality of frames are far away from the radar, and whether the height data in the plurality of frames are lower than a first height threshold value. Step S12 is executed when all of these three conditions are satisfied, otherwise, the detection is continued.
And S12, judging that the target is suspected to fall down quickly. Specifically, if the extracted target motion speed is greater than the speed threshold, and within a plurality of frames the motion direction of the target is away from the radar while the height of the target from the ground is less than the first height threshold, it is determined that a suspected rapid fall has occurred.
And S13, detecting whether the target keeps the existing state in a plurality of frames after the target is suspected to fall down quickly, and whether the height data is kept lower than a second height threshold value. Step S14 is executed when these two conditions are satisfied, otherwise, it returns to step S11.
And S14, judging that the target really falls down quickly. Specifically, after the suspected rapid falling behavior is detected, if the extracted human-presence information continuously indicates that a person is present in a plurality of subsequent frames, and the height of the target from the ground remains no greater than the second height threshold, it is considered that a rapid fall has indeed occurred in the usage scene. In the present embodiment, the first height threshold and the second height threshold may be equal, or may differ by a fixed margin (for example, one set slightly larger than the other).
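The two-stage logic of steps S11–S14 can be sketched as follows; the threshold values and frame counts are illustrative assumptions, not values given in the patent:

```python
# Sketch of the two-stage rapid-fall check (steps S11-S14).
# All thresholds below are illustrative assumptions.
SPEED_TH = 1.5       # m/s, speed threshold
H1 = 0.5             # m, first height threshold (suspicion stage)
H2 = 0.5             # m, second height threshold (confirmation stage)
CONFIRM_FRAMES = 10  # number of subsequent frames used to confirm

def suspect_rapid_fall(speed, directions, heights):
    """Steps S11/S12: speed above threshold, motion away from the radar in
    every frame, and height below H1 in every frame."""
    return (speed > SPEED_TH
            and all(d == "away" for d in directions)
            and all(h < H1 for h in heights))

def confirm_rapid_fall(presence, heights):
    """Steps S13/S14: target remains present and height stays <= H2
    for all subsequent frames."""
    return all(presence) and all(h <= H2 for h in heights)

# Usage: a fast drop followed by the target lying still near the floor.
suspected = suspect_rapid_fall(2.0, ["away"] * 5, [0.4, 0.3, 0.2, 0.1, 0.1])
confirmed = suspected and confirm_rapid_fall([True] * CONFIRM_FRAMES,
                                             [0.1] * CONFIRM_FRAMES)
```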
In a second embodiment, it can be identified whether a normal fall action has occurred. The process of determining whether an object has suffered an ordinary fall includes: judging whether the target is suspected to fall normally or not according to the movement direction data and the height data; when the suspected common falling of the target is judged, whether the target exists in the scene or not is continuously monitored, and when the target exists in the scene, whether the target really falls or not is determined according to the height data.
As shown in fig. 3, the process of identifying a normal fall action further comprises the steps of:
S21, judging whether the motion direction data in several frames indicate a direction away from the radar and whether the height data in those frames is lower than the first height threshold, and determining the holding time for which the motion direction remains away from the radar and the holding time for which the height of the target from the ground remains below the first height threshold. Step S22 is executed when these two conditions are satisfied, otherwise the detection is continued.
And S22, judging whether the moving direction keeping time reaches a first time threshold value or not and whether the height keeping time reaches a second time threshold value or not. Step S23 is executed when these two conditions are satisfied, otherwise the detection is continued.
And S23, judging that the target is suspected to fall normally. Specifically, if the duration for which the target motion direction is away from the radar is greater than the first time threshold, and within a plurality of frames the height of the target from the ground remains less than one height threshold while the maximum duration for which it continues below a further height threshold is greater than the second time threshold, then a suspected ordinary fall is considered to have occurred.
And S24, detecting whether the target keeps the existing state and whether the height data is kept lower than a second height threshold value in a plurality of frames after the target is suspected to fall normally or slowly. Step S25 is executed when these two conditions are satisfied, otherwise, it returns to step S21.
And S25, judging that the target really falls over normally. Specifically, after the suspected ordinary falling behavior is detected, if the extracted human-presence information continuously indicates that a person is present in a plurality of subsequent frames, and the height of the target from the ground remains no greater than the confirmation height threshold, it is considered that an ordinary fall has indeed occurred in the usage scene. In the present embodiment, the several height thresholds used in the detection and confirmation steps may be equal to one another, or may satisfy a fixed size ordering.
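The two duration conditions of steps S21–S23 amount to run-length measurements on the height series; a minimal sketch, with an assumed 10 Hz frame rate and illustrative thresholds:

```python
def max_run_below(heights, threshold):
    """Longest consecutive run of frames whose height is below threshold."""
    best = run = 0
    for h in heights:
        run = run + 1 if h < threshold else 0
        best = max(best, run)
    return best

# Illustrative check for step S22: with a 10 Hz frame rate, require the
# height to stay below 0.5 m for at least 1 s (10 frames). Both numbers
# are assumptions for the sketch, not values from the patent.
FRAME_RATE = 10
heights = [1.6, 1.2, 0.8] + [0.4] * 12 + [0.6]
long_enough = max_run_below(heights, 0.5) >= 1.0 * FRAME_RATE
```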
In a third embodiment, it is possible to identify whether a slow fall action has taken place. The process is similar to that of identifying an ordinary fall action, but the time threshold used for identifying a slow fall is longer than in the previous embodiment. Specifically, if the duration for which the target motion direction is away from the radar is detected to be greater than the time threshold, and within a plurality of frames the height of the target from the ground remains less than one height threshold while the maximum duration for which it continues below a further height threshold is greater than a second time threshold, then a suspected slow fall is considered to have occurred. The time thresholds in the present embodiment take values different from, and larger than, the corresponding time thresholds of the previous embodiment. After the suspected slow falling behavior is detected, if the extracted human-presence information continuously indicates that a person is present in a plurality of subsequent frames, and the height of the target from the ground remains no greater than the confirmation height threshold, it is considered that a slow fall has indeed occurred in the usage scene.
In a fourth embodiment, it can be identified whether a semi-lying fall action has occurred. The process of determining whether a semi-lying fall has occurred to a target includes: judging whether the target is suspected of a semi-lying fall according to the motion direction data, the height data, and the degree of matching between the multi-scale range profile data and the feature kernel; when a suspected semi-lying fall is judged, continuing to monitor whether the target exists in the scene, and when the target exists in the scene, determining whether the target really fell according to the height data.
As shown in fig. 4, the process of identifying a semi-reclining fall action further comprises the steps of:
S31, judging whether the motion direction data in several frames indicate a direction away from the radar and whether the height data in those frames lies within the first preset height interval, calculating the matching degree between the fine-scale high-resolution range profile and the feature kernel, and determining the holding time for which the motion direction remains away from the radar and the holding time for which the height data remains within the first preset height interval;
and judging whether the motion direction holding time reaches the first time threshold, whether the height holding time reaches the second time threshold, and whether the matching degree is greater than the matching threshold. If all conditions are met, step S32 is executed, otherwise the detection is continued.
And S32, judging that the target is suspected to fall in a semi-lying state. Specifically, if the duration for which the target motion direction is away from the radar is greater than the time threshold, and within a plurality of frames the height of the target from the ground remains less than one height threshold while the maximum duration for which it continues below a further height threshold is greater than a second time threshold, and, throughout the period in which the target moves away from the radar, the matching degrees of the fine-scale high-resolution range profile with feature kernel one and with feature kernel two are both higher than the feature threshold, then a suspected semi-lying fall is considered to have occurred. The matching degree of the fine-scale high-resolution range profile with a feature kernel is calculated as follows:

$$M = \sum_{r}\sum_{c} X_{t_1:t_2}(r,c)\cdot K(r,c)$$

wherein $M$ represents the matching degree of the fine-scale high-resolution range profile with the feature kernel, $X_{t_1:t_2}(r,c)$ represents row $r$ and column $c$ of the fine-scale high-resolution range profile over the period from time $t_1$ to time $t_2$, $K(r,c)$ represents row $r$ and column $c$ of the feature kernel, $\cdot$ denotes multiplication, and the two summations denote summation by row and by column respectively.
And S33, detecting whether the target keeps the existing state in a plurality of frames after the target is subjected to suspected half-lying fall, and whether the height data is kept in a second preset height interval. Step S34 is executed when the condition is satisfied, otherwise, it returns to step S31.
And S34, judging that the target really had a semi-lying fall. Specifically, after the suspected semi-lying falling behavior is detected, if the extracted human-presence information continuously indicates that a person is present in a plurality of subsequent frames, and the height of the target from the ground remains no greater than the corresponding height threshold, it is considered that a semi-lying fall has indeed occurred in the usage scene.
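The kernel-matching computation used in step S32 is an element-wise product of a range-profile patch with the feature kernel, summed over rows and columns; a pure-Python sketch (the 2×3 patch and kernel values are made-up examples):

```python
def matching_degree(patch, kernel):
    """Sum over rows and columns of the element-wise product of the
    fine-scale range-profile patch (time t1..t2) with the feature kernel."""
    return sum(x * k
               for row_x, row_k in zip(patch, kernel)
               for x, k in zip(row_x, row_k))

patch  = [[1.0, 2.0, 0.0],
          [0.0, 3.0, 1.0]]
kernel = [[0.5, 1.0, 0.0],
          [0.0, 1.0, 0.5]]
m = matching_degree(patch, kernel)  # 1*0.5 + 2*1 + 3*1 + 1*0.5 = 6.0
```

In the detection step, this value would be compared against the feature threshold for each of the two feature kernels.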
Different types of falling actions can be detected by the four embodiments above, which may be adopted singly, in part, or in full, so as to accurately distinguish the falling behaviors of the human body. Where multiple embodiments are used to detect multiple fall types simultaneously, the various thresholds used in detecting the different fall types may be the same or different. For example, when all four embodiments are used simultaneously, the height thresholds used in the semi-lying fall embodiment are set slightly higher than the corresponding height thresholds used in the other embodiments.
The application also provides a method for determining whether a target exists in the scene by using the multi-scale range profile data, and the method can be used for detecting whether a person exists in the scene when the fall type is detected, so that the detection result is more accurate. As shown in fig. 5, the method comprises the following steps:
And S41, determining whether a target existed in the scene at the previous moment. Since the detection operation is executed continuously, the method needs to consider the state at the previous moment; when the method is first executed, it is assumed that no target existed at the previous moment. Step S42 is executed when a target existed in the scene at the previous moment, otherwise step S45 is executed.
And S42, judging whether the entropy of the coarse-scale high-resolution range profile data is higher than an entropy threshold value or not, and whether the divergence between the coarse-scale high-resolution range profile data and the template is higher than a divergence threshold value or not, executing a step S43 when the conditions are met, and otherwise executing a step S44.
Specifically, a histogram of energy values of the coarse-scale high-resolution range profile over a plurality of adjacent frames is computed and normalized to obtain the probability distribution of the energy values, and the entropy of the energy distribution is calculated as:

$$E = -p^{T}\,\log(p + eps)$$

wherein $E$ is the entropy of the energy values of the coarse-scale high-resolution range profile, $p$ is the normalized column vector of the probability distribution of the energy values over the several adjacent frames, $\log$ is the logarithm operation, $eps$ is a very small positive number that prevents negative infinity from occurring in the logarithm operation, and $p^{T}$ denotes the transpose of $p$.
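The entropy of the energy distribution can be sketched directly from its definition; the histograms below are made-up examples:

```python
import math

EPS = 1e-12  # small positive number to avoid log(0)

def energy_entropy(histogram):
    """Entropy of a normalized energy-value histogram:
    E = -sum(p_i * log(p_i + eps))."""
    total = sum(histogram)
    p = [v / total for v in histogram]
    return -sum(pi * math.log(pi + EPS) for pi in p)

# A peaked distribution (echo energy concentrated in a few bins) has lower
# entropy than a flat one (energy spread over all bins).
flat   = energy_entropy([1, 1, 1, 1])      # log(4), about 1.386
peaked = energy_entropy([97, 1, 1, 1])     # well below log(4)
```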
In this embodiment, the mean of the coarse-scale high-resolution range profiles of several adjacent frames is taken as a template, and the divergence between the coarse-scale high-resolution range profile at the current time and the template is then computed, where the divergence $D$ of the energy values measures the deviation of the current coarse-scale high-resolution range profile $x$ from the template $m$ obtained as the mean over several adjacent frames.
And S43, judging that the target exists in the scene.
And S44, judging that no target exists in the scene.
If the entropy of the coarse-scale high-resolution range profile is less than the entropy threshold, or the divergence of the coarse-scale high-resolution range profile is less than the divergence threshold, the scene at the current moment is considered to be unoccupied; otherwise, the scene at the current moment is considered to be occupied.
And S45, judging whether a target exists in the scene according to the motion direction data. Specifically, this operation may determine whether motion toward the radar occurs in the scene: if motion toward the radar occurs, it is determined that a person is present in the scene at the current moment (i.e., step S43 is executed); otherwise, it is determined that no person is present at the current moment (i.e., step S44 is executed).
The application also provides a method for determining whether the target in the scene moves (or called as a method for determining whether the moving target exists in the scene) by using the multi-scale range profile data, and when the fall type is detected, the method of the embodiment can be used for detecting whether the moving target exists in the scene, so that the detection result is more accurate. As shown in fig. 6, the method comprises the following steps:
and S51, judging whether the correlation between the fine-scale high-resolution range profile and the template is higher than a correlation threshold value or not, wherein the template is the average value of the fine-scale high-resolution range profile data extracted from a plurality of adjacent frames. Step S52 is performed if the correlation is higher than the correlation threshold, otherwise step S53 is performed.
In this embodiment, the mean of the fine-scale high-resolution range profiles of several adjacent frames is taken as a template, and the correlation between the fine-scale high-resolution range profile at the current time and the template is calculated as follows:

$$\rho = \frac{m^{T} x}{\lVert m \rVert_{2}\,\lVert x \rVert_{2}}$$

wherein $\rho$ is the correlation between the fine-scale high-resolution range profile and the template, $m$ is the template computed from the fine-scale high-resolution range profiles of several adjacent frames, $x$ is the fine-scale high-resolution range profile at the current time, and $\lVert\cdot\rVert_{2}$ denotes the two-norm.
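A plausible reading of the correlation is the normalized inner product of the current fine-scale profile with the template; a sketch under that assumption (the profile values are made-up examples):

```python
import math

def correlation(template, current):
    """Assumed form: rho = (m . x) / (||m||_2 * ||x||_2), i.e. the
    normalized inner product of template m and current profile x."""
    dot = sum(m * x for m, x in zip(template, current))
    norm_m = math.sqrt(sum(m * m for m in template))
    norm_x = math.sqrt(sum(x * x for x in current))
    return dot / (norm_m * norm_x)

template = [0.2, 1.0, 0.3, 0.1]   # mean profile of several adjacent frames
frame_a = [0.2, 1.0, 0.3, 0.1]    # identical to the template
frame_b = [1.0, 0.1, 0.2, 0.9]    # energy shifted to other range bins
rho_a = correlation(template, frame_a)   # 1.0
rho_b = correlation(template, frame_b)   # well below 1.0
```

The detection logic in this embodiment then compares the computed value with the correlation threshold.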
And S52, judging that the moving target exists.
And S53, judging that no moving object exists.
In this embodiment, if the correlation calculated at the current moment is smaller than the correlation threshold, it is determined that no moving target exists at the current moment; if the correlation calculated at the current moment is greater than the correlation threshold, a moving target exists at the current moment.
Further, the method can also perform the operation of adaptively adjusting the correlation threshold. The method for adjusting the correlation threshold comprises the following steps:
s501, judging whether the duration time of the moving object is lower than a time threshold value or not, and whether the difference value of the average value of the correlation degrees in the duration time and the correlation degree threshold value is lower than a first adjusting threshold value or not. If the condition is satisfied, step S502 is executed, otherwise the correlation threshold is not changed.
And S502, increasing the correlation threshold. Specifically, the duration for which the moving target continuously exists is calculated; if this duration is less than the time threshold and the difference between the mean correlation over that duration and the correlation threshold is less than the first adjustment threshold, the correlation threshold is increased.
S503, counting the mean value and the maximum value of the correlation degree under the state without the moving target;
and S504, judging whether the difference value of the maximum value and the mean value is higher than a second adjusting threshold value. When the difference is higher than the second adjustment threshold, step S505 is executed, otherwise, the correlation threshold is not changed.
And S505, reducing the correlation threshold. The mean and maximum of the correlation in the no-moving-target state are counted; if the difference between the maximum and the mean is greater than the second adjustment threshold, the correlation threshold is reduced.
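The adaptive adjustment of steps S501–S505 can be sketched as two rules over recent correlation statistics; the step size and adjustment thresholds are illustrative assumptions:

```python
# Illustrative step size and adjustment thresholds (not patent values).
ADJ1 = 0.05   # first adjustment threshold
ADJ2 = 0.30   # second adjustment threshold
STEP = 0.02   # amount by which the correlation threshold is moved

def adjust_threshold(corr_th, run_len, run_len_th, run_mean,
                     idle_mean, idle_max):
    """S501/S502: if a 'moving target' run was short and its mean correlation
    barely cleared the threshold, raise the threshold (likely false alarm).
    S503/S504/S505: if, with no moving target, the peak correlation sits far
    above the idle mean, lower the threshold (likely missed detection)."""
    if run_len < run_len_th and (run_mean - corr_th) < ADJ1:
        return corr_th + STEP
    if (idle_max - idle_mean) > ADJ2:
        return corr_th - STEP
    return corr_th
```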
The application also provides a method for determining the movement direction data of the target by using the range-doppler diagram data, and the method of the embodiment can be adopted to detect the movement direction of the target in a scene when the fall type is detected, so that the detection result is more accurate.
And processing the range-doppler diagram by using constant false alarm detection to obtain a detection threshold. Dividing the range-Doppler image into three parts according to Doppler values, wherein the three parts respectively represent a part close to a radar, a part far away from the radar and a part staying in place, and respectively calculating the energy sum of point clouds of which the internal energy values in the three parts are higher than a threshold. And comparing the energy sum of the point clouds of which the energy values in different areas exceed the threshold value to determine the motion direction of the target.
In this embodiment, the range-doppler plot data is processed by using an order statistics type constant false alarm detector in a sliding window manner, a detection threshold is determined, a point cloud in the sliding window is traversed, and the energy and the coordinates of the point cloud greater than the detection threshold are extracted.
Further, the movement direction of the target is determined to be one of approaching the radar, moving away from the radar, or staying in place, according to the energy and the coordinates of the point cloud. For radar signals, the positive Doppler portion represents motion away from the radar and the negative Doppler portion represents motion toward the radar. In this embodiment, the point cloud whose Doppler coordinate is smaller than a first Doppler threshold is extracted and its energy sum calculated to obtain a first Doppler energy sum; the point cloud whose Doppler coordinate is larger than a second Doppler threshold is extracted and its energy sum calculated to obtain a second Doppler energy sum; and the point cloud whose Doppler coordinate lies between the two thresholds is extracted and its energy sum calculated to obtain a third Doppler energy sum.
A difference value between the first Doppler energy sum and the second Doppler energy sum is calculated. If the difference is smaller than a first motion direction discrimination threshold, the motion direction of the target is toward the radar; if the difference is larger than a second motion direction discrimination threshold, the motion direction of the target is away from the radar; if the difference lies between the first and second motion direction discrimination thresholds, or the third Doppler energy sum is greater than a third motion direction discrimination threshold, the motion direction of the target is staying in place.
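The direction decision above can be sketched as a comparison of the three energy sums; the discrimination thresholds are illustrative, and the reading of the difference as (away energy minus approach energy) is an assumption:

```python
# Illustrative discrimination thresholds for the sketch (not patent values).
T_CLOSE = -1.0   # first motion-direction discrimination threshold
T_AWAY  =  1.0   # second motion-direction discrimination threshold
T_STAY  =  5.0   # third motion-direction discrimination threshold

def motion_direction(e_approach, e_away, e_stay):
    """Compare the above-threshold point-cloud energy in the three Doppler
    regions (negative Doppler = approaching the radar). One plausible
    reading of the decision rule in the text."""
    diff = e_away - e_approach
    if e_stay > T_STAY or T_CLOSE < diff < T_AWAY:
        return "stay"
    return "away" if diff >= T_AWAY else "approach"
```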
The application also provides a method for determining the movement speed of the target by using the range-doppler diagram data, and the method of the embodiment can be used for detecting the movement speed of the target in a scene when a falling type is detected, so that the detection result is more accurate. As shown in fig. 7, the method includes the steps of:
s61, traversing point clouds in the range-Doppler image data, and extracting the position and the energy value of the point with the strongest energy;
s62, judging whether the energy value of the point with the strongest energy reaches the Doppler energy detection threshold value. And (4) when the energy value of the strongest point reaches the Doppler energy detection threshold value and the Doppler represented by the position of the strongest point is a positive number, executing the step S63, otherwise, executing the step S64.
And S63, calculating the movement speed data of the target based on the value of the Doppler frequency at the energy-strongest point. Specifically, the movement speed of the target and the Doppler value are in a linear conversion relationship; from the Doppler frequency value and the radar parameters, the movement speed of the target is calculated as:

$$v = \frac{f_{d}\, c}{2 f_{c}}$$

wherein $v$ is the movement speed of the target, $f_{d}$ is the extracted Doppler frequency value, $f_{c}$ is the carrier frequency of the radar, and $c$ is the speed of light.
And S64, judging that the target movement speed is zero.
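Using the standard radar Doppler relation $v = f_d c / (2 f_c)$, the conversion in step S63 is a one-liner (the 24 GHz carrier and 500 Hz Doppler shift are illustrative values):

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_speed(f_doppler, f_carrier):
    """Standard radar Doppler relation: v = f_d * c / (2 * f_c)."""
    return f_doppler * C / (2.0 * f_carrier)

# Illustrative: a 500 Hz Doppler shift at a 24 GHz carrier frequency.
v = doppler_speed(500.0, 24e9)   # about 3.12 m/s
```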
The application also provides a method for calculating the distance data of the target relative to the radar, and the method can be adopted to detect the distance between the target and the radar in the scene when the falling type is detected, so that the detection result is more accurate. In the embodiment, the distance of the target relative to the radar in the use scene is extracted according to the energy size and distribution in the high-resolution range images with different scales.
Under the condition that a moving target exists, the distance between the target and the radar is preliminarily extracted in the fine-scale high-resolution range profile according to the energy magnitude and the historical data extracted over several frames; when the moving direction of the target is in an in-situ staying state, the extraction result is calibrated using the coarse-scale high-resolution range profile. The specific implementation is as follows:
firstly, extracting a first target distance from a fine-scale high-resolution range profile, when a moving target exists, extracting distances represented by three points with strongest peak energy in the fine-scale high-resolution range profile, respectively calculating differences between the three extracted distances and first target distances extracted from a plurality of adjacent historical frames, and selecting the distance with the minimum difference as the first target distance at the current moment; when the moving object does not exist, the first object distance is set as the first object distance extracted from the last frame.
And calibrating the first target distance by using the coarse-scale range profile data to obtain a second target distance of the target relative to the radar. When the moving direction of the target is in an in-situ staying state, the distances represented by the five points with the strongest peak energy in the coarse-scale high-resolution range profile are extracted, and the one whose difference from the second target distances extracted in several historical frames is smallest is selected as the second target distance at the current moment; when the moving direction of the target is away from the radar or toward the radar, the second target distance is set to the extracted first target distance.
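The "closest to recent history" selection used in both extraction steps can be sketched as follows; comparing candidates against the mean of the historical estimates is an assumption (the patent compares against the distances of several adjacent historical frames), and the numeric values are made up:

```python
def pick_distance(candidates, history):
    """From candidate peak distances, choose the one whose difference from
    the mean of recent historical estimates is smallest; fall back to the
    last estimate when there are no candidates (no moving target)."""
    if not candidates:
        return history[-1]
    ref = sum(history) / len(history)
    return min(candidates, key=lambda d: abs(d - ref))

history = [3.1, 3.0, 3.2]                # recent target distances, m
candidates = [1.2, 3.05, 5.7]            # three strongest peaks this frame
d = pick_distance(candidates, history)   # -> 3.05
```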
The application further provides a method for calculating the height of the target relative to the ground, and this method can be used to detect the height of the target relative to the ground in the scene when detecting the fall type, so that the detection result is more accurate. The method calculates the height data of the target relative to the ground from the height of the radar above the ground, the distance data of the target relative to the radar, the angle of the target relative to the radar in the horizontal direction, and the angle of the target relative to the radar in the vertical direction.
Assuming that the position relationship between the target and the radar in the usage scenario is as shown in fig. 8 and 9, the height of the target from the ground can be calculated trigonometrically from the following quantities: the height $h$ of the target from the ground (the quantity to be found); the distance $R$ of the target relative to the radar (which may be the second target distance calculated by the method provided in the previous embodiment, or a distance calculated by another method); the angle of the target relative to the radar in the vertical direction at distance $R$; the angle of the target relative to the radar in the horizontal direction at distance $R$; and the height $H$ of the radar from the ground. The calculation involves the circumferential ratio $\pi$ (for converting angles to radians), the arcsine function and the tangent function; in the simplest configuration, with the vertical angle $\theta_{v}$ taken as the depression angle from the radar to the target, the relationship reduces to $h = H - R\sin\theta_{v}$.
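A concrete numerical instance, under the simplifying assumption that the measured vertical angle is the depression angle from the radar to the target (the full calculation also involves the horizontal angle):

```python
import math

def target_height(radar_height_m, range_m, vert_angle_deg):
    """Simplified height from ground: h = H - R * sin(theta_v), with
    theta_v the depression angle in degrees (pi/180 converts to radians)."""
    return radar_height_m - range_m * math.sin(math.radians(vert_angle_deg))

# Radar mounted at 2.0 m, target at 3.0 m range, 30 degrees below horizontal:
h = target_height(2.0, 3.0, 30.0)   # 2.0 - 3.0 * 0.5 = 0.5 m
```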
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be understood that the above examples are given only for clarity of illustration and are not intended to limit the embodiments; they are neither required nor an exhaustive enumeration of all embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description, and obvious variations or modifications derived therefrom are intended to fall within the scope of the invention.

Claims (16)

1. A multi-level fall behavior detection method based on millimeter wave radar signals is characterized by comprising the following steps:
acquiring multi-scale range profile data, range-doppler plot data and angle data of a target relative to a radar based on millimeter wave radar signals, wherein the multi-scale range profile data comprises a coarse-scale high-resolution range profile and a fine-scale high-resolution range profile;
determining, using the multi-scale range profile data, whether a target exists in the scene and whether the target moves, and calculating the range data of the target relative to the radar, wherein determining whether a target exists in the scene using the multi-scale range profile data further comprises determining whether a target existed in the scene at the previous moment; if a target existed in the scene at the previous moment, judging that a target exists in the scene when the entropy of the coarse-scale high-resolution range profile is higher than an entropy threshold and the divergence between the coarse-scale high-resolution range profile and a template is higher than a divergence threshold, and otherwise judging that no target exists in the scene, wherein the template is the average of the coarse-scale high-resolution range profiles extracted from a plurality of adjacent frames; if no target existed in the scene at the previous moment, judging whether a target exists in the scene according to the motion direction data; and determining whether a target in the scene is moving using the multi-scale range profile data further comprises: judging whether the correlation between the fine-scale high-resolution range profile and a template is higher than a correlation threshold, wherein the template is the average of the fine-scale high-resolution range profiles extracted from a plurality of adjacent frames; if the correlation is higher than the correlation threshold, judging that a moving target exists, and otherwise judging that no moving target exists;
determining motion direction data of the target using the range-doppler plot data;
calculating height data of the target relative to the ground by using the distance data and the angle data;
determining a fall type of the target from the movement direction data and the height data.
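The presence test of claim 1 (entropy of the coarse-scale range profile plus its divergence from a running template) can be sketched as follows. The Shannon-entropy and KL-divergence formulations, as well as all threshold values, are illustrative assumptions rather than details taken from the patent:

```python
import numpy as np

def presence_detected(profile, template, entropy_thresh=2.0, divergence_thresh=0.1):
    """Declare a target present when the entropy of the coarse-scale range
    profile exceeds a threshold AND its divergence from the template does too,
    mirroring the claim-1 logic (all parameters are assumed values)."""
    p = np.abs(profile) / np.sum(np.abs(profile))   # normalise profile to a distribution
    q = np.abs(template) / np.sum(np.abs(template))  # normalise template likewise
    entropy = -np.sum(p * np.log(p + 1e-12))         # Shannon entropy of the profile
    divergence = np.sum(p * np.log((p + 1e-12) / (q + 1e-12)))  # KL divergence vs template
    return bool(entropy > entropy_thresh and divergence > divergence_thresh)
```

A profile with a strong extra scatterer diverges from the static-scene template and trips both tests, while a profile identical to the template has zero divergence and is rejected.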
2. The method of claim 1, further comprising:
determining movement velocity data of the target by using the range-doppler plot data, wherein the movement velocity data is used for determining the falling type of the target;
determining the targeted fall type further comprises:
judging whether the target is suspected to fall down quickly or not according to the movement speed data, the movement direction data and the height data;
when a suspected rapid fall of the target is judged, continuing to monitor whether the target exists in the scene, and when the target exists in the scene, determining from the height data whether the target has actually fallen rapidly.
3. The method of claim 2, wherein determining whether the target has a suspected rapid fall according to the moving speed, the moving direction and the height data further comprises:
judging whether the movement speed reaches a speed threshold value, whether the movement direction data in a plurality of frames is a direction far away from a radar, and whether the height data in a plurality of frames is lower than a first height threshold value;
and when the movement speed reaches a speed threshold value, the movement direction data in a plurality of frames is a direction far away from the radar, and the height data in the plurality of frames is lower than a first height threshold value, determining that the target is suspected to fall down quickly.
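The three-condition test of claim 3 can be sketched as a simple predicate over per-frame data. The concrete threshold values, the frame count, and the direction labels are assumptions for illustration only:

```python
def suspected_fast_fall(speeds, directions, heights,
                        speed_thresh=2.0, height_thresh=0.5, n_frames=5):
    """Claim-3 sketch: a suspected rapid fall requires the movement speed to
    reach a threshold, the direction over the last n_frames to be away from
    the radar, and the height over those frames to stay below the first
    height threshold. `directions` holds 'away'/'toward'/'static' per frame;
    all numeric parameters are assumed values."""
    recent_dirs = directions[-n_frames:]
    recent_heights = heights[-n_frames:]
    return (max(speeds) >= speed_thresh
            and all(d == 'away' for d in recent_dirs)
            and all(h < height_thresh for h in recent_heights))
```

The same shape of predicate, with longer hold times and no speed condition, would serve the normal/slow-fall tests of claim 6.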
4. The method of claim 2, wherein determining whether the target has actually fallen rapidly further comprises:
monitoring, in a plurality of frames after the suspected rapid fall, whether the target-present state is maintained in the scene and whether the height data remains below a second height threshold;
and when the target-present state is maintained in the scene and the height data remains below the second height threshold, determining that the target has actually fallen rapidly.
5. The method of claim 1, wherein determining a fall type for a target further comprises:
judging whether the target is suspected to fall normally or slowly according to the movement direction data and the height data;
when a suspected normal fall or suspected slow fall of the target is judged, continuing to monitor whether the target exists in the scene, and when the target exists in the scene, determining from the height data whether the target has actually fallen normally or slowly.
6. The method of claim 5, wherein determining whether the target is suspected to have a normal fall or a slow fall according to the movement direction data and the height data further comprises:
judging whether the motion direction data in a plurality of frames are in a direction far away from the radar, whether the height data in the plurality of frames are lower than a first height threshold value, and determining the time for keeping the motion in the direction far away from the radar and the time for keeping the height of the height data lower than the first height threshold value;
judging whether the moving direction keeping time reaches a first time threshold value or not and whether the height keeping time reaches a second time threshold value or not;
and when the moving direction keeping time reaches a first time threshold and the height keeping time reaches a second time threshold, judging that the target is suspected to fall normally or slowly, wherein the first time threshold and the second time threshold for judging the suspected fall normally are smaller than the first time threshold and the second time threshold for judging the suspected slow fall.
7. The method of claim 5, wherein determining whether the target has actually fallen normally or slowly further comprises:
monitoring, in a plurality of frames after the suspected normal or slow fall, whether the target-present state is maintained in the scene and whether the height data remains below a second height threshold;
and when the target-present state is continuously maintained in the detection scene and the height data remains below the second height threshold, judging that the target has actually fallen normally or slowly.
8. The method of claim 1, wherein determining a fall type of the target further comprises:
judging whether a suspected semi-lying fall of the target has occurred according to the movement direction data, the height data, and the degree of matching between the multi-scale range profile data and a feature kernel;
when a suspected semi-lying fall of the target is judged, continuing to monitor whether the target exists in the scene, and when the target exists in the scene, determining from the height data whether the target has actually had a semi-lying fall.
9. The method of claim 8, wherein determining whether the target has had a suspected semi-lying fall further comprises:
judging whether the motion direction data in a plurality of frames are in a direction far away from the radar or not and whether the height data in the plurality of frames are in a first preset height interval or not, calculating the matching degree of the fine-scale high-resolution range profile and the characteristic kernel, and determining the time for keeping the motion in the direction far away from the radar and the height keeping time for keeping the height data in the first preset height interval;
judging whether the moving direction keeping time reaches a first time threshold value, whether the height keeping time reaches a second time threshold value and whether the matching degree is greater than a matching threshold value;
and when the moving direction keeping time reaches a first time threshold, the height keeping time reaches a second time threshold, and the matching degree is greater than a matching threshold, judging that a suspected semi-lying fall of the target has occurred.
10. The method of claim 8, wherein determining whether the target has actually had a semi-lying fall further comprises:
monitoring, in a plurality of frames after the suspected semi-lying fall, whether the target-present state is maintained and whether the height data remains within a second preset height interval;
and when the target-present state is maintained and the height data remains within the second preset height interval, judging that the target has actually had a semi-lying fall.
11. The method of claim 1, further comprising:
judging whether the duration of the moving target is lower than a time threshold or not, and whether the difference value of the average value of the correlation degrees in the duration and the correlation degree threshold is lower than a first adjusting threshold or not;
increasing the correlation threshold when the duration is below a time threshold and the difference is below an adjustment threshold;
counting the mean value and the maximum value of the correlation degree in the state of no moving target, and judging whether the difference value of the maximum value and the mean value is higher than a second adjustment threshold value;
decreasing the correlation threshold when the difference is above a second adjustment threshold.
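A minimal sketch of the claim-11 threshold adaptation follows; the step size and the two adjustment thresholds are assumed values chosen only to make the logic concrete:

```python
def adjust_correlation_threshold(corr_thresh, event_duration, event_corrs,
                                 idle_corrs, time_thresh=1.0,
                                 adj1=0.05, adj2=0.2, step=0.02):
    """Claim-11 sketch: raise the correlation threshold after short,
    barely-above-threshold motion events (likely false triggers), and lower
    it when idle-state correlation peaks sit far above their mean (likely
    missed motion). All numeric parameters are assumptions."""
    # Short bursts whose mean correlation barely clears the threshold are
    # likely false triggers, so raise the bar.
    if event_duration < time_thresh and \
            (sum(event_corrs) / len(event_corrs) - corr_thresh) < adj1:
        corr_thresh += step
    # If idle-state correlation peaks sit far above their mean, genuine
    # motion may be slipping under the threshold, so lower it.
    mean_idle = sum(idle_corrs) / len(idle_corrs)
    if max(idle_corrs) - mean_idle > adj2:
        corr_thresh -= step
    return corr_thresh
```

The two branches are independent, matching the claim: the first uses statistics gathered during a moving-target event, the second uses statistics gathered in the no-moving-target state.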
12. The method of any of claims 1-11, wherein determining direction of motion data for a target using the range-doppler plot data, further comprises:
processing the range-Doppler map data with an order-statistics constant false alarm rate (OS-CFAR) detector in a sliding-window manner, determining a detection threshold, traversing the point cloud within the sliding window, and extracting the energy and coordinates of points above the detection threshold;
and determining the motion direction of the target to be any one of close to the radar, far away from the radar and stay in place according to the energy and the coordinates of the point cloud.
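An order-statistic CFAR pass over a range-Doppler map, as claim 12 describes, might look like the following sketch; the guard/training window sizes, the ordered-statistic rank, and the threshold scaling factor are assumptions:

```python
import numpy as np

def os_cfar_detect(rd_map, guard=1, train=4, k=None, scale=1.5):
    """Order-statistic CFAR over a range-Doppler map: a sliding window
    estimates the noise level from the k-th ordered training cell, and cells
    above scale * estimate are returned as point-cloud detections
    (row, col, energy). Window sizes and `scale` are assumed values."""
    detections = []
    rows, cols = rd_map.shape
    half = guard + train
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            window = rd_map[r - half:r + half + 1, c - half:c + half + 1].copy()
            # Exclude the cell under test and its guard cells from training.
            window[train:train + 2 * guard + 1, train:train + 2 * guard + 1] = np.nan
            train_cells = np.sort(window[~np.isnan(window)])
            kth = k if k is not None else int(0.75 * len(train_cells))
            threshold = scale * train_cells[kth]   # OS-CFAR noise estimate
            if rd_map[r, c] > threshold:
                detections.append((r, c, float(rd_map[r, c])))
    return detections
```

Because the threshold comes from an ordered statistic rather than a mean, a strong neighbouring target inside the training region barely shifts the estimate, which is the usual motivation for OS-CFAR over cell-averaging CFAR.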
13. The method of any of claims 2-4, wherein determining velocity data of the movement of the target using the range-Doppler-map data, further comprises:
traversing point clouds in the range-Doppler image data, and extracting the position and the energy value of the point with the strongest energy;
judging whether the energy value of the strongest point of the energy reaches a Doppler energy detection threshold value;
when the energy value of the energy strongest point reaches the Doppler energy detection threshold value and the Doppler represented by the position of the energy strongest point is a positive number, calculating the movement speed data of the target based on the Doppler value of the energy strongest point; otherwise, the target movement speed is judged to be zero.
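The claim-13 speed extraction can be sketched as below. The Doppler-to-speed conversion assumes the standard relation v = f_d · λ / 2 with an assumed ~77 GHz mmWave carrier, and the detection threshold is illustrative:

```python
import numpy as np

def target_speed(rd_map, doppler_bins, energy_thresh=10.0):
    """Claim-13 sketch: take the strongest cell of the range-Doppler map;
    if its energy clears the detection threshold and its Doppler value is
    positive, convert that Doppler to a radial speed, otherwise report zero.
    The threshold and carrier wavelength are assumed values."""
    r, c = np.unravel_index(np.argmax(rd_map), rd_map.shape)
    energy = rd_map[r, c]
    doppler = doppler_bins[c]            # signed Doppler frequency of that column
    if energy >= energy_thresh and doppler > 0:
        wavelength = 0.0039              # ~77 GHz mmWave carrier (assumption)
        return doppler * wavelength / 2  # v = f_d * lambda / 2
    return 0.0
```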
14. The method of any one of claims 1-11, wherein calculating range data for the target relative to the radar further comprises:
extracting a first distance of the target relative to the radar based on the fine-scale range profile data;
and calibrating the first distance by using the coarse-scale range profile data to obtain a second distance of the target relative to the radar.
15. The method of any one of claims 1-11, wherein the angle data of the target relative to the radar includes an angle in a horizontal direction and an angle in a vertical direction;
calculating height data of the target relative to the ground, further comprising:
and calculating the height data of the target relative to the ground according to the height data of the radar from the ground, the distance data of the target relative to the radar, the angle of the target relative to the radar in the horizontal direction and the angle of the target relative to the radar in the vertical direction.
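The claim-15 height computation reduces to spherical-to-Cartesian geometry. The following sketch assumes the elevation angle is measured from the horizontal plane (negative when the target is below the radar), a sign convention the patent does not specify:

```python
import math

def target_height(radar_height, range_m, azimuth_deg, elevation_deg):
    """Claim-15 sketch: project the radar-to-target range onto the vertical
    axis via the elevation angle and add the radar's mounting height. The
    azimuth angle only affects the horizontal position, returned here for
    completeness. Angle conventions are assumptions."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    height = radar_height + range_m * math.sin(el)   # target height above ground
    horizontal = range_m * math.cos(el)              # ground-plane distance
    x = horizontal * math.sin(az)                    # lateral offset
    y = horizontal * math.cos(az)                    # forward distance
    return height, (x, y)
```

For a radar mounted 2 m high seeing a target 4 m away at 30 degrees below the horizontal, this places the target at ground level, which is the signature the fall-type tests above look for.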
16. A fall behavior multi-level detection device based on millimeter wave radar signals is characterized by comprising: a processor and a memory coupled to the processor; wherein the memory stores instructions executable by the processor to cause the processor to perform a method of fall behavior multi-level detection based on millimeter wave radar signals according to any one of claims 1 to 15.
CN202211299570.6A 2022-10-24 2022-10-24 Fall-down behavior multi-level detection method and equipment based on millimeter wave radar signals Active CN115372963B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211299570.6A CN115372963B (en) 2022-10-24 2022-10-24 Fall-down behavior multi-level detection method and equipment based on millimeter wave radar signals


Publications (2)

Publication Number Publication Date
CN115372963A CN115372963A (en) 2022-11-22
CN115372963B true CN115372963B (en) 2023-03-14

Family

ID=84074155

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211299570.6A Active CN115372963B (en) 2022-10-24 2022-10-24 Fall-down behavior multi-level detection method and equipment based on millimeter wave radar signals

Country Status (1)

Country Link
CN (1) CN115372963B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117281498B (en) * 2023-11-24 2024-02-20 北京清雷科技有限公司 Health risk early warning method and equipment based on millimeter wave radar

Citations (6)

Publication number Priority date Publication date Assignee Title
CN108968970A (en) * 2018-05-24 2018-12-11 厦门精益远达智能科技有限公司 A kind of method, apparatus and radar system that Doppler's millimetre-wave radar detection human body is fallen
CN113378682A (en) * 2021-06-03 2021-09-10 山东省科学院自动化研究所 Millimeter wave radar fall detection method and system based on improved clustering algorithm
CN114296076A (en) * 2022-01-04 2022-04-08 长沙莫之比智能科技有限公司 Indoor fall detection method and device based on millimeter wave radar
CN114442079A (en) * 2022-01-14 2022-05-06 北京清雷科技有限公司 Target object falling detection method and device
TWI774444B (en) * 2021-06-28 2022-08-11 萬旭電業股份有限公司 Millimeter wave radar apparatus detecting fall posture
CN114895301A (en) * 2022-05-23 2022-08-12 武汉大学 Millimeter wave radar and video-assisted indoor fall detection method and device

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US11435443B2 (en) * 2019-10-22 2022-09-06 Infineon Technologies Ag Integration of tracking with classifier in mmwave radar
CN114038012A (en) * 2021-11-08 2022-02-11 四川启睿克科技有限公司 Fall detection method and system based on millimeter wave radar and machine learning
CN115220041A (en) * 2022-06-21 2022-10-21 华中科技大学 Millimeter wave radar scale positioning method and system with Doppler compensation


Non-Patent Citations (3)

Title
Doppler Radar Fall Activity Detection Using the Wavelet Transform; Bo Yu Su et al.; IEEE Transactions on Biomedical Engineering; 2015-03-31; Vol. 62, No. 3; pp. 865-875 *
Fall detection system for the elderly based on multi-sensor fusion; Feng Tao et al.; Electronic Measurement Technology; 2016-06-15; No. 06; pp. 1-6 *
Research on fall detection using deep learning fused with ultra-wideband radar spectrograms; He Mi et al.; Journal of Radars; 2022-10-10; Vol. 11; pp. 1-13 *


Similar Documents

Publication Publication Date Title
CN111134685B (en) Fall detection method and device
CN112401856B (en) Nursing home monitoring method and system based on millimeter wave radar
US11327167B2 (en) Human target tracking system and method
CN110609281B (en) Region detection method and device
CN115372963B (en) Fall-down behavior multi-level detection method and equipment based on millimeter wave radar signals
US9261585B2 (en) Radar apparatus using image change detector and method of operating the same
US20220179062A1 (en) Detection apparatus and method
CN112782664A (en) Toilet fall detection method based on millimeter wave radar
US20230112537A1 (en) Vital information acquisition apparatus and method
CN110837079B (en) Target detection method and device based on radar
CN114814832A (en) Millimeter wave radar-based real-time monitoring system and method for human body falling behavior
CN114442079A (en) Target object falling detection method and device
CN114005246B (en) Fall detection method and device for old people based on frequency modulation continuous wave millimeter wave radar
CN111722187A (en) Radar installation parameter calculation method and device
CN113466851A (en) Human body posture detection method and device and data processing equipment
CN116400313A (en) Behavior detection method and device based on millimeter wave radar multi-domain data fusion
CN116087943A (en) Indoor falling detection method and system based on millimeter wave radar
CN111232778B (en) Method and device for counting number of people in elevator car
CN109239677A (en) A kind of environment self-adaption CFAR detection thresholding determines method
CN114690143A (en) Method and device for suppressing radar clutter, radar and medium
CN117281498B (en) Health risk early warning method and equipment based on millimeter wave radar
US11846702B2 (en) Image processing device and image processing method
CN110609254B (en) Action detection method and device based on wireless signals and electronic equipment
CN117647788B (en) Dangerous behavior identification method and device based on human body 3D point cloud
CN116224280B (en) Radar target detection method, radar target detection device, radar equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant