CN113903147A - Radar-based human body posture distinguishing method, device, equipment and medium - Google Patents

Radar-based human body posture distinguishing method, device, equipment and medium

Info

Publication number
CN113903147A
CN113903147A (application CN202111156916.2A)
Authority
CN
China
Prior art keywords
human body
state
monitoring radar
radar
target image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111156916.2A
Other languages
Chinese (zh)
Inventor
郭耀
李跃星
王安琪
熊兆中
镇畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Time Varying Transmission Co ltd
Original Assignee
Time Varying Transmission Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Time Varying Transmission Co ltd filed Critical Time Varying Transmission Co ltd
Priority to CN202111156916.2A priority Critical patent/CN113903147A/en
Publication of CN113903147A publication Critical patent/CN113903147A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/04Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/043Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Abstract

The invention provides a radar-based method, device, equipment and medium for distinguishing human body postures, in which a monitoring radar and a camera are arranged at places where users come and go. The method comprises: acquiring a target image shot by a camera and the identifier of the monitoring radar matched with that camera, the target image being generated when the monitoring radar detects that a human body has reached a preset warning posture and triggers a warning, whereupon the camera photographs the position that triggered the warning; performing image recognition on the target image to determine the human body action state in it; and judging, according to the human body action state, whether a preset alarm gesture has occurred in the monitoring radar area corresponding to the radar's identifier. By performing this secondary judgment for a radar that has triggered a warning, the method improves the accuracy of human posture judgment and reduces the overall false alarm rate of human posture monitoring.

Description

Radar-based human body posture distinguishing method, device, equipment and medium
Technical Field
The invention relates to the technical field of safety protection, in particular to a human posture distinguishing method, a human posture distinguishing device, human posture distinguishing equipment and a human posture distinguishing medium based on radar.
Background
With the continuous development of science and technology, research into safety protection receives more and more attention. Protection against falls has become an important aspect of safety protection in recent years, particularly for the elderly: because of their declining physical condition, each fall may injure or even kill an elderly person.
At present, research on fall prevention is mainly divided into two parts: anti-falling equipment and anti-falling monitoring equipment. Anti-falling equipment can prevent elderly people from falling to a certain extent, while anti-falling monitoring equipment both indicates directions in which anti-falling equipment can be improved and provides an early warning when an elderly person does fall.
Existing anti-falling monitoring equipment mainly monitors whether a human body has fallen through a non-contact sensor, and the accuracy with which this monitoring mode judges the human posture is low.
Disclosure of Invention
The invention provides a method, a device, equipment and a medium for judging human body postures based on radar, which are used for solving the problem of low accuracy of judging human body postures in the existing anti-falling monitoring mode.
The invention provides a human body posture distinguishing method based on a radar, wherein a monitoring radar and a camera are arranged at a user entrance and exit place;
the method comprises the following steps:
acquiring a target image shot by a camera and an identifier of a monitoring radar matched with the camera; the target image is generated by shooting a human body triggering warning position by the camera when the monitoring radar detects that the human body reaches a preset warning posture and triggers warning;
carrying out image recognition on the target image to determine the human body action state in the target image;
and judging whether a preset alarm gesture occurs in a monitoring radar area corresponding to the identification of the monitoring radar according to the human body action state.
Optionally, in the method as described above, performing image recognition on the target image to determine the human body action state in the target image comprises:
inputting the target image into a trained, converged deep convolutional network model to determine the human body action state in the target image.
Optionally, in the method described above, the deep convolutional network model includes a feature extraction submodel and a human body state identification submodel;
the inputting and training of the target image into the converged deep convolutional network model to determine the human body action state in the target image comprises:
inputting the target image into a feature extraction sub-model to determine a human skeleton state diagram corresponding to the target image; the human skeleton state diagram is generated by connecting characteristic points of human parts;
and inputting the human body skeleton state diagram into the human body state recognition submodel so as to determine the human body action state according to the aspect ratio of the human body skeleton state diagram.
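The two-submodel pipeline described above — feature extraction producing a skeleton state diagram, then state recognition from its aspect ratio — can be sketched as follows. This is a minimal illustration: the data format, function names, and the width-greater-than-height rule are assumptions, and the fixed skeleton returned by the stub stands in for a real deep convolutional network.

```python
from typing import Dict, Tuple

# A skeleton state diagram is modeled here as named 2-D feature points
# (an illustrative format; the patent does not fix a data structure).
Skeleton = Dict[str, Tuple[float, float]]

def feature_extraction_submodel(image_pixels) -> Skeleton:
    """Stand-in for the feature-extraction submodel: in the patent this is
    a deep convolutional network that locates the feature points of human
    body parts and connects them into a skeleton state diagram."""
    # Hypothetical fixed output, for illustration only.
    return {"head": (50.0, 10.0), "hip": (52.0, 60.0), "foot": (55.0, 110.0)}

def state_recognition_submodel(skeleton: Skeleton) -> str:
    """Stand-in for the state-recognition submodel: classifies by the
    aspect (width-to-height) ratio of the skeleton state diagram."""
    xs = [p[0] for p in skeleton.values()]
    ys = [p[1] for p in skeleton.values()]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    # Assumed rule: a wide, flat skeleton suggests a lying state.
    return "lying" if width > height else "non-lying"

def classify_posture(image_pixels) -> str:
    """Compose the two submodels, as the optional method above describes."""
    return state_recognition_submodel(feature_extraction_submodel(image_pixels))
```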
Optionally, in the method as described above, the human body motion state includes:
a non-lying state and a lying state; the non-lying state comprises a sitting state, a standing state and a squatting state.
Optionally, in the method as described above, judging, according to the human body action state, whether a preset alarm gesture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar comprises:
if the human body action state is a non-lying state, determining that a preset alarm gesture does not occur in a monitoring radar area corresponding to the identification of the monitoring radar;
and if the human body action state is a lying state, determining that a preset alarm gesture occurs in a monitoring radar area corresponding to the identification of the monitoring radar.
Optionally, in the method as described above, if the human body action state is a lying state, after determining that a preset alarm gesture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar, the method further comprises:
and sending the identification of the monitoring radar and the human skeleton state diagram to terminal equipment of a supervisor for warning.
Optionally, in the method as described above, if the human body action state is a non-lying state, after determining that a preset alarm gesture has not occurred in the monitoring radar area corresponding to the identifier of the monitoring radar, the method further comprises:
and sending warning cancellation information to the corresponding monitoring radar according to the identification of the monitoring radar so that the monitoring radar cancels warning.
The invention provides a human body posture distinguishing device based on radar, wherein a monitoring radar and a camera are arranged at a user entrance and exit place;
the device comprises:
the acquisition module is used for acquiring a target image shot by a camera and an identifier of a monitoring radar matched with the camera; the target image is generated by shooting a human body triggering warning position by the camera when the monitoring radar detects that the human body reaches a preset warning posture and triggers warning;
the image recognition module is used for carrying out image recognition on the target image so as to determine the human body action state in the target image;
and the judging module is used for judging whether a preset alarm gesture occurs in a monitoring radar area corresponding to the identification of the monitoring radar according to the human body action state.
Optionally, in the apparatus described above, the image recognition module is specifically configured to:
and inputting and training the target image into a converged depth convolution network model to determine the human body action state in the target image.
Optionally, in the apparatus described above, the deep convolutional network model includes a feature extraction submodel and a human body state identification submodel;
when inputting the target image into the trained, converged deep convolutional network model to determine the human body action state in the target image, the image recognition module is specifically configured to:
inputting the target image into a feature extraction sub-model to determine a human skeleton state diagram corresponding to the target image; the human skeleton state diagram is generated by connecting characteristic points of human parts; and inputting the human body skeleton state diagram into the human body state recognition submodel so as to determine the human body action state according to the aspect ratio of the human body skeleton state diagram.
Optionally, in the apparatus as described above, the human body motion state includes:
a non-lying state and a lying state; the non-lying state comprises a sitting state, a standing state and a squatting state.
Optionally, in the apparatus described above, the determining module is specifically configured to:
if the human body action state is a non-lying state, determining that a preset alarm gesture does not occur in a monitoring radar area corresponding to the identification of the monitoring radar; and if the human body action state is a lying state, determining that a preset alarm gesture occurs in a monitoring radar area corresponding to the identification of the monitoring radar.
Optionally, the apparatus as described above, further comprising:
and the warning module is used for sending the identification of the monitoring radar and the human skeleton state diagram to terminal equipment of a supervisor for warning.
Optionally, the apparatus as described above, further comprising:
and the cancellation warning module is used for sending cancellation warning information to the corresponding monitoring radar according to the identification of the monitoring radar so as to enable the monitoring radar to cancel warning.
A third aspect of the present invention provides an electronic device comprising: a memory, a processor;
the memory stores instructions executable by the processor;
wherein the processor is configured to perform the method of the first aspect.
A fourth aspect of the present invention provides a computer-readable storage medium in which computer-executable instructions are stored; when executed by a processor, the computer-executable instructions implement the radar-based human body posture distinguishing method according to any one of the first aspect.
A fifth aspect of the invention provides a computer program product comprising a computer program which, when executed by a processor, implements a radar-based body pose discrimination method according to any one of the first aspects.
The invention provides a radar-based human body posture distinguishing method, device, equipment and medium, in which a monitoring radar and a camera are arranged at places where users come and go. The method comprises: acquiring a target image shot by a camera and the identifier of the monitoring radar matched with that camera, the target image being generated when the monitoring radar detects that a human body has reached a preset warning posture and triggers a warning, whereupon the camera photographs the position that triggered the warning; performing image recognition on the target image to determine the human body action state in it; and judging, according to the human body action state, whether a preset alarm gesture has occurred in the monitoring radar area corresponding to the radar's identifier. The human body action state reflects the posture and current condition of the human body, so it can be used to judge whether a preset alarm gesture, such as a human body falling, has actually occurred in the monitoring radar area that triggered the warning. By performing this secondary judgment for a radar that has triggered a warning, the method improves the accuracy of human posture judgment and reduces the overall false alarm rate of human posture monitoring.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic flowchart of a method for discriminating a human body posture based on a radar according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a system for discriminating a human body posture based on a radar according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart illustrating a method for determining human body posture based on radar according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
In the above figures, certain embodiments of the invention are illustrated; they are described in more detail below. The drawings and the description are not intended to limit the scope of the inventive concept in any way, but rather to illustrate it for those skilled in the art with reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
For a clear understanding of the technical solutions of the present application, the prior art is first described in detail. At present, anti-falling monitoring equipment generally uses a low-cost anti-falling radar to judge whether a human body has fallen. Such a radar produces only a small number of point-cloud points, so judging the human posture from its point cloud carries a certain probability of misjudgment. The accuracy with which the existing anti-falling monitoring mode judges the human posture is therefore low.
The inventor therefore found that, to solve the low accuracy of human posture judgment in the prior-art anti-falling monitoring mode, a camera can be installed and its shooting function used to photograph the area in which the monitoring radar triggered a warning, generating an image for a secondary judgment of the human posture and thereby improving the accuracy of that judgment.
The inventor proposes a technical scheme of the application based on the creative discovery.
The embodiments of the present invention will be described with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a radar-based human body posture distinguishing method according to an embodiment of the present invention. As shown in fig. 1, the execution subject of this embodiment is a radar-based human body posture distinguishing device, which may be integrated in an electronic device such as a server. The method provided by this embodiment comprises the following steps:
it should be noted that, monitoring radars and cameras are arranged at the places where users enter and exit, for example, the places where users live are arranged at the top corners of each room, and the number of the monitoring radars and the cameras can be set according to actual requirements.
And step S101, acquiring a target image shot by the camera and an identifier of the monitoring radar matched with the camera. The target image is generated by shooting the human body triggering warning position by the camera when the monitoring radar detects that the human body reaches the preset warning posture and triggers warning.
In this embodiment, the monitoring radar arranged at the user entrance/exit location monitors in real time whether a human body reaches the preset warning posture, for example whether a fall occurs. If the monitoring radar judges that a human body has fallen, it triggers a warning and notifies the matched camera. The camera then photographs the corresponding area according to the information about the warning-triggering position sent by the monitoring radar, this position being where the human body fell, and thereby generates the target image.
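The pairing of the radar's identifier with the captured image might be packaged as a single message like the sketch below; the JSON field names and transport format are illustrative assumptions, since the embodiment only requires that the image and the radar identifier reach the server together.

```python
import base64
import json
import time

def build_alert_payload(radar_id: str, jpeg_bytes: bytes) -> str:
    """Combine the monitoring radar's identifier with the captured target
    image into one message for the server, as the camera module does after
    shooting. The field names here are hypothetical; the patent only
    specifies that the image and the radar ID travel together."""
    return json.dumps({
        "radar_id": radar_id,
        "captured_at": time.time(),  # capture timestamp (assumed field)
        "image_jpeg_b64": base64.b64encode(jpeg_bytes).decode("ascii"),
    })
```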
And step S102, carrying out image recognition on the target image to determine the human motion state in the target image.
In this embodiment, the human body action state describes the posture the human body assumes in the image, and can be determined from the skeleton state of the human body in the image.
And S103, judging whether a preset alarm gesture occurs in a monitoring radar area corresponding to the identification of the monitoring radar according to the human body action state.
In this embodiment, if the human body action state is a lying state, it can be judged that the preset alarm gesture has occurred, i.e. a human body has fallen, and the monitoring radar's detection matches the actual situation. If the human body action state is a non-lying state, it can be judged that no fall has occurred and that the monitoring radar detected in error and triggered a false alarm.
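The secondary-judgment rule of step S103 can be summarized as a small dispatch function; the state names and return values here are assumptions for illustration, matching the lying/non-lying taxonomy above.

```python
def secondary_judgment(radar_id: str, motion_state: str):
    """Map the recognized human body action state to the server's reaction:
    a lying state confirms the preset alarm gesture (a fall) in the radar's
    area, while any non-lying state means the radar raised a false alarm
    and the warning should be cancelled. Return values are illustrative."""
    if motion_state == "lying":
        # Fall confirmed: alert the supervisor's terminal device (APP).
        return ("notify_app", radar_id)
    # sitting / standing / squatting -> false alarm; tell the radar to cancel.
    return ("cancel_warning", radar_id)
```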
For a better understanding of the aspects of the present invention, some embodiments of the invention will be further described below with reference to the accompanying drawings.
As shown in fig. 2 and 3, the radar-based human body posture distinguishing device of this embodiment is integrated in a server. The monitoring radar of this embodiment is a millimeter-wave radar, and every millimeter-wave radar has a matched camera module, which may contain one or more cameras. The millimeter-wave radar and the camera module are communicatively connected to the server through a WiFi module. In fig. 2 and 3, APP refers to the application program in the monitoring person's terminal device.
In the embodiment of the invention, the millimeter-wave radar performs real-time detection; once it detects that a human body has fallen, it triggers the matched camera module, which photographs the area where the fall was detected. After shooting, the camera module combines the identifier of the millimeter-wave radar with the captured image and transmits the combination to the server. The server recognizes the captured image to determine the human body action state in it. The human body part in the image is first converted into a human skeleton state diagram — for example, a stick figure — so that whether a person has really fallen can be identified from the stick figure's posture. If it is determined that no fall occurred, the millimeter-wave radar cancels its warning; if a fall is confirmed, the warning information and the stick figure's posture are pushed to the terminal device of the monitoring person or the user, where the application program raises the alarm.
In this embodiment, recognition of the image shot by the camera is based on a neural network algorithm. The data set used for training consists of real pictures shot by the local cameras, covering four postures: standing, sitting, squatting and lying. The image is processed by a deep convolutional network used for feature extraction: the network first extracts a feature map locating the key parts of the human body, and the detected part points are then connected, based on bipartite matching from graph theory, to form the stick figure to be recognized. The deep convolutional network then uses the width-to-height ratio of the stick figure to assist in judging its state — that is, the human skeleton state — which is divided into two classes: the lying state and the non-lying state.
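The connection of detected part points into a stick figure can be illustrated with a simplified greedy pairing; production systems use part-affinity scores with true bipartite matching, so the nearest-neighbour rule and the shoulder/elbow example below are assumptions, not the patent's method.

```python
from itertools import product

def pair_parts(shoulders, elbows):
    """Greedily connect each shoulder candidate to its nearest elbow
    candidate — a simplified stand-in for the bipartite matching used to
    assemble limbs into a stick figure. Points are (x, y) tuples."""
    def dist2(a, b):
        # Squared Euclidean distance between two candidate points.
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    pairs = []
    free_elbows = list(elbows)
    # Consider all shoulder-elbow candidate pairs in order of distance,
    # accepting a pair only if both endpoints are still unmatched.
    for s, e in sorted(product(shoulders, free_elbows),
                       key=lambda se: dist2(*se)):
        if e in free_elbows and all(p[0] != s for p in pairs):
            pairs.append((s, e))
            free_elbows.remove(e)
    return pairs
```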
If the final human skeleton state is a lying state, the secondary judgment confirms that the fall is real: the server notifies the APP to raise an alarm and at the same time transmits the stick-figure image data to the APP for inspection by the monitoring person or the user. If the human skeleton state is a non-lying state, it is determined that no fall occurred; the server does not notify the APP and sends a warning-cancellation notice to the radar.
The embodiment of the invention uses the camera and a neural network algorithm to realize a secondary judgment of fall detection, which improves the accuracy of the system. Compared with traditional visual detection, data need not be transmitted in real time at all times, which markedly reduces traffic consumption and the processing load of the server, allowing the server to handle more device terminals. In addition, extracting the human skeleton state from the image — for example as a stick figure — protects user privacy on the one hand and, on the other, provides the visual evidence that a radar-only system lacks.
The invention also provides an electronic device, a computer readable storage medium and a computer program product according to the embodiments of the invention.
As shown in fig. 4, fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device is intended to represent various forms of computers, such as laptops, personal digital assistants and other suitable computers; it may also represent various forms of mobile devices. The components shown here, their connections and relationships, and their functions are meant to be exemplary only and are not meant to limit the implementations of the invention described and/or claimed herein.
As shown in fig. 4, the electronic apparatus includes: processor 201, memory 202. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device.
The memory 202 is a non-transitory computer readable storage medium provided by the present invention. The memory stores instructions executable by the at least one processor, so that the at least one processor executes the method for discriminating the human posture based on the radar provided by the invention. The non-transitory computer-readable storage medium of the present invention stores computer instructions for causing a computer to execute a radar-based human body posture discrimination method provided by the present invention.
The memory 202, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to a radar-based human body posture discrimination method in the embodiments of the present invention. The processor 201 executes various functional applications and data processing of the server by running non-transitory software programs, instructions and modules stored in the memory 202, that is, implements a radar-based human body posture determination method in the above method embodiments.
Meanwhile, this embodiment also provides a computer program product; when instructions in the computer program product are executed by a processor of the electronic device, the electronic device executes the radar-based human body posture distinguishing method of the above embodiment.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the embodiments of the invention following, in general, their principles and including such departures from the present disclosure as come within known or customary practice in the art to which the embodiments of the invention pertain.
It is to be understood that the embodiments of the present invention are not limited to the precise arrangements described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of embodiments of the invention is limited only by the appended claims.

Claims (10)

1. A human body posture distinguishing method based on a radar is characterized in that a monitoring radar and a camera are arranged at a user entrance and exit place;
the method comprises the following steps:
acquiring a target image shot by a camera and an identifier of a monitoring radar matched with the camera; the target image is generated by shooting a human body triggering warning position by the camera when the monitoring radar detects that the human body reaches a preset warning posture and triggers warning;
carrying out image recognition on the target image to determine the human body action state in the target image;
and judging whether a preset alarm gesture occurs in a monitoring radar area corresponding to the identification of the monitoring radar according to the human body action state.
2. The method according to claim 1, wherein the image recognition of the target image to determine the human motion state in the target image comprises:
and inputting and training the target image into a converged depth convolution network model to determine the human body action state in the target image.
3. The method of claim 2, wherein the deep convolutional network model comprises a feature extraction submodel and a human state recognition submodel;
the inputting and training of the target image into the converged deep convolutional network model to determine the human body action state in the target image comprises:
inputting the target image into a feature extraction sub-model to determine a human skeleton state diagram corresponding to the target image; the human skeleton state diagram is generated by connecting characteristic points of human parts;
and inputting the human body skeleton state diagram into the human body state recognition submodel so as to determine the human body action state according to the aspect ratio of the human body skeleton state diagram.
4. The method according to claim 3, wherein the human body action state comprises:
a non-lying state and a lying state, wherein the non-lying state comprises a sitting state, a standing state and a squatting state.
5. The method according to claim 4, wherein the judging, according to the human body action state, whether a preset alarm posture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar comprises:
if the human body action state is the non-lying state, determining that no preset alarm posture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar; and
if the human body action state is the lying state, determining that a preset alarm posture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar.
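The decision logic of claims 5 through 7 maps the recognized state to an alarm outcome and a follow-up action. In this sketch, `send_to_supervisor` and `cancel_radar_alarm` are hypothetical hooks standing in for the notification and cancellation messages the claims describe.

```python
from typing import Callable

# Non-lying states per claim 4: sitting, standing, squatting.
NON_LYING = {"sitting", "standing", "squatting"}

def judge_and_dispatch(state: str, radar_id: str,
                       send_to_supervisor: Callable[[str], None],
                       cancel_radar_alarm: Callable[[str], None]) -> bool:
    """Return True if the preset alarm posture occurred, dispatching follow-ups."""
    if state == "lying":
        # Claim 6: forward the radar identifier (and, in the patent,
        # the skeleton state diagram) to the supervisor's terminal.
        send_to_supervisor(radar_id)
        return True
    if state in NON_LYING:
        # Claim 7: tell the radar identified by radar_id to cancel its alarm.
        cancel_radar_alarm(radar_id)
    return False
```

The radar's own trigger thus acts as a coarse first stage; the image-based state decides whether the alarm is confirmed or cancelled.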
6. The method according to claim 5, wherein after the determining that a preset alarm posture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar when the human body action state is the lying state, the method further comprises:
sending the identifier of the monitoring radar and the human skeleton state diagram to a terminal device of a supervisor for alerting.
7. The method according to claim 5, wherein after the determining that no preset alarm posture has occurred in the monitoring radar area corresponding to the identifier of the monitoring radar when the human body action state is the non-lying state, the method further comprises:
sending alarm cancellation information to the corresponding monitoring radar according to the identifier of the monitoring radar, so that the monitoring radar cancels its alarm.
8. A radar-based human body posture distinguishing apparatus, characterized in that a monitoring radar and a camera are arranged at a place where users enter and exit;
the apparatus comprises:
an acquisition module, configured to acquire a target image shot by the camera and an identifier of the monitoring radar matched with the camera, wherein the target image is generated by the camera photographing the position at which the human body triggered the alarm when the monitoring radar detects that the human body has assumed a preset alarm posture and triggers an alarm;
an image recognition module, configured to perform image recognition on the target image to determine a human body action state in the target image; and
a judging module, configured to judge, according to the human body action state, whether a preset alarm posture has occurred in a monitoring radar area corresponding to the identifier of the monitoring radar.
9. An electronic device, comprising: a memory and a processor;
the memory being configured to store instructions executable by the processor;
wherein the processor is configured to execute the instructions to perform the radar-based human body posture distinguishing method according to any one of claims 1 to 7.
10. A computer-readable storage medium having computer-executable instructions stored thereon, which, when executed by a processor, implement the radar-based human body posture distinguishing method according to any one of claims 1 to 7.
CN202111156916.2A 2021-09-30 2021-09-30 Radar-based human body posture distinguishing method, device, equipment and medium Pending CN113903147A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111156916.2A CN113903147A (en) 2021-09-30 2021-09-30 Radar-based human body posture distinguishing method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN113903147A true CN113903147A (en) 2022-01-07

Family

ID=79189508

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111156916.2A Pending CN113903147A (en) 2021-09-30 2021-09-30 Radar-based human body posture distinguishing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN113903147A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109581361A (en) * 2018-11-22 2019-04-05 九牧厨卫股份有限公司 A kind of detection method, detection device, terminal and detection system
CN109765552A (en) * 2019-02-22 2019-05-17 弗徕威智能机器人科技(上海)有限公司 It is a kind of that detection method and system are fallen down based on radar system and robot
CN110363961A (en) * 2019-07-26 2019-10-22 美的置业集团有限公司 Alarming method by monitoring, device, medium and terminal device are fallen down in a kind of user family
CN110532966A (en) * 2019-08-30 2019-12-03 深兰科技(上海)有限公司 A kind of method and apparatus carrying out tumble identification based on disaggregated model
CN210271176U (en) * 2019-04-04 2020-04-07 深圳市天鼎微波科技有限公司 Human behavior recognition device
CN111510665A (en) * 2019-01-30 2020-08-07 杭州海康威视数字技术股份有限公司 Monitoring system, monitoring method and device combining millimeter wave radar and camera
CN112581723A (en) * 2020-11-17 2021-03-30 芜湖美的厨卫电器制造有限公司 Method and device for recognizing user gesture, processor and water heater
WO2021118570A1 (en) * 2019-12-12 2021-06-17 Google Llc Radar-based monitoring of a fall by a person
CN113111767A (en) * 2021-04-09 2021-07-13 上海泗科智能科技有限公司 Fall detection method based on deep learning 3D posture assessment
CN113392765A (en) * 2021-06-15 2021-09-14 广东工业大学 Tumble detection method and system based on machine vision

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117079416A (en) * 2023-10-16 2023-11-17 德心智能科技(常州)有限公司 Multi-person 5D radar falling detection method and system based on artificial intelligence algorithm
CN117079416B (en) * 2023-10-16 2023-12-26 德心智能科技(常州)有限公司 Multi-person 5D radar falling detection method and system based on artificial intelligence algorithm

Similar Documents

Publication Publication Date Title
JP6141079B2 (en) Image processing system, image processing apparatus, control method therefor, and program
Joshi et al. A fall detection and alert system for an elderly using computer vision and Internet of Things
CN112949417A (en) Tumble behavior identification method, equipment and system
JP5001808B2 (en) Crime prevention device and crime prevention program
JP2019106631A (en) Image monitoring device
WO2022252642A1 (en) Behavior posture detection method and apparatus based on video image, and device and medium
CN109711289B (en) Riding reminding method and device, electronic equipment and storage medium
CN111597879A (en) Gesture detection method, device and system based on monitoring video
CN113903147A (en) Radar-based human body posture distinguishing method, device, equipment and medium
WO2023185037A1 (en) Action detection method and apparatus, electronic device, and storage medium
CN115984967A (en) Human body falling detection method, device and system based on deep learning
CN113955594B (en) Elevator control method and device, computer equipment and storage medium
US11783636B2 (en) System and method for detecting abnormal passenger behavior in autonomous vehicles
CN114187561A (en) Abnormal behavior identification method and device, terminal equipment and storage medium
CN114022896A (en) Target detection method and device, electronic equipment and readable storage medium
CN113660455B (en) Method, system and terminal for fall detection based on DVS data
WO2023095196A1 (en) Passenger monitoring device, passenger monitoring method, and non-transitory computer-readable medium
CN113420739B (en) Intelligent emergency monitoring method and system based on neural network and readable storage medium
CN113673319A (en) Abnormal posture detection method, abnormal posture detection device, electronic device and storage medium
CN109815828A (en) Realize the system and method for initiative alarming or help-seeking behavior detection control
CN111814590B (en) Personnel safety state monitoring method, equipment and computer readable storage medium
CN114463838A (en) Human behavior recognition method, system, electronic device and storage medium
CN113420626A (en) Construction site safety behavior judging method and storage device
CN112784676A (en) Image processing method, robot, and computer-readable storage medium
CN112749642A (en) Method and device for identifying falling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220107