CN110472614B - Identification method for motion sickness - Google Patents

Identification method for motion sickness

Info

Publication number
CN110472614B
Authority
CN
China
Prior art keywords
data
cluster
threshold value
target
speed
Prior art date
Legal status
Active
Application number
CN201910778281.6A
Other languages
Chinese (zh)
Other versions
CN110472614A (en)
Inventor
王稳
刘翔
何鸣
Current Assignee
Sichuan Free Health Information Technology Co ltd
Original Assignee
Sichuan Free Health Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Sichuan Free Health Information Technology Co ltd
Priority to CN201910778281.6A
Publication of CN110472614A
Application granted
Publication of CN110472614B
Active legal status: Current
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Abstract

The invention discloses a method for identifying syncope and belongs to the technical field of monitoring. It addresses the prior-art problems that multiple cameras must be installed to detect a syncope-related fall and that detection is inaccurate. The technical scheme of the invention is as follows: moving data in the video data is acquired by a frame difference method; figures are matched by contour similarity and area similarity through a clustering algorithm and then tracked; a circumscribed rectangle is calculated; if the movement speed of the target changes suddenly and the aspect ratio of the circumscribed rectangle exceeds a threshold value or is inverted, the person is judged to have fallen; and if the fallen time exceeds a threshold value, syncope is determined. In gymnasium management, the method solves the problem of discovering in time that a monitored target has fallen and handling it promptly; it analyzes and calculates target clusters and judges the fall and the falling posture, so that fall behavior is accurately identified.

Description

Identification method for syncope
Technical Field
The invention belongs to the technical field of monitoring, and particularly relates to a method for identifying syncope.
Background
Syncope is caused by a transient disorder of the contraction and relaxation of blood vessels, which produces temporary cerebral ischemia. Sudden stimulation, extreme fright or pain, an anemic patient standing up abruptly after sitting or squatting for a long time, heart disease, spondylopathy and the like can all cause sudden syncope.
Currently, in safety monitoring systems, a large number of monitors are arranged and the monitoring pictures are watched for long periods; when an abnormality such as syncope appears in a picture, an alarm device raises an alarm.
In the prior art, cameras are connected directly to the host of the monitoring system. Hundreds or even thousands of cameras may need to be arranged, and this connection mode leads to complex construction and high cost.
Disclosure of Invention
Aiming at the prior-art problems that a plurality of cameras must be arranged to detect a syncope-related fall and that detection is inaccurate, the invention provides a method for identifying syncope. Its purpose is to discover in time, in gymnasium management, that a monitored target has fallen and to handle it promptly; the method analyzes and calculates target clusters and judges the falling posture, so that fall behavior is accurately identified.
The technical scheme adopted by the invention is as follows:
a method for identifying syncope, comprising the steps of:
step 1: obtaining moving data in the video data by a frame difference method: the gray data of the previous frame is subtracted from the gray data of the current frame to obtain difference data, and the difference data is the change data;
step 2: eliminating the Gaussian noise of the change data with a linear filter: the change data is weighted-averaged by a Gaussian filter, and the pixel value of each point is obtained as the weighted average of that point with the other pixel values in its neighborhood;
step 3: applying median filtering to the pixel values: a two-dimensional template is formed through thresholding, all pixel values inside the template are sorted by magnitude to generate a monotonically increasing two-dimensional data sequence, and the output is:
G(x,y)=med{f(x-k,y-l),(k,l∈W)},
wherein W is the two-dimensional template, initialized as a 5×5 linear region; f(x, y) is the original image and G(x, y) is the thresholded image;
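To make steps 1 to 3 concrete, the following is a minimal Python sketch of the pre-processing chain (frame difference, Gaussian smoothing, 5×5 median filtering and thresholding) using OpenCV; the function name preprocess, the binary threshold of 25 and the Gaussian kernel size are illustrative assumptions and are not fixed by the patent.

    import cv2
    import numpy as np

    def preprocess(prev_gray: np.ndarray, curr_gray: np.ndarray) -> np.ndarray:
        """Steps 1-3: frame difference -> Gaussian smoothing -> 5x5 median filter -> threshold."""
        # Step 1: change data = |current frame gray data - previous frame gray data|
        diff = cv2.absdiff(curr_gray, prev_gray)
        # Step 2: weighted average with the neighbouring pixel values to suppress Gaussian noise
        smoothed = cv2.GaussianBlur(diff, (5, 5), 0)
        # Step 3: median filter over a 5x5 template, then thresholding to a binary mask
        filtered = cv2.medianBlur(smoothed, 5)
        _, mask = cv2.threshold(filtered, 25, 255, cv2.THRESH_BINARY)  # 25 is an assumed threshold
        return mask

In use, prev_gray and curr_gray would be consecutive video frames converted to grayscale, for example with cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).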
step 4: obtaining cluster data from the two-dimensional data sequence through a clustering algorithm with weighting coefficients. The clustering algorithm is density-based clustering (DBSCAN), a classical density-based clustering algorithm. In the gray image data processed by the above steps, the whole gray image is defined as a density space; first the radius epsilon of the neighborhood around a point that can belong to a cluster is defined, and then the minimum number of points minPts that this neighborhood must contain is defined. Here a point is a pixel whose value is 255 after the above processing; epsilon is 5 and minPts is 5 (the data width), both being empirical values. The circumscribed rectangle of each cluster and its aspect ratio are added to the cluster data, the cluster data is compared with human contour data, and a cluster is added to the judgment queue when the SIFT similarity exceeds 55%; a code sketch of this clustering step is given after sub-steps (1) to (5) below;
(1) Based on the above parameters, the points in the image data can be divided into three categories:
a. core point: an arbitrary point p is a core point if the number of points within its neighborhood (radius less than or equal to epsilon) is greater than or equal to minPts;
b. border point: a point whose neighborhood (radius less than or equal to epsilon) contains fewer than minPts points but which falls within the neighborhood of a core point;
c. outlier (noise point): a point that is neither a core point nor a border point;
(2) Select any point p, judge whether it is a core point, a border point or an outlier according to epsilon and minPts, and delete the outliers;
(3) If the distance between two core points is less than minPts, the two core points are connected together, thereby forming several clusters;
(4) According to minPts, each border point is assigned to the range of the core point closest to it;
(5) The above steps are repeated until every point either belongs to a cluster or is an outlier;
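The following is a minimal sketch of the density clustering described in sub-steps (1) to (5), assuming the binary mask produced by steps 1 to 3: the foreground pixels (value 255) are clustered with DBSCAN (epsilon = 5, minPts = 5) and each cluster receives a circumscribed rectangle and aspect ratio. The use of scikit-learn and the returned dictionary layout are illustrative choices, and the SIFT-based comparison with human contour data is omitted here.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def cluster_foreground(mask: np.ndarray, eps: float = 5.0, min_pts: int = 5) -> list:
        """Step 4: DBSCAN over foreground pixels, returning per-cluster bounding rectangles."""
        ys, xs = np.nonzero(mask == 255)            # coordinates of the pixels with value 255
        points = np.column_stack([xs, ys])
        if len(points) == 0:
            return []
        labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(points)
        clusters = []
        for label in set(labels):
            if label == -1:                         # -1 marks outliers (noise points), which are deleted
                continue
            pts = points[labels == label]
            x0, y0 = pts.min(axis=0)
            x1, y1 = pts.max(axis=0)
            w, h = int(x1 - x0 + 1), int(y1 - y0 + 1)
            clusters.append({
                "points": pts,
                "bbox": (int(x0), int(y0), w, h),   # circumscribed rectangle of the cluster
                "aspect_ratio": w / h,              # compared against the threshold in step 6
            })
        return clusters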
step 5: performing target tracking on the cluster data and recording the attribute changes of the cluster data through the tracking; a data set M is created, and if M is empty the cluster data is tracked in the above manner and added to M;
step 6: calculating the aspect ratio of the circumscribed rectangle of the cluster data and the moving speed of the cluster data; if the aspect ratio of the circumscribed rectangle of the cluster data exceeds a threshold value or is inverted, and the moving speed of the cluster data exceeds a threshold value or changes instantaneously, the cluster is set as the target cluster; defining the centroid (x, y) of the torso of the target person:
(x, y) = ( (x1 + x2 + … + xn) / n , (y1 + y2 + … + yn) / n ),
wherein n is the number of skeleton points of the torso portion and (xi, yi) are their coordinates;
the motion speed of the target person is calculated from the pixel difference between the centroid positions of the target person in the current frame and the previous frame, together with the frame rate and the running speed of the server carrying the program:
V = pixel / s × (FPS × V_run)
wherein s is the ratio between the pixel size and the logically determined object size, V_run is the frame-skip value of the frame-isolation method, and pixel is the Euclidean distance between the centroids in the previous and current frames;
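As a sketch of the centroid and speed calculation in step 6: the torso centroid is taken here as the mean of the n torso skeleton points, and the speed follows V = pixel/s × (FPS × V_run) as written above; reading V_run as a frame-skip value and the parameter defaults below are assumptions made for illustration.

    import numpy as np

    def torso_centroid(skeleton_points: np.ndarray) -> tuple:
        """Centroid (x, y) of the n skeleton points of the torso portion."""
        return float(skeleton_points[:, 0].mean()), float(skeleton_points[:, 1].mean())

    def motion_speed(prev_centroid, curr_centroid, fps: float, v_run: int = 1, s: float = 1.0) -> float:
        """V = pixel / s * (FPS * V_run), where pixel is the Euclidean distance between the
        centroids in the previous and current frames, s the pixel-to-object size ratio and
        V_run the assumed frame-skip value."""
        pixel = float(np.hypot(curr_centroid[0] - prev_centroid[0],
                               curr_centroid[1] - prev_centroid[1]))
        return pixel / s * (fps * v_run)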
step 7: in the transverse direction, the circumscribed rectangle of the target cluster is compared with the circumscribed rectangles of the other clusters; if the width of the other circumscribed rectangles is smaller than the width of the circumscribed rectangle of the target cluster and the aspect ratio of the target cluster is larger than the threshold value, the target cluster is judged to have fallen. Inversion of the aspect ratio of the circumscribed rectangle can be caused not only by a fall but also by a sudden arm movement of the target person; in that case the target person must be compared in proportion with persons at the same transverse coordinate in the scene, and only if the height of the target person in the image space has indeed changed can a fall be determined;
step 8: calculating the time for which the target cluster remains fallen, and judging syncope if the fallen time exceeds the threshold time, the target cluster produces no further instantaneous change of speed, its movement speed does not exceed the threshold value, its aspect ratio does not exceed the threshold value, and all of these conditions are satisfied.
The clustering algorithm in step 4 is density-based clustering.
Wherein, step 6 includes 2 cases:
case 1: if the aspect ratio of the circumscribed rectangle of the cluster data does not exceed the threshold value, the judgment is exited;
case 2: if the aspect ratio of the circumscribed rectangle of the cluster data exceeds the threshold value or is inverted, and the movement speed of the cluster exceeds the threshold value or changes instantaneously, the next judgment is carried out.
Wherein, the threshold time in the step 8 is 20 seconds to 40 seconds.
In summary, due to the adoption of the above technical scheme, the beneficial effects of the invention are as follows:
1. In gymnasium management, a syncope event of the monitored target can be discovered in time and handled promptly; the motion and actions of the person are analyzed and calculated, and a fall can be judged reliably, so that syncope behavior is accurately identified.
Drawings
The invention will now be described by way of example and with reference to the accompanying drawings in which:
FIG. 1 is a schematic flow chart of the present invention.
FIG. 2 is a schematic diagram of a circumscribed rectangle of a target cluster according to the present invention.
Detailed Description
All of the features disclosed in this specification, or all of the steps in a method or process disclosed, may be combined in any combination, except for mutually exclusive features and/or steps.
The present invention will be described in detail with reference to fig. 1 and 2.
A method for identifying syncope, comprising the steps of:
step 1: obtaining moving data in the video data by a frame difference method: the gray data of the previous frame is subtracted from the gray data of the current frame to obtain difference data, and the difference data is the change data;
step 2: eliminating the Gaussian noise of the change data with a linear filter: the change data is weighted-averaged by a Gaussian filter, and the pixel value of each point is obtained as the weighted average of that point with the other pixel values in its neighborhood;
step 3: applying median filtering to the pixel values: a two-dimensional template is formed through thresholding, all pixel values inside the template are sorted by magnitude to generate a monotonically increasing two-dimensional data sequence, and the output is:
G(x,y)=med{f(x-k,y-l),(k,l∈W)},
wherein W is the two-dimensional template, initialized as a 5×5 linear region; f(x, y) is the original image and G(x, y) is the thresholded image;
step 4: obtaining cluster data from the two-dimensional data sequence through a clustering algorithm with weighting coefficients. The clustering algorithm is density-based clustering (DBSCAN), a classical density-based clustering algorithm. In the gray image data processed by the above steps, the whole gray image is defined as a density space; first the radius epsilon of the neighborhood around a point that can belong to a cluster is defined, and then the minimum number of points minPts that this neighborhood must contain is defined. Here a point is a pixel whose value is 255 after the above processing; epsilon is 5 and minPts is 5 (the data width), both being empirical values. The circumscribed rectangle of each cluster and its aspect ratio are added to the cluster data, the cluster data is compared with human contour data, and a cluster is added to the judgment queue when the SIFT similarity exceeds 55%;
(1) Based on the above parameters, the points in the image data can be divided into three categories:
a. core point: an arbitrary point p is a core point if the number of points within its neighborhood (radius less than or equal to epsilon) is greater than or equal to minPts;
b. border point: a point whose neighborhood (radius less than or equal to epsilon) contains fewer than minPts points but which falls within the neighborhood of a core point;
c. outlier (noise point): a point that is neither a core point nor a border point;
(2) Select any point p, judge whether it is a core point, a border point or an outlier according to epsilon and minPts, and delete the outliers;
(3) If the distance between two core points is less than minPts, the two core points are connected together, thereby forming several clusters;
(4) According to minPts, each border point is assigned to the range of the core point closest to it;
(5) The above steps are repeated until every point either belongs to a cluster or is an outlier;
step 5: performing target tracking on the cluster data and recording the attribute changes of the cluster data through the tracking; a data set M is created, and if M is empty the cluster data is tracked in the above manner and added to M;
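A minimal sketch of the tracking in step 5, assuming the cluster dictionaries produced by the clustering sketch above: the data set M keeps one entry per tracked target, and a new cluster is matched to an existing entry by the nearest rectangle centre, a simple stand-in for the contour- and area-similarity matching mentioned in the abstract; the names and the distance limit of 50 pixels are assumptions.

    import numpy as np

    def update_tracks(M: list, clusters: list, max_dist: float = 50.0) -> None:
        """Step 5: track the cluster data across frames and record its attribute changes in M."""
        for cluster in clusters:
            x, y, w, h = cluster["bbox"]
            centre = np.array([x + w / 2.0, y + h / 2.0])
            best, best_dist = None, max_dist
            for track in M:                              # match against the targets already in M
                dist = float(np.linalg.norm(centre - track["centre"]))
                if dist < best_dist:
                    best, best_dist = track, dist
            if best is None:                             # M empty or no match: add the cluster to M
                M.append({"centre": centre, "history": [cluster]})
            else:                                        # known target: record the attribute change
                best["centre"] = centre
                best["history"].append(cluster)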
step 6: calculating the aspect ratio of the circumscribed rectangle of the cluster data and the moving speed of the cluster data; if the aspect ratio of the circumscribed rectangle of the cluster data exceeds a threshold value or is inverted, and the moving speed of the cluster data exceeds a threshold value or changes instantaneously, the cluster is set as the target cluster; defining the centroid (x, y) of the torso of the target person:
(x, y) = ( (x1 + x2 + … + xn) / n , (y1 + y2 + … + yn) / n ),
wherein n is the number of skeleton points of the torso portion and (xi, yi) are their coordinates;
the motion speed of the target person is calculated from the pixel difference between the centroid positions of the target person in the current frame and the previous frame, together with the frame rate and the running speed of the server carrying the program:
V = pixel / s × (FPS × V_run)
wherein s is the ratio between the pixel size and the logically determined object size, V_run is the frame-skip value of the frame-isolation method, and pixel is the Euclidean distance between the centroids in the previous and current frames;
the threshold value process of the aspect ratio is that the ratio of the length (y-axis length) to the width (x-axis length) of the existing rectangle of the target is calculated, and the value=x: y, if the current ratio is greater than 1 (the ratio greater than 1 typically falls within the outlier), the value is recorded and the inverse is calculated. If the rectangular shape of a certain frame changes, the ratio is towards 1 or more, the change amount between the current value and the recorded value is recorded, the change exceeds 60% of the recorded value, and the ratio of the aspect ratio at the moment to the aspect ratio of the previous recorded value is more than or equal to 1.5, and the tumbling is calculated; otherwise, the recording flow is skipped, and if the ratio change is greater than or equal to 1.5 or inversion occurs, the falling flow is directly determined.
The movement-speed threshold process is as follows: the value starts as a preset value; the motion speed of the target person is obtained and added to a set (the motion speed of the same person is taken only once), and when the set contains more than 10 values the average motion speed is calculated. If the motion speed of the target person in some frame exceeds the average motion speed by more than 30%, the threshold is considered exceeded. This threshold needs to be updated in real time and recalculated, excluding the following data from being added (a sketch follows the two exclusions below):
(1) Motion speed data that has already been determined to exceed the threshold value;
(2) Motion speed data regarded as noise (for example, a speed of about 40 pixels/s computed for a person who has produced no actual displacement).
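The movement-speed threshold process with the two exclusions can be sketched as follows; the 30% margin, the minimum of 10 samples and the 40 pixels/s noise case follow the text, while the class layout is an assumption and the per-person de-duplication of speeds is omitted.

    class SpeedThreshold:
        """Adaptive movement-speed threshold: average of collected speeds plus a 30% margin."""

        def __init__(self, min_samples: int = 10):
            self.samples = []                     # set of observed motion speeds
            self.min_samples = min_samples

        def exceeds(self, speed: float, moved: bool) -> bool:
            """Return True when `speed` exceeds the current threshold.

            Exclusion (1): a speed judged to exceed the threshold returns before being added.
            Exclusion (2): a speed around 40 pixels/s for a person with no actual displacement
            is treated as noise and is not added either.
            """
            if len(self.samples) > self.min_samples:
                average = sum(self.samples) / len(self.samples)
                if speed > 1.3 * average:         # exceeds the average movement speed by 30%
                    return True
            if not (not moved and speed >= 40.0):
                self.samples.append(speed)        # update the threshold data in real time
            return False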
Step 7: in the transverse distance, the target cluster circumscribed rectangle is compared with other cluster circumscribed rectangles, if the width of the other cluster circumscribed rectangles is smaller than the width of the target cluster circumscribed rectangle and the aspect ratio of the target cluster is larger than a threshold value, the target cluster is judged to fall, the aspect ratio of the circumscribed rectangle is inverted, besides falling under normal conditions, the sudden change of the arm action of the target person is also caused, at the moment, the target person is required to be subjected to proportion judgment with the person with transverse coordinates in the space, and the height of the target person in the image space is ensured to be changed, so that the target person can be judged to fall;
step 8:
(1) The time for which the target cluster remains fallen is calculated, and this time exceeds the threshold time;
(2) The target cluster produces no further instantaneous change of speed, and its movement speed does not exceed the threshold value;
(3) The aspect ratio does not exceed the threshold value;
if conditions (1), (2) and (3) are satisfied at the same time, the person is judged to have suffered syncope (a combined decision sketch follows);
the clustering algorithm in the step 4 is based on density cluster clustering.
Wherein, step 6 includes 2 cases:
case 1: if the aspect ratio of the circumscribed rectangle of the cluster data does not exceed the threshold value, the judgment is exited;
case 2: if the aspect ratio of the circumscribed rectangle of the cluster data exceeds the threshold value or is inverted, and the movement speed of the cluster exceeds the threshold value or changes instantaneously, the next judgment is carried out.
Wherein, the threshold time in the step 8 is 20 seconds to 40 seconds.
The foregoing examples merely represent specific embodiments of the present application; they are described in more detail but are not to be construed as limiting the scope of the present application. It should be noted that those skilled in the art can make several variations and modifications without departing from the technical solution of the present application, and these fall within the protection scope of the present application.

Claims (4)

1. A method for identifying syncope, comprising the steps of:
step 1: obtaining moving data in the video data by a frame difference method: the gray data of the previous frame is subtracted from the gray data of the current frame to obtain difference data, and the difference data is the change data;
step 2: the change data is weighted-averaged by a Gaussian filter, and the pixel value of each point is obtained after the weighted average;
step 3: the pixel values are subjected to median filtering, a two-dimensional template is formed through thresholding, all the pixel values in the two-dimensional template are ordered according to the size of the pixel values, a monotonically rising two-dimensional data sequence is generated, and the two-dimensional data sequence is output as follows:
G(x,y)=med{f(x-k,y-l),(k,l∈W)},
wherein W is a two-dimensional template, the template initialization is a 5×5 linear region, f (x, y) is an original image, and G (x, y) is an image after thresholding;
step 4: cluster data is obtained from the two-dimensional data sequence through a clustering algorithm with weighting coefficients; the circumscribed rectangle of each cluster and its aspect ratio are added, and the cluster data is compared with contour data of a human body;
step 5: performing target tracking on the clustered data, and recording attribute change of the clustered data through target tracking;
step 6: the attribute changes comprise the aspect ratio of the circumscribed rectangle and the moving speed; the aspect ratio of the circumscribed rectangle of the cluster data and the moving speed of the cluster data are calculated, and if the aspect ratio of the circumscribed rectangle of the cluster data exceeds a threshold value or is inverted and the moving speed of the cluster data exceeds the threshold value or changes instantaneously, the cluster is set as a target cluster;
defining a centroid (x, y) of the torso of the target person;
(x, y) = ( (x1 + x2 + … + xn) / n , (y1 + y2 + … + yn) / n ),
wherein n is the number of skeleton points of the torso portion and (xi, yi) are their coordinates;
the motion speed of the target person is calculated from the pixel difference between the centroid positions of the target person in the current frame and the previous frame, together with the frame rate and the running speed of the server carrying the program:
V = pixel / s × (FPS × V_run)
wherein s is the ratio between the pixel size and the logically determined object size, V_run is the frame-skip value of the frame-isolation method, and pixel is the Euclidean distance between the centroids in the previous and current frames;
the threshold value process of the aspect ratio is that the ratio of the length (y-axis length) to the width (x-axis length) of the existing rectangle of the target is calculated, and the value=x: y, if the current ratio is greater than 1 (the ratio greater than 1 typically falls within the outlier), then the value is recorded and the inverse ratio is calculated; if the rectangular shape of a certain frame changes, the ratio is towards 1 or more, the change amount between the current value and the recorded value is recorded, the change exceeds 60% of the recorded value, and the ratio of the aspect ratio at the moment to the aspect ratio of the previous recorded value is more than or equal to 1.5, and the tumbling is calculated; otherwise, skipping the recording flow, and if the ratio change is greater than or equal to 1.5 or inversion occurs, directly identifying the flow as a falling flow;
the threshold value process of the movement speed is as follows: the value is a preset value, the motion speed of the target person is obtained and added into the set (the motion speed of the same person is taken once), and when the set size is more than 10, the average motion speed is calculated; if the moving speed of the target person exceeds 30% of the average moving speed in a certain frame, the moving speed is more than a threshold value; this threshold needs to be updated in real time and needs to be recalculated with the exclusion of the following data additions:
(1) Motion speed data that has been determined to exceed a threshold value
(2) Motion speed data (character speed with speed of 40 pixels/s and without actual displacement) considered as noise data;
step 7: in the transverse direction, the circumscribed rectangle of the target cluster is compared with the circumscribed rectangles of the other clusters, and the target cluster is judged to have fallen if the width of the other circumscribed rectangles is smaller than that of the circumscribed rectangle of the target cluster and the aspect ratio of the target cluster is larger than the threshold value;
step 8: calculating the time for which the target cluster remains fallen, and judging that the target cluster represents syncope if the time exceeds the threshold time, the target cluster produces no further instantaneous change of speed, the movement speed does not exceed the threshold value, the aspect ratio does not exceed the threshold value, and all of these conditions are satisfied.
2. The method for identifying syncope according to claim 1, wherein the clustering algorithm in step 4 is density-based clustering.
3. The method for identifying syncope according to claim 1, wherein step 6 includes 2 cases:
case 1: if the aspect ratio of the circumscribed rectangle of the cluster data does not exceed the threshold value, the judgment is exited;
case 2: if the aspect ratio of the circumscribed rectangle of the cluster data exceeds the threshold value or is inverted, and the movement speed of the cluster exceeds the threshold value or changes instantaneously, the next judgment is carried out.
4. The method of claim 1, wherein the threshold time in step 8 is 20 seconds to 40 seconds.
CN201910778281.6A 2019-08-22 2019-08-22 Identification method for motion sickness Active CN110472614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910778281.6A CN110472614B (en) 2019-08-22 2019-08-22 Identification method for motion sickness

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910778281.6A CN110472614B (en) 2019-08-22 2019-08-22 Identification method for motion sickness

Publications (2)

Publication Number Publication Date
CN110472614A CN110472614A (en) 2019-11-19
CN110472614B true CN110472614B (en) 2023-06-30

Family

ID=68513455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910778281.6A Active CN110472614B (en) 2019-08-22 2019-08-22 Identification method for motion sickness

Country Status (1)

Country Link
CN (1) CN110472614B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112560723B (en) * 2020-12-22 2023-10-17 中电海康集团有限公司 Fall detection method and system based on morphological recognition and speed estimation
CN113158783B (en) * 2021-03-10 2022-11-18 重庆特斯联智慧科技股份有限公司 Community resident health monitoring method and system based on human body recognition
CN113239874A (en) * 2021-06-01 2021-08-10 平安科技(深圳)有限公司 Behavior posture detection method, device, equipment and medium based on video image
CN113628413A (en) * 2021-08-30 2021-11-09 中山大学附属第三医院(中山大学肝脏病医院) Automatic alarm and help-seeking technology for accidents of wearing and taking off protective clothing

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009266052A (en) * 2008-04-28 2009-11-12 Hitachi Ltd Abnormal behavior detector
WO2011143711A1 (en) * 2010-05-19 2011-11-24 Australasian Pork Research Institute Ltd Image analysis for making animal measurements
JP2012036005A (en) * 2010-07-12 2012-02-23 Mitsubishi Electric Corp Fall detecting device and passenger conveyor
CN102722715A (en) * 2012-05-21 2012-10-10 华南理工大学 Tumble detection method based on human body posture state judgment
CN103093197A (en) * 2013-01-15 2013-05-08 信帧电子技术(北京)有限公司 Monitoring method and system for recognizing hanging behavior
CN203328700U (en) * 2013-06-06 2013-12-11 刘翔 Motion-state tracking and real-time transmission system
FR3028343A1 (en) * 2014-11-10 2016-05-13 Centre Nat Rech Scient METHOD FOR DETECTING THE FALL OF A HUMAN SUBJECT AND CORRESPONDING ACTIMETRIC DEVICE
CN106557812A (en) * 2016-11-21 2017-04-05 北京大学 The compression of depth convolutional neural networks and speeding scheme based on dct transform
CN109635721A (en) * 2018-12-10 2019-04-16 山东大学 Video human fall detection method and system based on track weighting depth convolution sequence poolization description

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI382762B (en) * 2008-11-17 2013-01-11 Ind Tech Res Inst Method for tracking moving object
CN102044131B (en) * 2010-07-20 2013-12-11 北京紫峰华纳科技有限公司 Alarm signal processing method and human body position information-based alarm method
CN103118249A (en) * 2013-03-13 2013-05-22 胡茂林 Housebound elder intelligence video surveillance system
CN103927743B (en) * 2014-03-27 2017-04-05 中国科学院长春光学精密机械与物理研究所 The detection method of man-made target in a kind of remotely sensed image
CN104574441B (en) * 2014-12-31 2017-07-28 浙江工业大学 A kind of tumble real-time detection method based on GMM and temporal model
CN106204640A (en) * 2016-06-29 2016-12-07 长沙慧联智能科技有限公司 A kind of moving object detection system and method
CN106708084B (en) * 2016-11-24 2019-08-02 中国科学院自动化研究所 The automatic detection of obstacles of unmanned plane and barrier-avoiding method under complex environment
CN109670396B (en) * 2018-11-06 2023-06-27 华南理工大学 Fall detection method for indoor old people


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘皓. Research on a video-based fall detection algorithm with multi-feature fusion. China Master's Theses Full-text Database, Information Science and Technology, 2018, No. 06, pp. I138-1556. *
蔡耀萱. Research and implementation of a smartphone-based human fall detection method. China Master's Theses Full-text Database, Information Science and Technology, 2019, No. 01, pp. I140-1173. *

Also Published As

Publication number Publication date
CN110472614A (en) 2019-11-19

Similar Documents

Publication Publication Date Title
CN110472614B (en) Identification method for motion sickness
Fuhl et al. Eyes wide open? eyelid location and eye aperture estimation for pervasive eye tracking in real-world scenarios
CN109949341B (en) Pedestrian target tracking method based on human skeleton structural features
JP5675229B2 (en) Image processing apparatus and image processing method
JP6969611B2 (en) Information processing systems, control methods, and programs
Riche et al. Dynamic saliency models and human attention: A comparative study on videos
JP2008234208A (en) Facial region detection apparatus and program
JP2018143338A (en) Watching support system and control method thereof
JP6822328B2 (en) Watching support system and its control method
JP6043933B2 (en) Sleepiness level estimation device, sleepiness level estimation method, and sleepiness level estimation processing program
JP7005213B2 (en) Image analyzer
KR100882509B1 (en) Apparatus and Method for image based-monitoring elderly people with Principal Component Analysis
CN109299702B (en) Human behavior recognition method and system based on depth space-time diagram
JP2021531539A (en) Personal identification system and method
KR101542206B1 (en) Method and system for tracking with extraction object using coarse to fine techniques
JP6729510B2 (en) Monitoring support system and control method thereof
CN114202797A (en) Behavior recognition method, behavior recognition device and storage medium
CN109815786B (en) Gait recognition method based on regional entropy characteristics
Khashman Automatic detection, extraction and recognition of moving objects
Shrivastava et al. Measurement of psoriasis area and severity index area score of Indian psoriasis patients
JP2016170603A (en) Moving body tracking device
Juang et al. Vision-based human body posture recognition using support vector machines
JP2019008515A (en) Watching support system and method for controlling the same
US10970557B2 (en) Posture determination method, electronic system and non-transitory computer-readable recording medium
Dorgham et al. Improved elderly fall detection by surveillance video using real-time human motion analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant