CN116269355B - Safety monitoring system based on figure gesture recognition - Google Patents

Safety monitoring system based on figure gesture recognition Download PDF

Info

Publication number
CN116269355B
CN116269355B CN202310525777.9A CN202310525777A CN116269355B CN 116269355 B CN116269355 B CN 116269355B CN 202310525777 A CN202310525777 A CN 202310525777A CN 116269355 B CN116269355 B CN 116269355B
Authority
CN
China
Prior art keywords
personnel
state
gesture
action
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310525777.9A
Other languages
Chinese (zh)
Other versions
CN116269355A (en
Inventor
李淑琴
肖勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi Minxuan Intelligent Science & Technology Co ltd
Original Assignee
Jiangxi Minxuan Intelligent Science & Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi Minxuan Intelligent Science & Technology Co ltd filed Critical Jiangxi Minxuan Intelligent Science & Technology Co ltd
Priority to CN202310525777.9A priority Critical patent/CN116269355B/en
Publication of CN116269355A publication Critical patent/CN116269355A/en
Application granted granted Critical
Publication of CN116269355B publication Critical patent/CN116269355B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123Discriminating type of movement, e.g. walking or running
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/746Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Abstract

The invention relates to the technical field of person safety monitoring, and particularly discloses a safety monitoring system based on person gesture recognition, which comprises the following steps: the image acquisition module is used for acquiring image information of personnel; the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel; the intelligent wearable device is used for monitoring physiological parameter information of each person; the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results; according to the system, on the basis of acquiring the personnel images to identify and analyze the personnel states, comprehensive analysis and judgment are carried out on the personnel states by combining the intelligent wearable equipment of each personnel, so that the accuracy of judgment results is improved.

Description

Safety monitoring system based on figure gesture recognition
Technical Field
The invention relates to the technical field of person safety monitoring, in particular to a safety monitoring system based on person gesture recognition.
Background
In many situations, the behavior and the state of the person need to be monitored and judged, for example, in the monitoring process of the constructor, whether the working state of the person is abnormal, whether the physical state of the person is abnormal or not and other problems need to be judged, so as to assist the manager in intelligently judging the construction site.
The existing character state monitoring system mainly utilizes AI to identify the character actions, judges whether the actions and the postures of the characters are abnormal or not so as to ensure the normal physical state of the people, and judges the working state of the people at the same time so as to be convenient for managing the people; however, the existing AI identification model has limited accuracy, so that there is a problem of erroneous judgment; meanwhile, in the prior art, a mode of monitoring the physiological parameters of the personnel is adopted to assist in completing the judgment process of the personnel state, so that the accuracy of the judgment result is improved.
For the personnel state monitoring method in the prior art, the real-time physiological parameter data of the human body, such as heartbeat, blood pressure, blood oxygen and the like, are mainly obtained, and each real-time physiological parameter data is compared with the standard range of normal physiological parameters of the human body one by one to judge, so that early warning can be carried out when obvious abnormality exists in the physiological parameters of the personnel in the judging mode, however, the judging result is only in a state of a single time point, on one hand, the fluctuation exists in the judging result, namely, a larger error exists, on the other hand, the changing state of the physiological parameters of the personnel in the time dimension is not further analyzed, and further the robustness of the judging result is poor.
Disclosure of Invention
The invention aims to provide a safety monitoring system based on character gesture recognition, which solves the following technical problems:
how to more accurately realize the monitoring process of the physiological state of the personnel in the time dimension.
The aim of the invention can be achieved by the following technical scheme:
a safety monitoring system based on character gesture recognition, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
In an embodiment, the process of analyzing the real-time data, the mean value data and the peak value data of each physiological parameter by the preset physiological analysis model is as follows:
by the formulaCalculating and obtaining a person physiological state value phy (t) at the current time point;
wherein S is the number of physiological parameter monitoring items, z E [1, S];A threshold interval corresponding to the z-th physiological parameter monitoring item; />Monitoring a measured value for the z-th physiological parameter; />A standard reference value corresponding to a z-th physiological parameter monitoring item; w is a judgment function, when->When (I)>
When (when)When (I)>;/>The corresponding early warning value of the z-th physiological parameter monitoring item; />The first preset time period is set; />Is->Within a period of timeA maximum value; />、/>、/>Is a preset coefficient, and->;/>The dimensionality-removed weighting coefficient corresponding to the z-th physiological parameter monitoring item;
the physiological state value phy (t) of the person is set to be a preset warning threshold valueAnd (3) performing comparison:
if it isGenerating an early warning signal;
otherwise, judging by combining the state analysis result.
In one embodiment, the process of identifying the image information by the identification module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
In one embodiment, the identification points include a center identification point and a plurality of edge identification points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
In one embodiment, the process of comparing the human vector sequence with the reference models of different poses includes size comparison:
the size comparison process comprises the following steps:
extracting vector module sequences in the personnel vector sequences according to a preset fixed orderAnd satisfy->
By the formulaCalculating the size deviation coefficient of the vector mode sequence of the obtained personnel relative to the j-th gesture reference model +.>
Wherein n is the number of edge recognition points, i E [1, n];For j-th pose reference model +.>Corresponding to the reference range interval>As a first judgment function, when->When (I)>The method comprises the steps of carrying out a first treatment on the surface of the Otherwise the first set of parameters is selected,;/>for interval->Intermediate value of>For interval->Is a range value of (2); />The dimension characteristic coefficient corresponding to the ith gesture is obtained;
will beAnd a preset threshold->And (3) performing comparison:
if it isJudging that the matching with the j-th gesture reference model is successful;
otherwise, judging that the matching is failed.
In an embodiment, the process of comparing the personnel vector sequences with the reference models of different poses respectively further includes vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
by the formulaCalculating to obtain vector deviation coefficient of personnel vector sequence relative to j-th attitude reference model>
Wherein, the liquid crystal display device comprises a liquid crystal display device,for the second judgment function, ++>、/>Respectively +.j in the j-th gesture reference model>Corresponding to the reference range boundary vector;
if it isIs positioned at->And->In the acute angle range, then->
Otherwise the first set of parameters is selected,;/>is vector quantityVector->Angle of (1)>For vector->Vector->Is included in the plane of the first part; />Is an angle conversion function; />The direction characteristic coefficient corresponding to the ith gesture is obtained;
selectingAnd taking the action gesture corresponding to the minimum value reference model as a judging result.
In an embodiment, the action gesture library includes an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
by the formulaCalculation and acquisitionObtain the matching value +.>
Wherein, the liquid crystal display device comprises a liquid crystal display device,for the number of human action gestures present in the corresponding gesture type set of the q-th action state category,/->;/>The occurrence probability of the action gesture of the kth personnel in the gesture type set corresponding to the qth action state category is given;
judgingAnd the maximum value corresponds to the action state type as a judging result.
In one embodiment, phy (t) is compared with a predetermined reference thresholdPerforming comparison, wherein->
If it isJudging that the physiological state of the personnel is normal;
otherwise, get personnelAction state category within a time period;
by the formulaCalculating to obtain a personnel motion quantity coefficient Y;
wherein, the liquid crystal display device comprises a liquid crystal display device,a second preset time period; b is->Action state class number in time period x epsilon 1, B];/>Duration for the x-th action state category; />An influence function of the xth action state category;
y is matched with a preset threshold valueAnd (3) performing comparison:
if it isEarly warning is carried out;
otherwise, judging that the physiological state of the personnel is normal.
The invention has the beneficial effects that:
(1) According to the invention, through the analysis process of the physiological parameter information, on one hand, when the individual physiological parameter data of the personnel exceeds the threshold value interval, the early warning process can be realized through the early warning values corresponding to the parameters; on the other hand, the real-time data, the maximum value data and the average value data are combined with process data of a person in a period of time before the current time point to form a judging model through preset coefficients, when any parameter is abnormal, the abnormal parameter can be reflected to a result, and further the fluctuation, the variability and the overall state of the physiological parameter of the user in the time dimension are judged, so that a continuous and accurate monitoring process of the physiological state of the person is realized, and the monitoring result has better robustness.
(2) Compared with the prior art, the method for judging the human action gesture directly compares the human action model skeleton with the preset skeleton, and firstly, the two-layer screening mode of size comparison and vector comparison can greatly reduce the reference comparison action gesture, further reduce the data processing amount and improve the judging efficiency; and secondly, the judging method sets different weight values according to the characteristics of different identification points, so that the influence degree of the key identification points on the result can be enlarged in the comparison and analysis process, and the suitability of the judging process is higher in the matching process, namely the accuracy of judging the action gesture of the user is improved.
Drawings
The invention is further described below with reference to the accompanying drawings.
FIG. 1 is a logic block diagram of a person gesture recognition based safety monitoring system of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring now to FIG. 1, in one embodiment, a person gesture recognition based safety monitoring system is provided, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel.
According to the technical scheme, the motion state information of the personnel is acquired through the image acquisition module and the identification module, the physiological parameter information of each personnel is monitored through the intelligent wearable equipment, the analysis results of the two are compared and analyzed, and the personnel state is safely monitored according to the state analysis and the comparison and analysis results; furthermore, when obvious problems occur in the personnel state or obvious abnormalities occur in the personnel physical state, an early warning signal can be generated to remind a manager of timely processing the problems; meanwhile, further judgment is carried out according to comparison of analysis results of the two, and when potential safety hazards exist in personnel states, corresponding early warning is timely generated, so that accuracy of judgment results is improved.
Through the analysis process of the physiological parameter information of the personnel in the embodiment, on one hand, when the single physiological parameter data of the personnel exceeds a threshold value interval, the early warning process can be realized through the early warning values corresponding to the parameters; on the other hand, the embodiment combines the process data of the personnel in a period of time before the current time point, forms a judging model by the real-time data, the maximum value data and the mean value data through preset coefficients, and can reflect the abnormal condition of any parameter into the result, so as to judge the fluctuation, the variability and the overall state of the physiological parameter of the user in the time dimension, realize the continuous accurate monitoring process of the physiological state of the personnel, and have better robustness.
It should be noted that, the image acquisition module in the system can be realized by a high-definition camera device, and the identification module is realized based on an AI character identification model; physiological parameters monitored by the intelligent wearable device include common physiological parameters such as heart rate, blood pressure, blood oxygen, body temperature and the like, and are not further described in detail in the embodiment.
The preset physiological analysis model is used for each birthThe process of analyzing the real-time data, the mean value data and the peak value data of the physical parameters comprises the following steps: by the formulaCalculating and obtaining a person physiological state value phy (t) at the current time point;
wherein S is the number of physiological parameter monitoring items, z E [1, S];A threshold interval corresponding to the z-th physiological parameter monitoring item; />Monitoring a measured value for the z-th physiological parameter; />A standard reference value corresponding to a z-th physiological parameter monitoring item; w is a judgment function, when->When (I)>
When (when)When (I)>;/>The corresponding early warning value of the z-th physiological parameter monitoring item; />The first preset time period is set; />Is->Within a period of timeA maximum value; />、/>、/>Is a preset coefficient, and->;/>The dimensionality-removed weighting coefficient corresponding to the z-th physiological parameter monitoring item;
the physiological state value phy (t) of the person is set to be a preset warning threshold valueAnd (3) performing comparison:
if it isGenerating an early warning signal;
otherwise, judging by combining the state analysis result.
Through the above technical solution, the present embodiment provides a method for evaluating a physiological state of a person, by the formula:to calculate and obtain the physiological state value phy (t) of the person at the current time point for evaluation, wherein the evaluation is mainly based on the deviation state +_ of the physiological parameter monitoring item at the current time point relative to the corresponding interval>Historical maximum state before the current point in timeAnd the historical average state before the current time point, and obviously,for real-time data of each physiological parameter +.>For the mean data of each physiological parameter, +.>Peak data for each physiological parameter; wherein, preset coefficient->、/>Fitting setting according to empirical data, and performing normalization treatment to satisfy +.>The method comprises the steps of carrying out a first treatment on the surface of the Thus by the formulaCan comprehensively evaluate the monitoring item of the ith physiological parameter, and in addition, the dimensionality-removed weighting coefficient +.>According to the value range of each physiological parameter and the influence weight, the physiological parameter is set after data fitting, so that more accurate evaluation can be carried out by synthesizing a plurality of physiological parameter monitoring items through the calculation process of a personnel physiological state value phy (t), and then the phy (t) is compared with a preset warning threshold value>In->Early warning is carried out, and the abnormal physical condition of the personnel is found in time.
The above-mentioned techniqueFirst preset time period in the surgical schemeSelecting and setting according to the scene of the system application; threshold interval->Standard reference value->The settings are selected based on empirical data corresponding to each physiological parameter item, which is not described in detail herein.
As one embodiment of the present invention, the process of identifying the image information by the identification module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
The identification points comprise a center identification point and a plurality of edge identification points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
Through the above technical scheme, the present embodiment provides a process of identifying image information and analyzing a state by the identification module, firstly, identifying a body contour of a person based on AI, and adding identification points on the body contour of the person according to the identification, in this embodiment, the identification points include a center identification point located at the center of the body and five edge identification points located at the head and limbs, distribution information of the identification points, that is, vectors of the edge identification points relative to the center identification point, is obtained through key frames in the image information, and further, in the process of analyzing the state, the distribution information of the identification points is compared with an action gesture library, and then, an action state of the person is determined according to the comparison result, and then, an action state category of the person is determined according to a plurality of action gestures of the person within a period of time.
As one embodiment of the present invention, the process of comparing the human vector sequence with the reference models of different poses, respectively, includes size comparison:
the size comparison process comprises the following steps:
extracting vector module sequences in the personnel vector sequences according to a preset fixed orderAnd satisfy->
By the formulaCalculating the size deviation coefficient of the vector mode sequence of the obtained personnel relative to the j-th gesture reference model +.>
Wherein n is the number of edge recognition points, i E [1, n];For j-th pose reference model +.>Corresponding to the reference range interval>As a first judgment function, when->When (I)>The method comprises the steps of carrying out a first treatment on the surface of the Otherwise the first set of parameters is selected,;/>for interval->Intermediate value of>For interval->Is a range value of (2); />The dimension characteristic coefficient corresponding to the ith gesture is obtained;
will beAnd a preset threshold->And (3) performing comparison:
if it isJudging that the matching with the j-th gesture reference model is successful;
otherwise, judging that the matching is failed.
Through the above technical solution, the present embodiment provides a first screening process of the comparison process, that is, performing preliminary matching by a size comparison method, and firstly extracting a vector mode sequence in a personnel vector sequence according to a preset fixed orderAnd satisfy->Then go through formula->Calculating the size deviation coefficient of the vector mode sequence of the obtained personnel relative to the j-th gesture reference model +.>Wherein the first judgment function->Is defined as: when->When (I)>The method comprises the steps of carrying out a first treatment on the surface of the Otherwise, go (L)>Corresponding reference range interval +.>Setting the dimension characteristic coefficient according to the error interval of the standard value of the jth attitude reference model>The influence weights measured according to the different edge position points are set after data fitting, thus, byCan judge the deviation state of the personnel posture data relative to the corresponding posture reference model, and then pass +.>And a preset threshold->In->And judging that the current personnel gesture is successfully matched with the jth gesture reference model, and realizing the preliminary screening process of the gesture.
As one implementation mode of the invention, the process of comparing the personnel vector sequence with the reference models with different postures respectively further comprises vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
by the formulaCalculating to obtain vector deviation coefficient of personnel vector sequence relative to j-th attitude reference model>
Wherein, the liquid crystal display device comprises a liquid crystal display device,for the second judgment function, ++>、/>Respectively +.j in the j-th gesture reference model>Corresponding to the reference range boundary vector;
if it isIs positioned at->And->In the acute angle range, then->
Otherwise the first set of parameters is selected,;/>is vector quantityVector->Angle of (1)>For vector->Vector->Is included in the plane of the first part; />Is an angle conversion function; />The direction characteristic coefficient corresponding to the ith gesture is obtained;
selectingAnd taking the action gesture corresponding to the minimum value reference model as a judging result.
Through the above technical scheme, in this embodiment, on the basis of the size comparison to complete the preliminary screening, a further screening and judging process is performed through a vector comparison process, and a formula is used for the methodCalculating to obtain vector deviation coefficient of personnel vector sequence relative to j-th attitude reference model>Wherein the boundary vector->、/>Setting the error interval of the standard vector according to the jth gesture reference model, < >>As a second judging function, if->Is positioned at->And->Within an acute angle rangeThe method comprises the steps of carrying out a first treatment on the surface of the Otherwise, go (L)>The method comprises the steps of carrying out a first treatment on the surface of the At the same time, the angle conversion function->For a preset function of quantifying the angle, the direction characteristic coefficient +.>Then the influence weights measured according to different edge position points are set after data fitting, so that the +.>The minimum value corresponds to the gesture reference model to be used for judging the gesture type of the personnel, and the current gesture type of the personnel can be determined.
Compared with the prior art which directly compares the human motion model skeleton with the preset skeleton, the method for judging the human motion gesture in the embodiment has the advantages that firstly, the two-layer screening mode of size comparison and vector comparison can greatly reduce the reference comparison motion gesture, further reduce the data processing amount and improve the judging efficiency; secondly, in the embodiment, different weight values are set for the characteristics of different identification points, so that the influence degree of key identification points on results can be enlarged in the comparison and analysis process, and the suitability of the judgment process is higher in the matching process, namely the accuracy of judging the action gestures of the user is improved.
As one embodiment of the present invention, the motion gesture library includes an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
by the formulaCalculating to obtain the matching value +.>
Wherein, the liquid crystal display device comprises a liquid crystal display device,for the number of human action gestures present in the corresponding gesture type set of the q-th action state category,/->;/>The occurrence probability of the action gesture of the kth personnel in the gesture type set corresponding to the qth action state category is given;
judgingAnd the maximum value corresponds to the action state type as a judging result.
Through the technical scheme, the method and the device for early warning according to the action gestures of the personnel firstly carry out preliminary judgment according to the action gestures of the personnel, early warning is carried out when abnormal gesture types exist, otherwise, the personnel action gestures of a plurality of key frames are compared with a preset action state library, the states of the personnel are judged, specifically, the corresponding person gesture types are firstly set according to common state types, the average duration occupied by each gesture type in the state types is counted, the occurrence probability is further determined, and then the occurrence probability is further determined through a formulaCalculating to obtain the matching value +.>,/>The greater the probability of occurrence ∈>The higher, the match value +.>The larger, and then by choosing +.>And the maximum value corresponds to the state category as a judging result, so that the current state of the personnel is accurately judged.
As one embodiment of the invention, phy (t) is compared with a preset reference threshold valueAnd performing comparison, wherein,
if it isJudging that the physiological state of the personnel is normal;
otherwise, get personnelAction state category within a time period;
by the formulaCalculating to obtain a personnel motion quantity coefficient Y;
wherein, the liquid crystal display device comprises a liquid crystal display device,a second preset time period; b is->Action state class number in time period x epsilon 1, B];/>Duration for the x-th action state category; />An influence function of the xth action state category;
y is matched with a preset threshold valueAnd (3) performing comparison:
if it isEarly warning is carried out;
otherwise, judging that the physiological state of the personnel is normal.
Through the technical scheme, the embodiment is as followsIn the state, further willphy (t) and a preset reference threshold +.>Performing comparison, and adding->Is a preset early warning threshold value->Is a preset reference threshold value, is determined according to the normal range data fitting of the physiological parameters of the human body, and is +.>Thus->In the case of normal physiological conditions, if +.>When it is, by judging personnel->The state within the period is further determined, in particular by the formula +.>Calculating to obtain a personnel motion quantity coefficient Y; influence function->Fitting the measured data according to the degree of influence of the different status categories on the status of the person, thus by +.>Can judge->The state of motion of the person in the period of time, and thus +.>The physiological parameter states of the personnel are abnormal, and then the personnel are timely reminded in an early warning mode,the accuracy of judging the physical state of the personnel is improved.
The second preset period of timeThe present embodiment is not further limited according to the application field selectivity setting of the system.
The foregoing describes one embodiment of the present invention in detail, but the description is only a preferred embodiment of the present invention and should not be construed as limiting the scope of the invention. All equivalent changes and modifications within the scope of the present invention are intended to be covered by the present invention.

Claims (7)

1. A safety monitoring system based on character gesture recognition, the system comprising:
the image acquisition module is used for acquiring image information of personnel;
the identification module is used for identifying the personnel image information and obtaining the action state information of each personnel;
the intelligent wearable device is used for monitoring physiological parameter information of each person;
the safety early warning module is used for respectively carrying out state analysis and physiological parameter analysis on each person according to the state information and the physiological parameter information, carrying out comparison analysis on the analysis results of the state analysis and the physiological parameter analysis, and carrying out safety monitoring on the state of the person according to the state analysis and the comparison analysis results;
the physiological parameter analysis process comprises the following steps:
acquiring real-time data and historical data of each physiological parameter of a person according to physiological parameter information, analyzing the historical data, and acquiring average value data and peak value data of each physiological parameter of the person;
substituting the real-time data, the mean value data and the peak value data of each physiological parameter of the personnel into a preset physiological analysis model to obtain a judgment result of the physiological state of the personnel;
the process of analyzing the real-time data, the mean value data and the peak value data of each physiological parameter by the preset physiological analysis model is as follows:
by the formulaCalculating and obtaining a person physiological state value phy (t) at the current time point;
wherein S is the number of physiological parameter monitoring items, z E [1, S];A threshold interval corresponding to the z-th physiological parameter monitoring item; />Monitoring a measured value for the z-th physiological parameter; />A standard reference value corresponding to a z-th physiological parameter monitoring item; w is a judgment function, when->When (I)>;/>The corresponding early warning value of the z-th physiological parameter monitoring item; />The first preset time period is set; />Is->In the period->A maximum value; />、/>、/>Is a preset coefficient, and->;/>The dimensionality-removed weighting coefficient corresponding to the z-th physiological parameter monitoring item;
the physiological state value phy (t) of the person is set to be a preset warning threshold valueAnd (3) performing comparison:
if it isGenerating an early warning signal;
otherwise, judging by combining the state analysis result.
2. The safety monitoring system according to claim 1, wherein the process of recognizing the image information by the recognition module includes:
identifying the body contour of the person based on the AI technology, and adding identification points on the body contour of the person;
acquiring distribution information of identification points through key frames in the image information;
the process of the state analysis comprises the following steps:
judging the current action gesture of the personnel according to the distribution information of the identification points;
and judging the action state type of the personnel according to the action postures of the personnel.
3. The person gesture recognition based safety monitoring system of claim 2, wherein the recognition points include a center recognition point and a plurality of edge recognition points;
the process for judging the current posture of the personnel comprises the following steps:
respectively establishing vectors of the edge recognition points relative to the center recognition points to obtain a personnel vector sequence;
presetting an action gesture library, and setting a reference model for each action gesture;
and respectively comparing the personnel vector sequences with reference models of different postures, and determining the current action posture of the personnel according to the comparison result.
4. A safety monitoring system based on character gesture recognition according to claim 3, wherein the process of comparing the sequence of human vectors with reference models of different gestures, respectively, comprises size comparison:
the size comparison process comprises the following steps:
extracting vector module sequences in the personnel vector sequences according to a preset fixed orderAnd satisfy->
By the formulaCalculating the size deviation coefficient of the vector mode sequence of the obtained personnel relative to the j-th gesture reference model +.>
Wherein n is the number of edge recognition points, i E [1, n];For j-th pose reference model +.>Corresponding to the reference range interval>As a first judgment function, when->When (I)>The method comprises the steps of carrying out a first treatment on the surface of the Otherwise the first set of parameters is selected,;/>for interval->Intermediate value of>For interval->Is a range value of (2); />The size characteristic coefficient corresponding to the ith vector in the jth gesture reference model is obtained;
will beAnd a preset threshold->And (3) performing comparison:
if it isJudging and the j-th gestureThe reference model is successfully matched;
otherwise, judging that the matching is failed.
5. The person gesture recognition based safety monitoring system of claim 4, wherein the process of comparing the person vector sequences with reference models of different gestures, respectively, further comprises vector comparison;
the vector comparison process comprises the following steps:
comparing the personnel vector sequence with the corresponding vector of the gesture reference model successfully matched;
by the formulaCalculating to obtain vector deviation coefficient of personnel vector sequence relative to j-th attitude reference model>
Wherein, the liquid crystal display device comprises a liquid crystal display device,for the second judgment function, ++>Is the ith vector in the sequence of human vectors; />、/>Respectively +.j in the j-th gesture reference model>Corresponding to the reference range boundary vector;
if it isIs positioned at->And->In the acute angle range, then->The method comprises the steps of carrying out a first treatment on the surface of the Otherwise the first set of parameters is selected,;/>for vector->Vector->Angle of (1)>For vector->Vector->Is included in the plane of the first part; />Is an angle conversion function;the direction characteristic coefficient corresponding to the ith vector in the jth gesture reference model is obtained;
selectingAnd taking the action gesture corresponding to the minimum value reference model as a judging result.
6. The person gesture recognition based safety monitoring system of claim 5, wherein the motion gesture library comprises an abnormal gesture type and a normal gesture type;
the process of state analysis further comprises:
acquiring the action postures of the personnel in a plurality of key frames in the image information, and judging the action postures of the personnel:
if the action gesture of the person belongs to the abnormal gesture type, early warning is carried out;
otherwise, comparing the action postures of the personnel in the key frames with a preset state class library;
a plurality of action state categories are preset in the preset state category library, and each action state category is provided with a corresponding gesture type set;
by the formulaCalculating to obtain the matching value +.>
Wherein, the liquid crystal display device comprises a liquid crystal display device,for the number of human action gestures present in the corresponding gesture type set of the q-th action state category,;/>the occurrence probability of the action gesture of the kth personnel in the gesture type set corresponding to the qth action state category is given;
judgingAnd the maximum value corresponds to the action state type as a judging result.
7. The person gesture recognition based safety monitoring system of claim 6, wherein phy (t) is compared to a predetermined reference threshold valuePerforming comparison, wherein->
If it isJudging that the physiological state of the personnel is normal;
otherwise, get personnelAction state category within a time period;
by the formulaCalculating to obtain a personnel motion quantity coefficient Y;
wherein, the liquid crystal display device comprises a liquid crystal display device,a second preset time period; b is->Action state class number in time period x epsilon 1, B];/>Duration for the x-th action state category; />The influence function is an influence function of the xth action state category, and is obtained by fitting according to the measurement and calculation data of the influence degree of different state categories on the personnel state;
y is matched with a preset threshold valueAnd (3) performing comparison:
if it isEarly warning is carried out;
otherwise, judging that the physiological state of the personnel is normal.
CN202310525777.9A 2023-05-11 2023-05-11 Safety monitoring system based on figure gesture recognition Active CN116269355B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310525777.9A CN116269355B (en) 2023-05-11 2023-05-11 Safety monitoring system based on figure gesture recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310525777.9A CN116269355B (en) 2023-05-11 2023-05-11 Safety monitoring system based on figure gesture recognition

Publications (2)

Publication Number Publication Date
CN116269355A CN116269355A (en) 2023-06-23
CN116269355B true CN116269355B (en) 2023-08-01

Family

ID=86796182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310525777.9A Active CN116269355B (en) 2023-05-11 2023-05-11 Safety monitoring system based on figure gesture recognition

Country Status (1)

Country Link
CN (1) CN116269355B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116703429B (en) * 2023-08-07 2023-12-15 深圳市磐锋精密技术有限公司 Intelligent charging tray access system based on Internet of things
CN117351405B (en) * 2023-12-06 2024-02-13 江西珉轩智能科技有限公司 Crowd behavior analysis system and method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111904400A (en) * 2020-07-17 2020-11-10 三峡大学 Electronic wrist strap
CN115969383A (en) * 2023-02-16 2023-04-18 北京科技大学 Human body physiological fatigue detection method based on electrocardiosignals and respiratory signals

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005000265A (en) * 2003-06-10 2005-01-06 Hitachi Ltd Method for building health conditions-specific onset risk knowledge and health management equipment
JP2012045373A (en) * 2010-07-26 2012-03-08 Sharp Corp Biometric apparatus, biometric method, control program for biometric apparatus, and recording medium recording the control program
RU2522400C1 (en) * 2013-04-05 2014-07-10 Общество С Ограниченной Ответственностью "Хилби" Method for determining human sleep phase favourable to awakening
CN114067358A (en) * 2021-11-02 2022-02-18 南京熊猫电子股份有限公司 Human body posture recognition method and system based on key point detection technology
CN113850535B (en) * 2021-11-30 2022-03-01 中通服建设有限公司 Intelligent construction site personnel management method based on wearable equipment
CN114283494A (en) * 2021-12-14 2022-04-05 联仁健康医疗大数据科技股份有限公司 Early warning method, device, equipment and storage medium for user falling
CN114668388A (en) * 2022-02-16 2022-06-28 深圳技术大学 Intelligent elderly health monitoring method, device, terminal and storage medium
CN114916935B (en) * 2022-07-20 2022-09-20 南京慧智灵杰信息技术有限公司 Posture analysis auxiliary correction system based on correction process of correction personnel
CN115299887B (en) * 2022-10-10 2023-01-03 安徽星辰智跃科技有限责任公司 Detection and quantification method and system for dynamic metabolic function
CN115736902B (en) * 2022-12-01 2023-07-25 广州市汇源通信建设监理有限公司 Constructor management system based on intelligent wearable equipment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111904400A (en) * 2020-07-17 2020-11-10 三峡大学 Electronic wrist strap
CN115969383A (en) * 2023-02-16 2023-04-18 北京科技大学 Human body physiological fatigue detection method based on electrocardiosignals and respiratory signals

Also Published As

Publication number Publication date
CN116269355A (en) 2023-06-23

Similar Documents

Publication Publication Date Title
CN116269355B (en) Safety monitoring system based on figure gesture recognition
CN109009017B (en) Intelligent health monitoring system and data processing method thereof
CN109117730B (en) Real-time electrocardiogram atrial fibrillation judgment method, device and system and storage medium
CN109700450B (en) Heart rate detection method and electronic equipment
JP2019084343A (en) Method and apparatus for high accuracy photoplethysmogram based atrial fibrillation detection using wearable device
US20060235319A1 (en) Trainable diagnostic system and method of use
CN111009321A (en) Application method of machine learning classification model in juvenile autism auxiliary diagnosis
CN109620244A (en) The Infants With Abnormal behavioral value method of confrontation network and SVM is generated based on condition
US20210290139A1 (en) Apparatus and method for cardiac signal processing, monitoring system comprising the same
CN115736902A (en) Constructor management system based on intelligent wearable equipment
WO2021217906A1 (en) Posture detection method, apparatus and device based on gait features, and storage medium
CN114469022A (en) Method and device for reviewing alarm event and readable storage medium
CN107970027A (en) A kind of radial artery detection and human body constitution identifying system and method
CN110659594B (en) Thermal comfort attitude estimation method based on AlphaPose
CN111407261A (en) Method and device for measuring periodic information of biological signal and electronic equipment
CN116747495A (en) Action counting method and device, terminal equipment and readable storage medium
CN116115239A (en) Embarrassing working gesture recognition method for construction workers based on multi-mode data fusion
JP2020048622A (en) Biological state estimation apparatus
CN114742090A (en) Cockpit man-machine interaction system based on mental fatigue monitoring
CN110575177B (en) Gait classification and quantification method based on Mahalanobis distance
TW202247816A (en) Non-contact heart rhythm category monitoring system and method
CN111637610A (en) Indoor environment health degree adjusting method and system based on machine vision
CN112527118A (en) Head posture recognition method based on dynamic time warping
CN112274119B (en) Pulse wave model prediction method based on neural network
CN117643461B (en) Heart rate intelligent monitoring system and method based on artificial intelligence

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant