CN113378702A - Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation - Google Patents

Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation

Info

Publication number
CN113378702A
CN113378702A
Authority
CN
China
Prior art keywords
index
fatigue
human body
trained
heart rate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110641953.6A
Other languages
Chinese (zh)
Other versions
CN113378702B (en)
Inventor
龚向阳
王激华
杨跃平
张明达
王思谨
陈高辉
马丽军
万能
江炯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Original Assignee
Ningbo Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Priority to CN202110641953.6A
Publication of CN113378702A
Application granted
Publication of CN113378702B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/1128 Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data involving training the classification device
    • A61B 5/7271 Specific aspects of physiological measurement analysis
    • A61B 5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 Classification techniques based on distances to training or reference patterns
    • G06F 18/24133 Distances to prototypes
    • G06F 18/24137 Distances to cluster centroids
    • G06F 18/2414 Smoothing the distance, e.g. radial basis function networks [RBFN]
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Psychiatry (AREA)
  • Mathematical Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Cardiology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Developmental Disabilities (AREA)

Abstract

The invention relates to a multi-feature fusion fatigue monitoring and identifying method for pole climbing operation. A hardware facility is arranged in a power dispatching center; a human body infrared video is collected and its data are sent to a background server in real time, where they are parsed and divided into single-frame pictures. The single-frame pictures are input into a trained deep learning model, which outputs an action prediction result. A vital sign monitoring module based on UWB radar detects the respiration index and the heart rate index. An LVQ-BP artificial neural network then fuses the feature indexes into a comprehensive fatigue index, and whether the human body is in a fatigue state is judged from this index. The method can monitor the physical state of a worker in real time with high accuracy and guarantees the safety of power operation and the life safety of the worker, thereby reducing the incidence of accidents.

Description

Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation
Technical Field
The invention relates to the technical field of fatigue monitoring and identification, in particular to a multi-feature fusion fatigue monitoring and identification method for pole climbing operation.
Background
In the operation and maintenance of an electric power system, in order to ensure the reliable and stable operation of a power transmission network, maintenance personnel are required to regularly perform pole climbing operation to overhaul a power transmission line.
Traditional manual pole climbing is highly strenuous, the work sites are scattered, and the working times are random, so an operator on a pole is very prone to fatigue, and fatigue during a climb can trigger a series of life-threatening hazards. Because the sites are scattered and the times random, the working state of the operators cannot be monitored and safeguarded on site without a prohibitive cost in manpower and material resources. As a result, the working state of pole climbing operators currently lacks monitoring and protection.
Outdoor pole climbing is aerial work in a complex environment with a changeable execution process. Monitoring based on a single sensor, as in traditional approaches, cannot effectively observe such an environment or identify characteristic events; with multiple sensors, on the other hand, the information dimensions and timing of the sensors are mutually independent, making their outputs difficult to integrate into structured information.
At present, no sound set of monitoring measures exists for pole climbing operation, so the life safety of workers cannot be guaranteed during the work, and a certain number of pole climbing accidents occur every year.
Disclosure of Invention
The invention aims to provide a multi-feature fusion fatigue monitoring and identifying method for pole climbing operation that can monitor the working state of a pole climbing worker in real time and ensure the safety of the operation, thereby reducing the accident rate.
The invention adopts the technical scheme that a multi-feature fusion fatigue monitoring and identifying method for pole climbing operation comprises the following steps:
(1) a depth camera, a vital sign monitoring module based on a UWB radar and a thermal imaging infrared sensing module which is arranged below the depth camera and is electrically connected with the depth camera are arranged on the pole climbing machine;
(2) detecting a human body infrared video under the lens of the depth camera through the thermal imaging infrared sensing module, and sending human body infrared video data to the background server in real time;
(3) the background server analyzes the human body infrared video data, divides the analyzed human body infrared video data into single-frame pictures, inputs the single-frame pictures into the trained deep learning model, and outputs an action prediction result;
(4) detecting the breathing index and the heart rate index of an operator through a vital sign monitoring module based on a UWB radar, and sending detected data to a background server to obtain breathing index and heart rate index results;
(5) inputting the action prediction result, the respiration index and the heart rate index into an operation monitoring module, which performs feature index fusion with an LVQ-BP artificial neural network: each feature index is first classified by the LVQ neural network, and the classified indexes are then fused by the BP neural network into a final comprehensive fatigue index; whether the human body is in a fatigue state is judged from this index, and if the comprehensive fatigue index exceeds a fatigue threshold the human body is judged to be fatigued and an early warning is issued.
The invention has the beneficial effects that the human body infrared video is collected and input into a trained deep learning model to obtain an action prediction result; meanwhile, a vital sign monitoring module based on UWB radar detects the worker's respiration index and heart rate index; finally, feature index fusion by an artificial neural network yields a comprehensive fatigue index from which it is judged whether the worker is in a fatigue state.
In the step (3), the specific method for obtaining the trained deep learning model comprises the following steps:
(3-1) collecting limb motion video stream data of various fatigue states of a human body, analyzing the limb motion video stream data of various fatigue states of the human body, dividing the analyzed video stream data into single-frame pictures, taking each picture as a sample to be trained to form a sample set to be trained, labeling the sample to be trained in the sample set to be trained, and generating a labeling vector file, wherein the specific labeling process comprises the following steps: marking key points of the human skeleton on each sample to be trained, and connecting the key points to describe the shape of the limb;
(3-2) cutting the to-be-trained samples in the to-be-trained sample set into N small pictures with the height of H and the width of W to form a training sample set;
(3-3) converting the labeling vector file in the step (3-1) into a binary image, and cutting the obtained binary image into N labels with the height of H and the width of W, wherein the labels correspond to the small images one by one;
(3-4) constructing a deep learning model, wherein the deep learning model comprises a VGG-19 network structure;
and (3-5) training the deep learning model in the step (3-4) to obtain the trained deep learning model.
In the step (3-5), the specific process of training the deep learning model is as follows: the training sample set is input into the VGG-19 network to obtain a preliminary output result; this result is fed back into the VGG-19 network to obtain the output of the next stage, and the cycle repeats, the multi-stage circulation expanding the receptive field of the convolutional network until the final output result is obtained. The final output result and the labels from step (3-3) are input into a loss function to calculate a loss value, the network weights are updated by the back propagation algorithm with the goal of driving the loss value to zero, and the iteration continues until the deep learning model is trained.
In step (3-5), the loss function is:
$$ f = \sum_{t=1}^{T} \sum_{n=1}^{N} \sum_{j=1}^{J} \sum_{p \in P} \left\| P_{n,j}^{t}(p) - P_{n,j}^{*}(p) \right\|_{2}^{2} $$

(reconstruction of the unrendered equation image BDA0003108262940000031 from the symbol definitions that follow: the squared error between the stage-t predicted heatmap and the ground-truth heatmap, summed over all stages, persons, key points and heatmap positions)
where T represents the different stages, N the persons in the picture, J the key points, and P the heatmap.
The method for acquiring the trained deep learning model has the advantages of high precision, high efficiency and low cost.
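Steps (3-2) and (3-3) amount to tiling each frame and its binarized label mask into aligned patches of height H and width W. A minimal sketch of that tiling follows; the sizes and names are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def tile_into_patches(image, patch_h, patch_w):
    """Cut a 2-D array into non-overlapping patch_h x patch_w patches,
    row by row, as in steps (3-2)/(3-3). Assumes the array dimensions
    are exact multiples of the patch size."""
    rows, cols = image.shape
    return [
        image[r:r + patch_h, c:c + patch_w]
        for r in range(0, rows, patch_h)
        for c in range(0, cols, patch_w)
    ]

# A frame and its binary label mask are cut with the same routine,
# so the i-th picture patch lines up with the i-th label patch.
frame = np.arange(16 * 16, dtype=np.uint8).reshape(16, 16)
mask = (frame % 2 == 0).astype(np.uint8)   # stand-in for the skeleton mask

pic_patches = tile_into_patches(frame, 8, 8)
lbl_patches = tile_into_patches(mask, 8, 8)
assert len(pic_patches) == len(lbl_patches) == 4   # N = 4 patches here
```

Cutting picture and label with the same routine is what gives the one-to-one correspondence required by step (3-3).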
In step (4), the specific method for obtaining the results of the respiration index and the heart rate index comprises the following steps:
(4-1) the vital sign monitoring module based on the UWB radar transmits electromagnetic wave signals to detect a human body;
(4-2) receiving the echo signal by the vital sign monitoring module based on the UWB radar;
and (4-3) the received echo signals are first subjected to direct-current component removal and digital filtering, then to feature point extraction and short-time Fourier transform to obtain the respiration index and heart rate index; the obtained indexes are finally smoothed to give the final respiration index and heart rate index results.
Non-contact detection based on UWB radar has little direct influence on the monitored human body, is very sensitive to the movement of the detection target, and offers high detection precision and fast operation.
Drawings
FIG. 1 is a flow chart of a multi-feature fused fatigue monitoring and identification method for pole climbing work according to the present invention;
FIG. 2 is a schematic structural diagram of a VGG-19 network structure in the present invention;
FIG. 3 is a flow chart of obtaining the respiration index and the heart rate index in the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings and specific embodiments, so that those skilled in the art can practice it; the scope of the invention is not limited to the specific embodiments.
The invention relates to a multi-feature fusion fatigue monitoring and identifying method for pole climbing operation which, as shown in FIG. 1, comprises the following steps:
(1) a depth camera, a vital sign monitoring module based on a UWB radar and a thermal imaging infrared sensing module which is arranged below the depth camera and is electrically connected with the depth camera are arranged on the pole climbing machine;
(2) detecting a human body infrared video under the lens of the depth camera through the thermal imaging infrared sensing module, and sending human body infrared video data to the background server in real time;
(3) the background server analyzes the human body infrared video data, divides the analyzed human body infrared video data into single-frame pictures, inputs the single-frame pictures into the trained deep learning model, and outputs an action prediction result;
(4) detecting the breathing index and the heart rate index of an operator through a vital sign monitoring module based on a UWB radar, and sending detected data to a background server to obtain breathing index and heart rate index results;
(5) inputting the action prediction result, the respiration index and the heart rate index into an operation monitoring module, which performs feature index fusion with an LVQ-BP artificial neural network: each feature index is first classified by the LVQ neural network, and the classified indexes are then fused by the BP neural network into a final comprehensive fatigue index; whether the human body is in a fatigue state is judged from this index, and if the comprehensive fatigue index exceeds a fatigue threshold the human body is judged to be fatigued and an early warning is issued.
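The LVQ stage of step (5) can be sketched with the generic LVQ1 learning rule on made-up feature vectors. This is an illustrative toy, not the patent's trained network, and the subsequent BP fusion into the comprehensive fatigue index is not shown:

```python
import numpy as np

def train_lvq1(samples, labels, prototypes, proto_labels, lr=0.1, epochs=30):
    """LVQ1: move the nearest prototype toward a sample of the same
    class, and away from a sample of a different class."""
    protos = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            k = np.argmin(np.linalg.norm(protos - x, axis=1))
            step = lr if proto_labels[k] == y else -lr
            protos[k] += step * (x - protos[k])
    return protos

def classify(x, protos, proto_labels):
    """Assign x the label of its nearest prototype."""
    return proto_labels[np.argmin(np.linalg.norm(protos - x, axis=1))]

# Toy feature vectors (action score, respiration index, heart-rate index),
# loosely clustered into "normal" (0) and "fatigued" (1).
rng = np.random.default_rng(0)
normal = rng.normal([0.2, 0.3, 0.3], 0.05, size=(20, 3))
tired = rng.normal([0.8, 0.7, 0.8], 0.05, size=(20, 3))
X = np.vstack([normal, tired])
y = np.array([0] * 20 + [1] * 20)

protos = train_lvq1(X, y,
                    prototypes=np.array([[0.5, 0.5, 0.5], [0.6, 0.6, 0.6]]),
                    proto_labels=[0, 1])
assert classify(np.array([0.15, 0.30, 0.25]), protos, [0, 1]) == 0
assert classify(np.array([0.85, 0.75, 0.80]), protos, [0, 1]) == 1
```

In the patent's pipeline the per-index class outputs would then feed a BP network whose scalar output, the comprehensive fatigue index, is compared against the fatigue threshold.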
By collecting the human body infrared video and inputting it into a trained deep learning model, the method obtains an action prediction result; meanwhile, a vital sign monitoring module based on UWB radar detects the worker's respiration index and heart rate index; finally, the feature indexes are fused by an artificial neural network into a comprehensive fatigue index from which the worker's fatigue state is judged. This is a non-contact vital sign detection mode: without touching or constraining the organism, it detects or senses physiological signals at a certain distance through a certain medium by means of external energy (the detection medium), and it is fast and widely applicable.
In the step (3), the specific method for obtaining the trained deep learning model comprises the following steps:
(3-1) collecting limb motion video stream data of various fatigue states of a human body, analyzing the limb motion video stream data of various fatigue states of the human body, dividing the analyzed video stream data into single-frame pictures, taking each picture as a sample to be trained to form a sample set to be trained, labeling the sample to be trained in the sample set to be trained, and generating a labeling vector file, wherein the specific labeling process comprises the following steps: marking key points of the human skeleton on each sample to be trained, and connecting the key points to describe the shape of the limb;
(3-2) cutting the to-be-trained samples in the to-be-trained sample set into N small pictures with the height of H and the width of W to form a training sample set;
(3-3) converting the labeling vector file in the step (3-1) into a binary image, and cutting the obtained binary image into N labels with the height of H and the width of W, wherein the labels correspond to the small images one by one;
(3-4) constructing a deep learning model, wherein the deep learning model comprises a VGG-19 network structure;
and (3-5) training the deep learning model in the step (3-4) to obtain the trained deep learning model.
In the step (3-5), the specific process of training the deep learning model is as follows: as shown in FIG. 2, the training sample set is input into the VGG-19 network to obtain a preliminary output result; this result is fed back into the VGG-19 network to obtain the output of the next stage, and the cycle repeats, the multi-stage circulation expanding the receptive field of the convolutional network until the final output result is obtained. The final output result and the labels from step (3-3) are input into a loss function to calculate a loss value, the network weights are updated by the back propagation algorithm with the goal of driving the loss value to zero, and the iteration continues until the deep learning model is trained.
In step (3-5), the loss function is:
$$ f = \sum_{t=1}^{T} \sum_{n=1}^{N} \sum_{j=1}^{J} \sum_{p \in P} \left\| P_{n,j}^{t}(p) - P_{n,j}^{*}(p) \right\|_{2}^{2} $$

(reconstruction of the unrendered equation image BDA0003108262940000051 from the symbol definitions that follow: the squared error between the stage-t predicted heatmap and the ground-truth heatmap, summed over all stages, persons, key points and heatmap positions)
where T represents the different stages, N the persons in the picture, J the key points, and P the heatmap.
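Numerically, a summed squared-error loss of this form can be sketched as below; the array shapes and names are assumptions for illustration, not the patent's code:

```python
import numpy as np

def multistage_heatmap_loss(pred_stages, gt_heatmaps):
    """Squared L2 error between predicted and ground-truth keypoint
    heatmaps, accumulated over all T stages (and, by broadcasting,
    over the N persons, J key points and heatmap positions).

    pred_stages: (T, N, J, H, W) predictions, one set per stage
    gt_heatmaps: (N, J, H, W) ground truth
    """
    loss = 0.0
    for stage_pred in pred_stages:        # t = 1 .. T
        diff = stage_pred - gt_heatmaps
        loss += np.sum(diff ** 2)
    return loss

gt = np.random.rand(2, 3, 8, 8)           # N=2 persons, J=3 key points
perfect = np.stack([gt, gt, gt])          # T=3 stages of perfect output
assert multistage_heatmap_loss(perfect, gt) == 0.0  # the training target
```

Driving this value toward zero at every stage is what the back propagation step in (3-5) aims for.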
The method for acquiring the trained deep learning model has the advantages of high precision, high efficiency and low cost.
In step (4), as shown in fig. 3, the specific method for obtaining the results of the respiration index and the heart rate index includes the following steps:
(4-1) the vital sign monitoring module based on the UWB radar transmits electromagnetic wave signals to detect a human body;
(4-2) receiving the echo signal by the vital sign monitoring module based on the UWB radar;
and (4-3) the received echo signals are first subjected to direct-current component removal and digital filtering, then to feature point extraction and short-time Fourier transform to obtain the respiration index and heart rate index; the obtained indexes are finally smoothed to give the final respiration index and heart rate index results.
Non-contact detection based on UWB radar has little direct influence on the monitored human body, is very sensitive to the movement of the detection target, and offers high detection precision and fast operation. The signal output by the ultra-wideband radar is collected and the body movement index is obtained by analyzing the frequency-domain information of the signal; feature point extraction and frequency-domain analysis extract the useful sign information, and a window function is applied during the frequency-domain analysis, which improves the operation speed.
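The chain in step (4-3), i.e. DC removal, windowing and frequency-domain peak picking, can be illustrated on a synthetic displacement signal. The sample rate, signal model and frequency bands below are assumptions for the sketch; a real echo would also need the digital filtering, feature point extraction and smoothing described above:

```python
import numpy as np

fs = 100.0                               # assumed sample rate, Hz
t = np.arange(0, 40, 1 / fs)             # 40 s observation window

# Synthetic chest displacement: DC offset + respiration (0.25 Hz,
# i.e. 15 breaths/min) + heartbeat (1.2 Hz, i.e. 72 bpm).
sig = 2.0 + 1.0 * np.sin(2 * np.pi * 0.25 * t) \
          + 0.1 * np.sin(2 * np.pi * 1.2 * t)

sig = sig - sig.mean()                   # remove the DC component
windowed = sig * np.hanning(len(sig))    # window function before the FFT
spectrum = np.abs(np.fft.rfft(windowed))
freqs = np.fft.rfftfreq(len(sig), 1 / fs)

def peak_in(lo, hi):
    """Frequency of the strongest spectral peak inside [lo, hi] Hz."""
    band = (freqs >= lo) & (freqs <= hi)
    return freqs[band][np.argmax(spectrum[band])]

resp_hz = peak_in(0.1, 0.6)              # typical respiration band
heart_hz = peak_in(0.8, 2.5)             # typical heart-rate band
assert abs(resp_hz - 0.25) < 0.03 and abs(heart_hz - 1.2) < 0.05
```

The recovered peaks correspond to a respiration index of about 15 breaths/min and a heart rate index of about 72 bpm; smoothing successive estimates, as step (4-3) describes, would then yield the final index results.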

Claims (5)

1. A fatigue monitoring and identifying method for multi-feature fusion of pole climbing operation is characterized in that: the method comprises the following steps:
(1) a depth camera, a vital sign monitoring module based on a UWB radar and a thermal imaging infrared sensing module which is arranged below the depth camera and is electrically connected with the depth camera are arranged on the pole climbing machine;
(2) detecting a human body infrared video under the lens of the depth camera through the thermal imaging infrared sensing module, and sending human body infrared video data to the background server in real time;
(3) the background server analyzes the human body infrared video data, divides the analyzed human body infrared video data into single-frame pictures, inputs the single-frame pictures into the trained deep learning model, and outputs an action prediction result;
(4) detecting the breathing index and the heart rate index of an operator through a vital sign monitoring module based on a UWB radar, and sending detected data to a background server to obtain breathing index and heart rate index results;
(5) inputting the action prediction result, the respiration index and the heart rate index into a fatigue monitoring module, which performs feature index fusion with an LVQ-BP artificial neural network: each feature index is first classified by the LVQ neural network, and the classified indexes are then fused by the BP neural network into a final comprehensive fatigue index; whether the human body is in a fatigue state is judged from this index, and if the comprehensive fatigue index exceeds a fatigue threshold the human body is judged to be fatigued and an early warning is issued.
2. The fatigue monitoring and identification method for multi-feature fusion of pole climbing operations according to claim 1, characterized in that: in the step (3), the specific method for obtaining the trained deep learning model comprises the following steps:
(3-1) collecting daily limb action video stream data of various fatigue states of a human body, analyzing the limb action video stream data of various fatigue states of the human body, cutting the analyzed video stream data into single-frame pictures, taking each picture as a sample to be trained to form a sample set to be trained, labeling the sample to be trained in the sample set to be trained, and generating a labeling vector file, wherein the specific labeling process comprises the following steps: marking key points of the human skeleton on each sample to be trained, and connecting the key points to describe the shape of the limb;
(3-2) cutting the to-be-trained samples in the to-be-trained sample set into N small pictures with the height of H and the width of W to form a training sample set;
(3-3) converting the labeling vector file in the step (3-1) into a binary image, and cutting the obtained binary image into N labels with the height of H and the width of W, wherein the labels correspond to the small images one by one;
(3-4) constructing a deep learning model, wherein the deep learning model comprises a VGG-19 network structure;
and (3-5) training the deep learning model in the step (3-4) to obtain the trained deep learning model.
3. The fatigue monitoring and identification method for multi-feature fusion of pole climbing operations according to claim 2, characterized in that: in the step (3-5), the specific method for training the deep learning model is as follows: inputting the training sample set into a VGG-19 network for processing to obtain a primary output result, then inputting the primary output result into the VGG-19 network again for processing to obtain an output result of the next stage, sequentially circulating, and obtaining an expanded convolutional network receptive field through multi-stage circulation to obtain a final output result; and (4) inputting the final output result and the label in the step (3-3) into a loss function to calculate a loss value, updating the network weight by using a back propagation algorithm with the loss value equal to 0 as a target, and continuously iterating to realize the training of the deep learning model.
4. The fatigue monitoring and identification method for multi-feature fusion of pole climbing operations according to claim 3, characterized in that: in step (3-5), the loss function is:
$$ f = \sum_{t=1}^{T} \sum_{n=1}^{N} \sum_{j=1}^{J} \sum_{p \in P} \left\| P_{n,j}^{t}(p) - P_{n,j}^{*}(p) \right\|_{2}^{2} $$

(reconstruction of the unrendered equation image FDA0003108262930000021 from the symbol definitions that follow: the squared error between the stage-t predicted heatmap and the ground-truth heatmap, summed over all stages, persons, key points and heatmap positions)
where T represents the different stages, N the persons in the picture, J the key points, and P the heatmap.
5. The fatigue monitoring and identification method for multi-feature fusion of pole climbing operations according to claim 1, characterized in that: in step (4), the specific method for obtaining the results of the respiration index and the heart rate index comprises the following steps:
(4-1) the vital sign monitoring module based on the UWB radar transmits electromagnetic wave signals to detect a human body;
(4-2) receiving the echo signal by the vital sign monitoring module based on the UWB radar;
and (4-3) firstly carrying out direct-current component removal and digital filtering processing on the received echo signals, then carrying out characteristic point extraction and short-time Fourier transform processing to obtain breathing indexes and heart rate indexes, and then carrying out smoothing processing on the obtained breathing indexes and heart rate indexes to obtain final breathing index and heart rate index results.
CN202110641953.6A 2021-06-09 2021-06-09 Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation Active CN113378702B (en)

Priority Applications (1)

Application Number: CN202110641953.6A (CN113378702B)
Priority Date: 2021-06-09; Filing Date: 2021-06-09
Title: Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation

Applications Claiming Priority (1)

Application Number: CN202110641953.6A (CN113378702B)
Priority Date: 2021-06-09; Filing Date: 2021-06-09
Title: Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation

Publications (2)

Publication Number Publication Date
CN113378702A true CN113378702A (en) 2021-09-10
CN113378702B CN113378702B (en) 2023-04-07

Family

ID=77573135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110641953.6A Active CN113378702B (en) 2021-06-09 2021-06-09 Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation

Country Status (1)

Country Link
CN (1) CN113378702B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108392211A (en) * 2018-01-11 2018-08-14 浙江大学 A kind of fatigue detection method based on Multi-information acquisition
CN108852377A (en) * 2018-04-13 2018-11-23 中国科学院苏州生物医学工程技术研究所 Human motion fatigue based on multi-physiological-parameter monitors system
CN109460703A (en) * 2018-09-14 2019-03-12 华南理工大学 A kind of non-intrusion type fatigue driving recognition methods based on heart rate and facial characteristics
CN110811649A (en) * 2019-10-31 2020-02-21 太原理工大学 Fatigue driving detection method based on bioelectricity and behavior characteristic fusion
CN110859609A (en) * 2019-11-26 2020-03-06 郑州迈拓信息技术有限公司 Multi-feature fusion fatigue driving detection method based on voice analysis
CN111166357A (en) * 2020-01-06 2020-05-19 四川宇然智荟科技有限公司 Fatigue monitoring device system with multi-sensor fusion and monitoring method thereof
US20200156649A1 (en) * 2017-09-25 2020-05-21 Bayerische Motoren Werke Aktiengesellschaft Method and Device for Evaluating a Degree of Fatigue of a Vehicle Occupant in a Vehicle
CN111968341A (en) * 2020-08-21 2020-11-20 无锡威孚高科技集团股份有限公司 Fatigue driving detection system and method
CN112131981A (en) * 2020-09-10 2020-12-25 山东大学 Driver fatigue detection method based on skeleton data behavior recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
EDMOND Q. WU et al.: "Detecting Fatigue Status of Pilots Based on Deep", IEEE Transactions on Cognitive and Developmental Systems *
WANG Chao et al.: "Before climbing the pole, checks must be made and dedicated supervision must be enforced", 《农电技术》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116667203A (en) * 2023-05-30 2023-08-29 国网湖北省电力有限公司超高压公司 Electric power basic operation safety protection method and system based on gas detector
CN116667203B (en) * 2023-05-30 2023-11-03 国网湖北省电力有限公司超高压公司 Electric power basic operation safety protection method and system based on gas detector

Also Published As

Publication number Publication date
CN113378702B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN109657592B (en) Face recognition method of intelligent excavator
CN110745704B (en) Tower crane early warning method and device
CN110543857A (en) Contraband identification method, device and system based on image analysis and storage medium
CN112396658B (en) Indoor personnel positioning method and system based on video
CN105139029B (en) A kind of Activity recognition method and device of prison prisoner
CN106580282A (en) Human body health monitoring device, system and method
CN108052900A (en) A kind of method by monitor video automatic decision dressing specification
CN111209848A (en) Real-time fall detection method based on deep learning
CN110210739B (en) Nuclear radiation detection method and system based on artificial intelligence
CN113378702B (en) Multi-feature fusion fatigue monitoring and identifying method for pole climbing operation
CN114155492A (en) High-altitude operation safety belt hanging rope high-hanging low-hanging use identification method and device and electronic equipment
CN109567832A (en) A kind of method and system of the angry driving condition of detection based on Intelligent bracelet
CN111695520A (en) High-precision child sitting posture detection and correction method and device
CN115223249A (en) Quick analysis and identification method for unsafe behaviors of underground personnel based on machine vision
CN111667599A (en) Face recognition card punching system and method
CN117576632B (en) Multi-mode AI large model-based power grid monitoring fire early warning system and method
CN111931748B (en) Worker fatigue detection method suitable for storage battery production workshop
CN111832450B (en) Knife holding detection method based on image recognition
CN104392201A (en) Human fall identification method based on omnidirectional visual sense
CN111694980A (en) Robust family child learning state visual supervision method and device
Reyes et al. Safety gear compliance detection using data augmentation-assisted transfer learning in construction work environment
US11954955B2 (en) Method and system for collecting and monitoring vehicle status information
CN111339970B (en) Smoking behavior detection method suitable for public environment
CN114202726A (en) Power transmission line scene target detection method based on self-supervision learning
CN113609987A (en) Transformer substation video monitoring system and method based on Boost pedestrian air defense misjudgment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant