CN115937829A - Method for detecting abnormal behaviors of operators in crane cab

Method for detecting abnormal behaviors of operators in crane cab

Info

Publication number
CN115937829A
Authority
CN
China
Prior art keywords
operator
eye
behavior
mobile phone
state
Prior art date
Legal status
Pending
Application number
CN202211488878.5A
Other languages
Chinese (zh)
Inventor
刘艳
徐望明
李彬
严书桃
马聪
Current Assignee
Hubei Three Gorges Internet Of Things Intellectual Property Operation Co ltd
Original Assignee
Hubei Three Gorges Internet Of Things Intellectual Property Operation Co ltd
Priority date
Filing date
Publication date
Application filed by Hubei Three Gorges Internet Of Things Intellectual Property Operation Co ltd
Priority to CN202211488878.5A
Publication of CN115937829A
Legal status: Pending

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Abstract

The invention provides a method for detecting abnormal behaviors of operators in a crane cab, comprising the following steps: reading the monitoring video of the crane cab and acquiring images of the operator at the work site; extracting features from the acquired work-site images of the operator; comprehensively analyzing the extracted image feature information, establishing corresponding behavior-judgment logic rules based on behavior priors, and judging abnormal behaviors of the operator according to these rules; and outputting a corresponding alarm signal according to the result of the abnormal-behavior judgment in step S3. The invention combines lightweight convolutional neural networks with feature analysis based on behavior priors and behavior-logic judgment rules, achieving both accuracy and speed, so that it can be deployed on embedded edge-computing devices for real-time operation while maintaining high detection accuracy.

Description

Method for detecting abnormal behaviors of operators in crane cab
Technical Field
The invention relates to the technical field of image processing and recognition, in particular to a method for detecting abnormal behaviors of operators in a crane cab.
Background
If a crane cab operator exhibits abnormal behaviors while working, such as continuous eye closure or frequent blinking, yawning, continuous head lowering or frequent nodding caused by fatigue, or prolonged mobile phone use or making and receiving calls, work efficiency is seriously affected and safety accidents may even occur. Real-time detection of and warning about abnormal behaviors of crane cab operators is therefore urgently needed.
Existing human behavior detection methods mainly include manual inspection and detection with contact-type devices. These methods cannot meet practical requirements in terms of detection accuracy, efficiency, and cost, and may adversely affect the normal work of the operator in the crane cab. Detecting abnormal behaviors of crane cab operators based on machine vision offers high detection accuracy, low cost, and friendliness to the operator. However, machine-vision-based behavior detection involves time-consuming algorithm inference, so detecting abnormal behaviors accurately and in time and raising alarms remains a challenge. At present, there is no effective scheme for real-time detection of abnormal behaviors of operators in a crane cab.
Disclosure of Invention
To address the defects of the prior art, the invention provides a method for detecting abnormal behaviors of operators in a crane cab. It solves the problems that prior-art human behavior detection methods cannot meet practical requirements in terms of detection accuracy, efficiency, and cost, and may adversely affect the normal work of the operator in the crane cab.
The aim of the invention is to detect abnormal behaviors of operators in a crane cab: the real-time monitoring feed of the crane cab is read, a machine-vision algorithm is used to detect in real time whether the operator shows continuous eye closure, frequent blinking, yawning, head lowering, or nodding, and whether the operator is playing with a mobile phone or making and receiving calls, and corresponding alarms are issued quickly and accurately for the different abnormal behaviors.
To achieve this aim, the method first obtains the monitoring video frames of the crane cab and processes and analyzes each frame to obtain the operator's face information and body posture information and to detect and locate any mobile phone target. These pieces of information are then associated for comprehensive analysis, a series of logic judgment rules based on behavior priors is established to judge whether the abnormal behaviors of interest occur, and finally a signal indicating whether to raise an alarm is output once an abnormal behavior is identified. The specific steps are as follows:
S1, reading the monitoring video of the crane cab and acquiring images of the operator at the work site.
S2, extracting features from the collected work-site images of the operator. Feature extraction comprises three parts, namely face information extraction, human body posture information extraction, and mobile phone target detection and positioning, wherein:
face information extraction acquires the coordinates of m face key points of the operator, comprising a key points for the left eye, a key points for the right eye, and b key points for the mouth, i.e., m = 2a + b; to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 1 is used to locate the operator's face and detect the operator's face key points;
human body posture information extraction acquires the coordinates of 5 body key points of the operator, namely the nose, the left and right shoulders, and the left and right hands (one key point each); to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 2 is used to locate the operator's body key points;
mobile phone target detection and positioning judges whether a mobile phone appears in the collected work images of the operator and locates its coordinates; to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 3 is used to detect and locate the mobile phone target.
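For illustration, a minimal sketch of how the S2 feature-extraction stage could be organized per frame; the three lightweight CNNs are abstracted as hypothetical callables (face_net, pose_net, phone_net) supplied by the caller, since the patent does not prescribe a specific API.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Point = Tuple[float, float]
Box = Tuple[float, float, float, float]          # (x1, y1, x2, y2)

@dataclass
class FrameFeatures:
    face_keypoints: Optional[List[Point]]        # m = 2a + b points: left eye, right eye, mouth
    body_keypoints: Optional[List[Point]]        # 5 points: nose, shoulders, hands
    phone_box: Optional[Box]                     # mobile phone location, or None if not detected

def extract_features(frame, face_net: Callable, pose_net: Callable, phone_net: Callable) -> FrameFeatures:
    """Run the three lightweight detectors on one monitoring-video frame (step S2)."""
    return FrameFeatures(face_keypoints=face_net(frame),
                         body_keypoints=pose_net(frame),
                         phone_box=phone_net(frame))
```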
S3, comprehensively analyzing the extracted image feature information, establishing corresponding behavior-judgment logic rules based on behavior priors, and judging abnormal behaviors of the operator according to these rules, specifically:
(1) Continuous eye-closing behavior judgment. The eye aspect ratio (EAR) of the operator is calculated from the coordinates of the eye key points in the face information to represent the degree of eye openness. Let the eye contour key points be Pi, i = 1, 2, …, a, with a ≥ 8 and a even, where P1 and P2 are the eye-corner key points and P3 and P4, P5 and P6, …, P(a−1) and Pa are pairs of key points on the upper and lower eyelids, respectively. Denoting by ||Pi − Pj|| the Euclidean distance between points Pi and Pj, the EAR is
EAR = (||P3 − P4|| + ||P5 − P6|| + … + ||P(a−1) − Pa||) / ((a/2 − 1) · ||P1 − P2||)
If the EAR value of an eye is smaller than the threshold Te, that eye is considered to be in the closed state; otherwise it is in the open state. To comprehensively judge whether continuous eye-closing behavior occurs, the duration t1 for which both eyes are simultaneously closed is counted; if t1 is greater than the set threshold τ1, i.e., t1 > τ1, continuous eye-closing behavior of the operator is judged to have occurred.
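The eye-closure rule above can be sketched as follows; this is an illustrative reading of the rule, with the eyelid-pair averaging in eye_aspect_ratio an assumption (the patent's equation image is not reproduced here) and Te = 0.2, τ1 = 3 s taken from the embodiment below.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def euclidean(p: Point, q: Point) -> float:
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(pts: List[Point]) -> float:
    """EAR for one eye: pts[0] and pts[1] are the eye corners; the remaining points
    form upper/lower eyelid pairs (pts[2]&pts[3], pts[4]&pts[5], ...).
    Averaging over the pairs is an assumption, not necessarily the patent's exact formula."""
    pairs = [(pts[i], pts[i + 1]) for i in range(2, len(pts), 2)]
    vertical = sum(euclidean(p, q) for p, q in pairs)
    return vertical / (len(pairs) * euclidean(pts[0], pts[1]))

class ClosedEyeTimer:
    """Flags continuous eye closure when both eyes stay closed (EAR < Te) for longer than tau1 seconds."""
    def __init__(self, t_e: float = 0.2, tau1: float = 3.0):
        self.t_e, self.tau1 = t_e, tau1
        self.closed_since: Optional[float] = None

    def update(self, ear_left: float, ear_right: float, timestamp: float) -> bool:
        if ear_left < self.t_e and ear_right < self.t_e:     # both eyes closed in this frame
            if self.closed_since is None:
                self.closed_since = timestamp
            return timestamp - self.closed_since > self.tau1
        self.closed_since = None
        return False
```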
(2) Frequent blinking behavior judgment. The eye aspect ratio (EAR) of the operator is calculated to represent the degree of eye openness. Let the eye contour key points be Pi, i = 1, 2, …, a, with a ≥ 8 and a even, where P1 and P2 are the eye-corner key points and P3 and P4, P5 and P6, …, P(a−1) and Pa are pairs of key points on the upper and lower eyelids, respectively. Denoting by ||Pi − Pj|| the Euclidean distance between points Pi and Pj, the EAR is
EAR = (||P3 − P4|| + ||P5 − P6|| + … + ||P(a−1) − Pa||) / ((a/2 − 1) · ||P1 − P2||)
If the EAR value of an eye is smaller than the threshold Te, that eye is considered to be in the closed state; otherwise it is in the open state. A state transition detected in the video stream from open eyes, through a brief period of continuous closure, to open eyes in a subsequent frame is regarded as one blink. The eye-closure duration within a single blink may be short, but it gradually lengthens as fatigue increases. Therefore, to comprehensively judge whether frequent blinking occurs, the number of frames n in which each eye is in the closed state within a specified time T1 is counted separately; if the proportion of n to the total number of video frames N within the statistical time exceeds the threshold p, i.e.,
n / N > p
It is determined that frequent blinking behavior of the operator has occurred.
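A sliding-window sketch of the frequent-blinking rule (n/N > p over T1 seconds), with T1 = 60 s and p = 0.4 taken from the embodiment below; counting closed-eye frames per eye and the frame-based window are interpretation choices.

```python
from collections import deque

class BlinkFrequencyMonitor:
    """Flags frequent blinking when the fraction of closed-eye frames within the
    last T1 seconds exceeds p."""
    def __init__(self, fps: float, t1_seconds: float = 60.0, p: float = 0.4):
        self.window = deque(maxlen=int(fps * t1_seconds))    # 1 = closed, 0 = open, one entry per frame
        self.p = p

    def update(self, eye_closed: bool) -> bool:
        self.window.append(1 if eye_closed else 0)
        if len(self.window) < self.window.maxlen:            # wait until a full T1 window is available
            return False
        return sum(self.window) / len(self.window) > self.p
```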
(3) Yawning behavior judgment. The mouth aspect ratio (MAR) of the operator is calculated from the coordinates of the mouth key points in the face information to represent the degree of mouth openness. Let the mouth contour key points be Qi, i = 1, 2, …, a, with a ≥ 8 and a even, where Q1 and Q2 are the mouth-corner key points and Q3 and Q4, Q5 and Q6, …, Q(a−1) and Qa are pairs of key points on the upper and lower lip edges, respectively. Denoting by ||Qi − Qj|| the Euclidean distance between points Qi and Qj, the MAR is
MAR = (||Q3 − Q4|| + ||Q5 − Q6|| + … + ||Q(a−1) − Qa||) / ((a/2 − 1) · ||Q1 − Q2||)
If the MAR value of the mouth is greater than the threshold Tm, the operator's mouth is considered to be in the open state; otherwise it is in the closed state. To comprehensively judge whether yawning occurs, the duration t2 of the mouth-open state is counted; if t2 is greater than the set threshold τ2, i.e., t2 > τ2, yawning behavior of the operator is judged to have occurred.
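The yawning rule mirrors the eye-closure rule; a sketch follows, with the lip-pair averaging in mouth_aspect_ratio an assumption and Tm = 0.7, τ2 = 3 s taken from the embodiment below.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def euclidean(p: Point, q: Point) -> float:
    return math.hypot(p[0] - q[0], p[1] - q[1])

def mouth_aspect_ratio(pts: List[Point]) -> float:
    """MAR: pts[0] and pts[1] are the mouth corners; the remaining points form
    upper/lower lip pairs. Averaging over the pairs is an assumption."""
    pairs = [(pts[i], pts[i + 1]) for i in range(2, len(pts), 2)]
    vertical = sum(euclidean(p, q) for p, q in pairs)
    return vertical / (len(pairs) * euclidean(pts[0], pts[1]))

class YawnTimer:
    """Flags yawning when the mouth stays open (MAR > Tm) for longer than tau2 seconds."""
    def __init__(self, t_m: float = 0.7, tau2: float = 3.0):
        self.t_m, self.tau2 = t_m, tau2
        self.open_since: Optional[float] = None

    def update(self, mar: float, timestamp: float) -> bool:
        if mar > self.t_m:
            if self.open_since is None:
                self.open_since = timestamp
            return timestamp - self.open_since > self.tau2
        self.open_since = None
        return False
```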
(4) Continuous head-lowering behavior judgment. Using the coordinates of the 3 key points of the nose tip and the two shoulders in the body posture information, the distance D1 between the operator's shoulders and the distance D2 from the nose tip to the line connecting the shoulders are calculated. During normal work, an operator sitting in the driver's seat is unlikely to lean continuously and is in a non-head-lowering state, so D1 and D2 vary little; when the operator lowers the head, D1 still changes little, but D2 decreases significantly. Therefore the ratio of D2 to D1,
r = D2 / D1
is used as the basis for judging the head-lowering state. If r is smaller than the threshold Tr, the operator is considered to be in the head-lowering state; otherwise, in the non-head-lowering state. To comprehensively judge whether continuous head-lowering behavior occurs, the duration t3 of the head-lowering state is counted; if t3 is greater than the set threshold τ3, i.e., t3 > τ3, continuous head-lowering behavior of the operator is judged to have occurred.
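A sketch of the head-lowering rule: r = D2/D1 is computed from the nose and shoulder key points and then timed against τ3; Tr = 0.33 and τ3 = 10 s are the embodiment's preferred values.

```python
import math
from typing import Optional, Tuple

Point = Tuple[float, float]

def head_down_ratio(nose: Point, l_shoulder: Point, r_shoulder: Point) -> float:
    """r = D2 / D1: D1 is the shoulder distance, D2 the perpendicular distance
    from the nose tip to the line through the two shoulders."""
    (x1, y1), (x2, y2), (x0, y0) = l_shoulder, r_shoulder, nose
    d1 = math.hypot(x2 - x1, y2 - y1)
    d2 = abs((x2 - x1) * (y1 - y0) - (x1 - x0) * (y2 - y1)) / d1   # point-to-line distance
    return d2 / d1

class HeadDownTimer:
    """Flags continuous head lowering when r < Tr persists for longer than tau3 seconds."""
    def __init__(self, t_r: float = 0.33, tau3: float = 10.0):
        self.t_r, self.tau3 = t_r, tau3
        self.down_since: Optional[float] = None

    def update(self, r: float, timestamp: float) -> bool:
        if r < self.t_r:
            if self.down_since is None:
                self.down_since = timestamp
            return timestamp - self.down_since > self.tau3
        self.down_since = None
        return False
```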
(5) Frequent nodding behavior judgment. Using the distance D1 between the operator's shoulders and the distance D2 from the nose tip to the line connecting the shoulders, the ratio
r = D2 / D1
is calculated. If r is smaller than the threshold Tr, the operator is considered to be in the head-lowering state; otherwise, in the non-head-lowering state. A state transition detected in the video stream from a lowered head in one frame to a raised head in the next frame is regarded as one nod. To comprehensively judge whether frequent nodding occurs, the number of nods k within a specified time T2 is counted separately; if k exceeds the threshold K, i.e., k > K, frequent nodding behavior of the operator is judged to have occurred.
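A sketch of the nod counter: each lowered-to-raised head transition counts as one nod, and nods are counted over a sliding T2-second window; T2 = 60 s and K = 3 are the embodiment's preferred values, while the sliding window is an interpretation choice.

```python
from typing import List

class NodCounter:
    """Counts lowered-head -> raised-head transitions (r dropping below Tr and rising again)
    and flags frequent nodding when more than K nods fall within the last T2 seconds."""
    def __init__(self, t_r: float = 0.33, t2_seconds: float = 60.0, k_threshold: int = 3):
        self.t_r, self.t2, self.k = t_r, t2_seconds, k_threshold
        self.prev_down = False
        self.nod_times: List[float] = []

    def update(self, r: float, timestamp: float) -> bool:
        down = r < self.t_r
        if self.prev_down and not down:      # head came back up: one nod completed
            self.nod_times.append(timestamp)
        self.prev_down = down
        self.nod_times = [t for t in self.nod_times if timestamp - t <= self.t2]
        return len(self.nod_times) > self.k
```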
(6) Mobile phone playing or call making/receiving behavior judgment. The detected mobile phone target position is analyzed together with the body posture information and face information. If a mobile phone is detected in the current video frame, the distance d between the mobile phone and the operator's hand is calculated; if d is smaller than the threshold Td, the operator is judged to be holding the mobile phone. The positional relationship between the hand holding the phone and the operator's face is then judged: if the hand is below the face, the operator is considered to be looking at the mobile phone; if it is beside the face and above the shoulder, the operator is considered to be making or receiving a call. To comprehensively judge whether the operator is playing with the phone or making/receiving a call, the duration t4 of the phone-watching state is counted; if t4 is greater than the set threshold τ4, i.e., t4 > τ4, mobile phone playing behavior of the operator is judged to have occurred. The duration t5 of the call state is also counted; if t5 is greater than the set threshold τ5, i.e., t5 > τ5, call making/receiving behavior of the operator is judged to have occurred.
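A sketch of the phone-behavior rule; the geometric tests (hand below the face bounding box for "watching", beside the face and above the shoulder line for "calling") are one reading of the rule, face_box and shoulder_y are assumed inputs, and τ4 = 10 s, τ5 = 60 s come from the embodiment below.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]
Box = Tuple[float, float, float, float]   # (x1, y1, x2, y2)

def euclidean(p: Point, q: Point) -> float:
    return math.hypot(p[0] - q[0], p[1] - q[1])

class PhoneBehaviorMonitor:
    """Times the 'watching the phone' and 'making/receiving a call' states and raises
    the corresponding flags when they last longer than tau4 / tau5 seconds."""
    def __init__(self, t_d: float, tau4: float = 10.0, tau5: float = 60.0):
        self.t_d, self.tau4, self.tau5 = t_d, tau4, tau5
        self.watch_since: Optional[float] = None
        self.call_since: Optional[float] = None

    def classify(self, phone_center: Optional[Point], hands: List[Point],
                 face_box: Box, shoulder_y: float) -> Optional[str]:
        """Return 'watching', 'calling', or None for the current frame
        (image coordinates: y grows downwards)."""
        if phone_center is None or not hands:
            return None
        hand = min(hands, key=lambda h: euclidean(h, phone_center))
        if euclidean(hand, phone_center) >= self.t_d:
            return None                                      # phone is not held by the operator
        x1, y1, x2, y2 = face_box
        if hand[1] > y2:                                     # hand below the face
            return "watching"
        if (hand[0] < x1 or hand[0] > x2) and hand[1] < shoulder_y:
            return "calling"                                 # beside the face, above the shoulder
        return None

    def update(self, phone_center, hands, face_box, shoulder_y, timestamp: float):
        state = self.classify(phone_center, hands, face_box, shoulder_y)
        if state == "watching":
            self.watch_since = timestamp if self.watch_since is None else self.watch_since
            self.call_since = None
        elif state == "calling":
            self.call_since = timestamp if self.call_since is None else self.call_since
            self.watch_since = None
        else:
            self.watch_since = self.call_since = None
        playing = self.watch_since is not None and timestamp - self.watch_since > self.tau4
        calling = self.call_since is not None and timestamp - self.call_since > self.tau5
        return playing, calling
```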
S4, outputting a corresponding alarm signal according to the result of the abnormal-behavior judgment in step S3.
Compared with the prior art, the invention has the following beneficial effects:
the method is oriented to the working environment of the crane cab operator, adopts a non-contact mode based on a machine vision algorithm, utilizes a lightweight convolutional neural network to extract the face information and the human body posture information of the operator and combines a target detection technology to realize the real-time detection and alarm of the specific abnormal behavior of the crane cab operator; the method combines the lightweight convolutional neural network with the characteristic analysis based on the behavior prior and the behavior logic judgment rule, and has both accuracy and rapidity, so that the method can be deployed on embedded edge computing equipment to run in real time and has higher detection accuracy.
Drawings
Fig. 1 is a flowchart of an abnormal operator behavior detection method according to the present invention.
Fig. 2 is a schematic distribution diagram of key points of an eye contour.
Fig. 3 is a schematic diagram of the distribution of key points of the mouth contour.
Detailed Description
The technical solution of the present invention is further described with reference to the drawings and the embodiments.
As shown in fig. 1 and 2, an embodiment of the present invention provides a method and an apparatus for detecting abnormal behavior of a crane cab operator, including the following steps:
s1, reading a monitoring video of a crane cab, and acquiring a working site image of an operator.
S2, performing feature extraction on the collected work-site images of the operator. Feature extraction comprises three parts, namely face information extraction, human body posture information extraction, and mobile phone target detection and positioning, as follows:
the face information extraction is to acquire coordinate information of m face key points of an operator, wherein the coordinate information comprises a left eye, a right eye and b mouths, namely m =2a + b, and in order to ensure the real-time performance and accuracy of the algorithm, the light-weight convolutional neural network 1 is adopted to position the face position of the operator and detect the face key point information of the operator; in the embodiment, the preferred use is 8 left eyes, 8 right eyes and 8 mouths, which are 24 face key points. In this embodiment, the lightweight convolutional neural network 1 preferably uses a PFLD (functional Facial Landmark Detector) network, a trunk feature extraction network of the lightweight convolutional neural network is based on MobileNetV2, the extracted features are downsampled twice to obtain 3 image features of different scales, and the image features are spliced and then subjected to full-connection layer prediction to obtain face key point coordinates.
Human body posture information extraction acquires the coordinates of 5 body key points of the operator, namely the nose, the left and right shoulders, and the left and right hands (one key point each); to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 2 is used to locate the operator's body key points. In this embodiment, lightweight convolutional neural network 2 preferably uses a BlazePose network, which adopts an encoder-decoder architecture and quickly predicts the required body key points through key point heat-map detection and key point coordinate regression.
Mobile phone target detection and positioning judges whether a mobile phone appears in the collected work images of the operator and locates its coordinates; to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 3 is used to detect and locate the mobile phone target. In this embodiment, lightweight convolutional neural network 3 preferably adopts a YOLOv5s network: the backbone consists of a Focus module and several CSP modules and extracts image features at 3 scales, the neck consists of a feature pyramid and a PAN and fuses image features of different scales, and the head outputs the target detection results, including target position, class, and confidence.
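For reference, a minimal sketch of standing up this feature-extraction stage with publicly available stand-ins: MediaPipe FaceMesh in place of the PFLD landmark detector, MediaPipe Pose (which is built on BlazePose), and a COCO-pretrained YOLOv5s loaded from torch.hub as the phone detector. These are substitutes, not the networks trained in this embodiment.

```python
# Stand-ins for the three lightweight networks; not the patent's trained models.
import cv2
import mediapipe as mp
import torch

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)          # face landmarks (PFLD substitute)
pose = mp.solutions.pose.Pose()                                       # body key points (BlazePose-based)
phone_detector = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)  # COCO detector

def analyze_frame(bgr_frame):
    """Return face landmarks, body landmarks, and detected phones for one frame."""
    rgb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB)
    face = face_mesh.process(rgb).multi_face_landmarks        # None if no face is found
    body = pose.process(rgb).pose_landmarks                   # None if no person is found
    detections = phone_detector(rgb).pandas().xyxy[0]
    phones = detections[detections["name"] == "cell phone"]   # COCO "cell phone" class as a proxy
    return face, body, phones
```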
S3, comprehensively analyzing the extracted image feature information, establishing corresponding behavior-judgment logic rules based on behavior priors, and judging abnormal behaviors of the operator according to these rules, specifically:
(1) Continuous eye-closing behavior judgment. The eye aspect ratio (EAR) of the operator is calculated from the coordinates of the eye key points in the face information to represent the degree of eye openness. Let the eye contour key points be Pi, i = 1, 2, …, 8, where P1 and P2 are the eye-corner key points and P3 and P4, P5 and P6, P7 and P8 are pairs of key points on the upper and lower eyelids, respectively. Denoting by ||Pi − Pj|| the Euclidean distance between points Pi and Pj, the EAR is
EAR = (||P3 − P4|| + ||P5 − P6|| + ||P7 − P8||) / (3 · ||P1 − P2||)
If the EAR value of an eye is smaller than the threshold Te, that eye is considered to be in the closed state; otherwise it is in the open state. To comprehensively judge whether continuous eye-closing behavior occurs, the duration t1 for which both eyes are simultaneously closed is counted; if t1 is greater than the set threshold τ1, i.e., t1 > τ1, continuous eye-closing behavior of the operator is judged to have occurred. In this embodiment, Te is 0.2 and τ1 is 3 seconds.
(2) Frequent blinking behavior judgment. The eye aspect ratio (EAR) of the operator is calculated to represent the degree of eye openness. Let the eye contour key points be Pi, i = 1, 2, …, 8, where P1 and P2 are the eye-corner key points and P3 and P4, P5 and P6, P7 and P8 are pairs of key points on the upper and lower eyelids, respectively. Denoting by ||Pi − Pj|| the Euclidean distance between points Pi and Pj, the EAR is
EAR = (||P3 − P4|| + ||P5 − P6|| + ||P7 − P8||) / (3 · ||P1 − P2||)
If the EAR value of an eye is smaller than the threshold Te, that eye is considered to be in the closed state; otherwise it is in the open state. A state transition detected in the video stream from open eyes, through a brief period of continuous closure, to open eyes in a subsequent frame is regarded as one blink. The eye-closure duration within a single blink may be short, but it gradually lengthens as fatigue increases. Therefore, to comprehensively judge whether frequent blinking occurs, the number of frames n in which each eye is in the closed state within a specified time T1 is counted separately; if the proportion of n to the total number of video frames N within the statistical time exceeds the threshold p, i.e.,
n / N > p
Frequent blinking behavior of the operator is judged to have occurred. In this embodiment, T1 is preferably 60 seconds and p is 0.4.
(3) Yawning behavior judgment. The mouth aspect ratio (MAR) of the operator is calculated from the coordinates of the mouth key points in the face information to represent the degree of mouth openness. Let the mouth contour key points be Qi, i = 1, 2, …, 8, where Q1 and Q2 are the mouth-corner key points and Q3 and Q4, Q5 and Q6, Q7 and Q8 are pairs of key points on the upper and lower lip edges, respectively. Denoting by ||Qi − Qj|| the Euclidean distance between points Qi and Qj, the MAR is
MAR = (||Q3 − Q4|| + ||Q5 − Q6|| + ||Q7 − Q8||) / (3 · ||Q1 − Q2||)
If the MAR value of the mouth is greater than the threshold Tm, the operator's mouth is considered to be in the open state; otherwise it is in the closed state. To comprehensively judge whether yawning occurs, the duration t2 of the mouth-open state is counted; if t2 is greater than the set threshold τ2, i.e., t2 > τ2, yawning behavior of the operator is judged to have occurred. In this embodiment, Tm is preferably 0.7 and τ2 is 3 seconds.
(4) Continuous head-lowering behavior judgment. Using the coordinates of the 3 key points of the nose tip and the two shoulders in the body posture information, the distance D1 between the operator's shoulders and the distance D2 from the nose tip to the line connecting the shoulders are calculated. During normal work, an operator sitting in the driver's seat is unlikely to lean continuously and is in a non-head-lowering state, so D1 and D2 vary little; when the operator lowers the head, D1 still changes little, but D2 decreases significantly. Therefore the ratio of D2 to D1,
r = D2 / D1
is used as the basis for judging the head-lowering state. If r is smaller than the threshold Tr, the operator is considered to be in the head-lowering state; otherwise, in the non-head-lowering state. To comprehensively judge whether continuous head-lowering behavior occurs, the duration t3 of the head-lowering state is counted; if t3 is greater than the set threshold τ3, i.e., t3 > τ3, continuous head-lowering behavior of the operator is judged to have occurred. In this embodiment, Tr is preferably 0.33 and τ3 is 10 seconds.
(5) Frequent nodding behavior judgment. Using the distance D1 between the operator's shoulders and the distance D2 from the nose tip to the line connecting the shoulders, the ratio
r = D2 / D1
is calculated. If r is smaller than the threshold Tr, the operator is considered to be in the head-lowering state; otherwise, in the non-head-lowering state. A state transition detected in the video stream from a lowered head in one frame to a raised head in the next frame is regarded as one nod. To comprehensively judge whether frequent nodding occurs, the number of nods k within a specified time T2 is counted separately; if k exceeds the threshold K, i.e., k > K, frequent nodding behavior of the operator is judged to have occurred. In this embodiment, T2 is preferably 60 seconds and K is 3.
(6) Mobile phone playing or call making/receiving behavior judgment. The detected mobile phone target position is analyzed together with the body posture information and face information. If a mobile phone is detected in the current video frame, the distance d between the mobile phone and the operator's hand is calculated; if d is smaller than the threshold Td, the operator is judged to be holding the mobile phone. The positional relationship between the hand holding the phone and the operator's face is then judged: if the hand is below the face, the operator is considered to be looking at the mobile phone; if it is beside the face and above the shoulder, the operator is considered to be making or receiving a call. To comprehensively judge whether the operator is playing with the phone or making/receiving a call, the duration t4 of the phone-watching state is counted; if t4 is greater than the set threshold τ4, i.e., t4 > τ4, mobile phone playing behavior of the operator is judged to have occurred. The duration t5 of the call state is also counted; if t5 is greater than the set threshold τ5, i.e., t5 > τ5, call making/receiving behavior of the operator is judged to have occurred. In this embodiment, τ4 is preferably 10 seconds and τ5 is 60 seconds.
S4, outputting a corresponding alarm signal according to the result of the abnormal-behavior judgment in step S3. Specifically, when one or more of continuous eye-closing, frequent blinking, yawning, continuous head-lowering, frequent nodding, mobile phone playing, or call making/receiving behaviors of the operator are detected, an audible and visual alarm is triggered inside the crane cab to alert the operator, and an alarm is simultaneously raised in the ground dispatch monitoring room to prompt safety supervisors to check the situation inside the cab and reinforce manual supervision. This ultimately improves the safety of crane operation and safeguards safe production.
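As an illustration of the S4 alarm logic, the sketch below maps the per-behavior flags produced in S3 to the two alarm channels described above. trigger_cab_alarm and notify_dispatch_room are hypothetical interfaces, and the thresholds noted in the comments are the preferred values of this embodiment.

```python
# Hypothetical alarm interfaces; the mapping of flags to alarms follows the S4 description.
ABNORMAL_BEHAVIORS = (
    "continuous_eye_closure",    # t1 > 3 s   (Te = 0.2)
    "frequent_blinking",         # n/N > 0.4 within T1 = 60 s
    "yawning",                   # t2 > 3 s   (Tm = 0.7)
    "continuous_head_lowering",  # t3 > 10 s  (Tr = 0.33)
    "frequent_nodding",          # k > 3 within T2 = 60 s
    "phone_watching",            # t4 > 10 s
    "phone_call",                # t5 > 60 s
)

def output_alarms(flags: dict, trigger_cab_alarm, notify_dispatch_room) -> None:
    """S4: raise both the in-cab audible/visual alarm and the dispatch-room alert
    whenever at least one abnormal behavior is flagged."""
    active = [b for b in ABNORMAL_BEHAVIORS if flags.get(b)]
    if active:
        trigger_cab_alarm(active)         # audible and visual alarm inside the crane cab
        notify_dispatch_room(active)      # alert safety supervisors in the ground monitoring room
```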
Finally, the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solution of the present invention without departing from its spirit and scope, and all such modifications should be covered by the claims of the present invention.

Claims (10)

1. A method for detecting abnormal behaviors of operators in a crane cab is characterized by comprising the following steps:
S1, reading the monitoring video of the crane cab and acquiring images of the operator at the work site;
S2, extracting features from the collected work-site images of the operator, wherein the feature extraction comprises three parts, namely face information extraction, human body posture information extraction, and mobile phone target detection and positioning;
S3, comprehensively analyzing the extracted image feature information, establishing corresponding behavior-judgment logic rules based on behavior priors, and judging abnormal behaviors of the operator according to these rules, wherein the abnormal behavior judgment comprises: continuous eye-closing behavior judgment, frequent blinking behavior judgment, yawning behavior judgment, continuous head-lowering behavior judgment, frequent nodding behavior judgment, and mobile phone playing or call making/receiving behavior judgment;
and S4, outputting a corresponding alarm signal according to the result of the abnormal-behavior judgment in step S3.
2. A crane cab operator abnormal behavior detection method as claimed in claim 1, wherein: the face information extraction in step S2 acquires the coordinates of m face key points of the operator, comprising a key points for the left eye, a key points for the right eye, and b key points for the mouth, i.e., m = 2a + b; to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 1 is used to locate the operator's face and detect the operator's face key points.
3. The method for detecting abnormal behavior of an operator in a crane cab according to claim 1, wherein: the human body posture information extraction in step S2 acquires the coordinates of 5 body key points of the operator, namely the nose, the left and right shoulders, and the left and right hands (one key point each); to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 2 is used to locate the operator's body key points.
4. The method for detecting abnormal behavior of an operator in a crane cab according to claim 1, wherein: the mobile phone target detection and positioning in step S2 judges whether a mobile phone appears in the collected work images of the operator and locates its coordinates; to ensure real-time performance and accuracy of the algorithm, lightweight convolutional neural network 3 is used to detect and locate the mobile phone target.
5. The method for detecting abnormal behavior of an operator in a crane cab as claimed in claim 2, wherein: regarding the continuous eye-closing behavior judgment in step S3, the eye aspect ratio (EAR) of the operator is calculated from the coordinates of the eye key points in the face information to represent the degree of eye openness; let the eye contour key points be Pi, i = 1, 2, …, a, with a ≥ 8 and a even, where P1 and P2 are the eye-corner key points and P3 and P4, P5 and P6, …, P(a−1) and Pa are pairs of key points on the upper and lower eyelids, respectively; denoting by ||Pi − Pj|| the Euclidean distance between points Pi and Pj, the EAR is
EAR = (||P3 − P4|| + ||P5 − P6|| + … + ||P(a−1) − Pa||) / ((a/2 − 1) · ||P1 − P2||)
if the EAR value of an eye is smaller than the threshold Te, that eye is considered to be in the closed state, otherwise in the open state; to comprehensively judge whether continuous eye-closing behavior of the operator occurs, the duration t1 for which both eyes are simultaneously closed is counted; if t1 is greater than the set threshold τ1, i.e., t1 > τ1, continuous eye-closing behavior of the operator is judged to have occurred.
6. The method for detecting abnormal behavior of an operator in a crane cab as claimed in claim 2, wherein: regarding the frequent blinking behavior judgment in step S3, the eye aspect ratio (EAR) of the operator is calculated to represent the degree of eye openness; let the eye contour key points be Pi, i = 1, 2, …, a, with a ≥ 8 and a even, where P1 and P2 are the eye-corner key points and P3 and P4, P5 and P6, …, P(a−1) and Pa are pairs of key points on the upper and lower eyelids, respectively; denoting by ||Pi − Pj|| the Euclidean distance between points Pi and Pj, the EAR is
EAR = (||P3 − P4|| + ||P5 − P6|| + … + ||P(a−1) − Pa||) / ((a/2 − 1) · ||P1 − P2||)
if the EAR value of an eye is smaller than the threshold Te, that eye is considered to be in the closed state, otherwise in the open state; a state transition detected in the video stream from open eyes, through a brief period of continuous closure, to open eyes in a subsequent frame is regarded as one blink; the eye-closure duration within a single blink may be short but gradually lengthens as fatigue increases, so to comprehensively judge whether frequent blinking occurs, the number of frames n in which each eye is in the closed state within a specified time T1 is counted separately; if the proportion of n to the total number of video frames N within the statistical time exceeds the threshold p, i.e.,
n / N > p
It is determined that frequent blinking behavior of the operator has occurred.
7. The method for detecting abnormal behavior of an operator in a crane cab as claimed in claim 2, wherein: regarding the yawning behavior judgment in step S3, the mouth aspect ratio (MAR) of the operator is calculated from the coordinates of the mouth key points in the face information to represent the degree of mouth openness; let the mouth contour key points be Qi, i = 1, 2, …, a, with a ≥ 8 and a even, where Q1 and Q2 are the mouth-corner key points and Q3 and Q4, Q5 and Q6, …, Q(a−1) and Qa are pairs of key points on the upper and lower lip edges, respectively; denoting by ||Qi − Qj|| the Euclidean distance between points Qi and Qj, the MAR is
MAR = (||Q3 − Q4|| + ||Q5 − Q6|| + … + ||Q(a−1) − Qa||) / ((a/2 − 1) · ||Q1 − Q2||)
if the MAR value of the mouth is greater than the threshold Tm, the operator's mouth is considered to be in the open state, otherwise in the closed state; to comprehensively judge whether yawning behavior of the operator occurs, the duration t2 of the mouth-open state is counted; if t2 is greater than the set threshold τ2, i.e., t2 > τ2, yawning behavior of the operator is judged to have occurred.
8. The method for detecting abnormal behavior of an operator in a crane cab as claimed in claim 2, wherein: regarding the continuous head-lowering behavior judgment in step S3, the distance D1 between the operator's shoulders and the distance D2 from the nose tip to the line connecting the shoulders are calculated using the coordinates of the 3 key points of the nose tip and the two shoulders in the body posture information; during normal work, an operator sitting in the driver's seat is unlikely to lean continuously and is in a non-head-lowering state, so D1 and D2 vary little; when the operator lowers the head, D1 still changes little but D2 decreases significantly, so the ratio of D2 to D1,
r = D2 / D1
is used as the basis for judging the head-lowering state; if r is smaller than the threshold Tr, the operator is considered to be in the head-lowering state, otherwise in the non-head-lowering state; to comprehensively judge whether continuous head-lowering behavior of the operator occurs, the duration t3 of the head-lowering state is counted; if t3 is greater than the set threshold τ3, i.e., t3 > τ3, continuous head-lowering behavior of the operator is judged to have occurred.
9. The method for detecting abnormal behavior of an operator in a crane cab as claimed in claim 2, wherein: regarding the frequent nodding behavior judgment in step S3, the distance D1 between the operator's shoulders and the distance D2 from the nose tip to the line connecting the shoulders are used to calculate the ratio
r = D2 / D1
if r is smaller than the threshold Tr, the operator is considered to be in the head-lowering state, otherwise in the non-head-lowering state; a state transition detected in the video stream from a lowered head in one frame to a raised head in the next frame is regarded as one nod; to comprehensively judge whether frequent nodding behavior of the operator occurs, the number of nods k within a specified time T2 is counted separately; if k exceeds the threshold K, i.e., k > K, frequent nodding behavior of the operator is judged to have occurred.
10. The method for detecting abnormal behavior of an operator in a crane cab as claimed in claim 2, wherein: regarding the mobile phone playing or call making/receiving behavior judgment in step S3, the detected mobile phone target position is analyzed together with the body posture information and face information; if a mobile phone is detected in the current video frame, the distance d between the mobile phone and the operator's hand is calculated, and if d is smaller than the threshold Td, the operator is judged to be holding the mobile phone; the positional relationship between the hand holding the phone and the operator's face is then judged: if the hand is below the face, the operator is considered to be looking at the mobile phone, and if it is beside the face and above the shoulder, the operator is considered to be making or receiving a call; to comprehensively judge whether the operator is playing with the phone or making/receiving a call, the duration t4 of the phone-watching state is counted, and if t4 is greater than the set threshold τ4, i.e., t4 > τ4, mobile phone playing behavior of the operator is judged to have occurred; the duration t5 of the call state is counted, and if t5 is greater than the set threshold τ5, i.e., t5 > τ5, call making/receiving behavior of the operator is judged to have occurred.
CN202211488878.5A 2022-11-25 2022-11-25 Method for detecting abnormal behaviors of operators in crane cab Pending CN115937829A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211488878.5A CN115937829A (en) 2022-11-25 2022-11-25 Method for detecting abnormal behaviors of operators in crane cab

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211488878.5A CN115937829A (en) 2022-11-25 2022-11-25 Method for detecting abnormal behaviors of operators in crane cab

Publications (1)

Publication Number Publication Date
CN115937829A (en) 2023-04-07

Family

ID=86549918

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211488878.5A Pending CN115937829A (en) 2022-11-25 2022-11-25 Method for detecting abnormal behaviors of operators in crane cab

Country Status (1)

Country Link
CN (1) CN115937829A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116392086A (en) * 2023-06-06 2023-07-07 浙江多模医疗科技有限公司 Method, system, terminal and storage medium for detecting stimulus
CN116392086B (en) * 2023-06-06 2023-08-25 浙江多模医疗科技有限公司 Method, terminal and storage medium for detecting stimulation

Similar Documents

Publication Publication Date Title
CN110210323B (en) Drowning behavior online identification method based on machine vision
CN104616438B (en) A kind of motion detection method of yawning for fatigue driving detection
CN107133564B (en) Tooling cap detection method
CN111027478A (en) Driver and passenger behavior analysis and early warning system based on deep learning
CN112036299A (en) Examination cheating behavior detection method and system under standard examination room environment
CN110738135A (en) worker work step specification visual identification judgment and guidance method and system
CN112016409A (en) Deep learning-based process step specification visual identification determination method and system
CN110705500A (en) Attention detection method and system for personnel working image based on deep learning
CN105844245A (en) Fake face detecting method and system for realizing same
CN109620184A (en) Mobile phone-wearable device integral type human body burst injury real-time monitoring alarming method
CN115937829A (en) Method for detecting abnormal behaviors of operators in crane cab
CN111616718A (en) Method and system for detecting fatigue state of driver based on attitude characteristics
CN111325133A (en) Image processing system based on artificial intelligence recognition
CN111985328A (en) Unsafe driving behavior detection and early warning method based on facial feature analysis
CN112528843A (en) Motor vehicle driver fatigue detection method fusing facial features
CN105989329A (en) Method and device for detecting use of handheld device by person
CN108133573A (en) Drowsy driving warning system
CN115223249A (en) Quick analysis and identification method for unsafe behaviors of underground personnel based on machine vision
CN115797856A (en) Intelligent construction scene safety monitoring method based on machine vision
CN108108651B (en) Method and system for detecting driver non-attentive driving based on video face analysis
CN112528767A (en) Machine vision-based construction machinery operator fatigue operation detection system and method
CN105989328A (en) Method and device for detecting use of handheld device by person
CN112784684A (en) Intelligent on-duty analysis and evaluation method and system thereof
CN114973214A (en) Unsafe driving behavior identification method based on face characteristic points
CN112307920B (en) High-risk worker behavior early warning device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination