CN109919134B - Vision-based method for detecting abnormal behaviors of operating vehicle personnel - Google Patents
- Publication number: CN109919134B
- Application number: CN201910229246.9A
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention provides a vision-based method for detecting abnormal behavior of personnel in operating vehicles, belonging to the field of visual recognition. The method first acquires comprehensive image information of the driver and surrounding persons in the vehicle; then extracts 2D images and labels the skeleton nodes of all persons in the images using the deep learning algorithm in the OpenPose library; next, defines the interfering and interfered persons according to the driving state of the vehicle and calculates a danger distance threshold for each key limb part of the interfered person; then computes the spatial distance between the wrist node of the interfering person and the key limb parts of the interfered person based on the binocular ranging principle; and finally detects abnormal behavior of the interfering person by judging how long the spatial distance stays within the danger distance threshold. The method is suitable for detecting abnormal behavior of personnel in operating vehicles, provides a key basis for warning and timely alarming, and is of great significance for safeguarding the life and property of vehicle occupants.
Description
Technical Field
The invention belongs to the technical field of visual recognition, and particularly relates to a vision-based method for detecting abnormal behavior of operating-vehicle personnel.
Background
With the progress of Internet technology, China's ride-hailing industry — chiefly premium rides, express rides, and unmanned ticketed buses — has developed rapidly and brought great convenience. However, owing to the lack of a strict regulatory system, there are increasing cases of serious injury and death inside vehicles caused by passengers interfering with the driver's normal driving, or by drivers assaulting passengers. A vision-based method for detecting abnormal behavior of operating-vehicle personnel is therefore sought: one that detects the postures of the people in the vehicle in real time, automatically identifies abnormal behavior, provides a key basis for warning and timely alarming, and is of great significance for safeguarding the life and property of vehicle occupants.
In actual operation, to safeguard their safety, passengers commonly share their position with friends in real time through software so that they can be rescued promptly if assaulted. However, this approach is highly subjective and cannot guarantee that every passenger receives an immediate warning when assaulted. To protect the driver, on the other hand, a protective device with an ejection function is usually installed around the driver's seat. This approach still depends on the driver's subjective judgment of the specific situation and cannot provide timely, effective early warning of passengers' abnormal behavior. A vision-based method for detecting abnormal behavior of operating-vehicle personnel is therefore urgently needed.
Research shows that vision-based abnormal-behavior detection for operating-vehicle personnel must meet basic requirements — real-time monitoring of occupant posture, accurate calculation of the relative positions of persons, and timely judgment of abnormal behavior — which poses a considerable engineering challenge. Acquiring comprehensive image information of vehicle occupants through distributed cameras, extracting occupant postures in real time with the OpenPose library, obtaining the relative spatial positions of occupants by combining the binocular ranging principle, and judging abnormal behavior from those positions together make vision-based detection of abnormal behavior in operating vehicles feasible.
A real-time detection method for dangerous driving behavior based on deep learning was disclosed in 2016 by Kang Yu of the University of Science and Technology of China in patent CN201611267904.6; it acquires driver information with an image acquisition system and judges abnormal driver behaviors such as smoking and holding a phone through a deep learning method. In 2016, Xie Zhonghua of a Chengdu remote-control technology company disclosed a fatigue detection method and system based on an intelligent video algorithm in patent CN201610264968.4; it captures images of the driver and judges fatigue from the degree of opening of the driver's mouth and eyes. However, neither method addresses the detection of abnormal behavior between the driver and the passengers.
Disclosure of Invention
The invention aims to overcome the defects of existing methods and, addressing the problem of detecting abnormal behavior of operating-vehicle personnel, provides a vision-based detection method. The method extracts the skeleton-node sequences of passengers and driver with the deep learning algorithm in the OpenPose library, providing a positional basis for judging occupant posture; obtains the spatial distance between key limb parts of occupants based on the binocular ranging principle, laying the foundation for judging abnormal behavior; and finally detects abnormal behavior by calculating how long the distance between key limb parts remains below a danger threshold.
The method uses distributed cameras to collect image information of the driver and surrounding persons in the operating vehicle; judges the positions of interfering and interfered persons in combination with the vehicle's running state; extracts the skeleton-node sequences of occupants with the deep learning algorithm in the OpenPose library; calculates the spatial distance between the wrist node of the interfering person and the key limb parts of the interfered person based on the binocular ranging principle; and, by setting a danger threshold and timing how long the spatial distance stays within it, completes the detection of abnormal behavior.
The technical scheme of the invention is as follows:
A vision-based method for detecting abnormal behavior of operating-vehicle personnel: first, cameras are installed at the top of the operating vehicle to acquire comprehensive image information of the driver and surrounding persons; then 2D images are extracted and the skeleton nodes of all persons in the images are labeled with the deep learning algorithm in the OpenPose library; next, the interfering and interfered persons are defined in combination with the vehicle's running state, and a danger distance threshold is calculated for each key limb part of the interfered person; then the spatial distance between the wrist node of the interfering person and the key limb parts of the interfered person is calculated based on the binocular ranging principle; finally, abnormal behavior of the interfering person is detected by judging how long the spatial distance stays within the danger distance threshold. The specific steps are as follows:
first, the camera is installed and calibrated
First, two cameras i are arranged at the rearview mirror of the operating vehicle 21 and numbered i = 1, 2. The shooting angles of the two cameras are adjusted so that image information covering the driving seat 20 and its surrounding area can be acquired directly. The scan rate of each camera must exceed 5 frames/second.
Then the two cameras are combined into a binocular vision camera I. The two cameras are calibrated independently using the opencv database, and binocular (stereo) calibration of camera I is likewise performed with the opencv database.
Second, real-time extraction of the pose of the person in the operating vehicle 21
First, the persons in the collected images are numbered according to the seat distribution in the operating vehicle 21. With m persons in the vehicle, the person in the driver's seat is numbered 1 and the remaining persons are numbered 2 to m in sequence. The number assigned to the same person must be consistent across cameras. The person numbered j is denoted person j.
Then the skeleton nodes of all persons in the collected images are marked using the deep learning algorithm in the OpenPose library. The skeleton nodes k (k = 0, 1, …, 17) of person j (j = 1, 2, …, m) collected by the i-th camera (i = 1, 2) form the skeleton node set S_j^i = {s_{j,k}^i}, where s_{j,k}^i = (u_{j,k}^i, v_{j,k}^i) is the coordinate of skeleton node k of person j, acquired by the i-th camera, in the image coordinate system.
Finally, from the computed skeleton node set S_j^i of the occupants acquired by the i-th camera, the coordinates (x_{j,k}, y_{j,k}, z_{j,k}) of skeleton node k (k = 0, 1, …, 17) of person j (j = 1, 2, …, m) in the in-vehicle spatial coordinate system of the i-th camera are obtained using the binocular-vision spatial coordinate calculation method in the opencv database.
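The binocular-vision spatial coordinate step can be sketched without the opencv database itself. Below is a minimal NumPy implementation of the linear (DLT) triangulation that underlies binocular coordinate calculation; the projection matrices, intrinsics, and test point are illustrative assumptions, not values from the patent.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (from stereo calibration).
    uv1, uv2: pixel coordinates (u, v) of the same skeleton node in
    camera 1 and camera 2.  Returns the 3D point (x, y, z).
    """
    A = np.array([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # Solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Illustrative rectified stereo pair: identical intrinsics, 100 mm baseline.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0], [0]])])

point = np.array([50.0, 20.0, 1000.0])       # ground-truth node position, mm
h = np.append(point, 1.0)
uv1 = (P1 @ h)[:2] / (P1 @ h)[2]             # project into camera 1
uv2 = (P2 @ h)[:2] / (P2 @ h)[2]             # project into camera 2

recovered = triangulate(P1, P2, uv1, uv2)
print(np.round(recovered, 3))
```

With noise-free synthetic projections, the recovered point matches the ground truth; in practice the pixel coordinates come from the OpenPose skeleton nodes in each view.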
Third, calculation of dangerous distance threshold
First, the number of an interfering person in the vehicle is denoted p and the number of an interfered person q. Based on the ECU protocol, the running speed V of the vehicle is read through the OBD interface of the operating vehicle 21. When the operating vehicle 21 is running, i.e. |V| > 0, the driver is defined as the interfered person q, i.e. q = 1, and the remaining persons are interfering persons p, i.e. p ≠ 1. When the vehicle is not running, i.e. V = 0, either the driver is the interfered person (q = 1) and the remaining persons are interfering persons (p ≠ 1), or the driver is the interfering person (p = 1) and the remaining persons are interfered persons (q ≠ 1).
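The role assignment above reduces to a simple rule on the OBD speed reading. A minimal sketch (the function and variable names are our own, not from the patent):

```python
def assign_roles(speed, n_persons):
    """Return a list of (interfering, interfered) person-number hypotheses.

    Person 1 is the driver.  While the vehicle moves (|speed| > 0), the
    driver is the interfered person and everyone else a potential
    interferer; when it is stopped, both directions must be checked.
    """
    others = set(range(2, n_persons + 1))
    if abs(speed) > 0:
        return [(others, {1})]            # passengers may interfere with driver
    # Vehicle stopped: the driver may also be the interferer.
    return [(others, {1}), ({1}, others)]

print(assign_roles(30, 2))   # moving: passengers vs. driver only
print(assign_roles(0, 2))    # stopped: both hypotheses
```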
Then a danger distance threshold is calculated for each key limb part of interfered person q. The key limb parts comprise the head, body, and arms. The skeleton nodes involved are: nasal bone node 0, neck bone node 1, right shoulder bone node 2, right elbow bone node 3, right wrist bone node 4, left shoulder bone node 5, left elbow bone node 6, left wrist bone node 7, right hip bone node 8, right knee bone node 9, right ankle bone node 10, left hip bone node 11, left knee bone node 12, left ankle bone node 13, right eye node 14, left eye node 15, right temple bone node 16, and left temple bone node 17.
the q arm of the person to be interfered is set to have the connecting line of the wrist bone node, the elbow bone node, the shoulder bone node k and k-1 (k =3,4,6,7) as the axis and the diameter c q A cylindrical body of (a); the body takes a nasal bone node 0 and a cervical bone node 1 as axes and has a shoulder width b q Is a cylinder with a diameter; the head takes a nasal bone node 0 and a neck bone node 1 as axes, and the distance a between temples q Is a cylinder of diameter.
The arm diameter c_q is set manually according to the actual situation;
Shoulder width b_q is the distance between the coordinates (x_{q,2}, y_{q,2}, z_{q,2}) of right shoulder bone node 2 and (x_{q,5}, y_{q,5}, z_{q,5}) of left shoulder bone node 5 of interfered person q in the in-vehicle spatial coordinate system of the i-th camera:

b_q = √[(x_{q,2} − x_{q,5})² + (y_{q,2} − y_{q,5})² + (z_{q,2} − z_{q,5})²]
Head width a_q of interfered person q is the distance between the coordinates (x_{q,16}, y_{q,16}, z_{q,16}) of right temple bone node 16 and (x_{q,17}, y_{q,17}, z_{q,17}) of left temple bone node 17 in the in-vehicle spatial coordinate system of the i-th camera:

a_q = √[(x_{q,16} − x_{q,17})² + (y_{q,16} − y_{q,17})² + (z_{q,16} − z_{q,17})²]
Finally, the danger distance thresholds of the head, body, and arms of interfered person q are set as:

A_q = a_q/2 + 30,  B_q = b_q/2 + 50,  C_q = c_q/2 + 50

where A_q, B_q, and C_q are the danger distance thresholds of the head, body, and arms of interfered person q, in mm.
Fourthly, judging abnormal behaviors of the personnel
The judgment of abnormal behavior comprises: calculating the actual distances between the wrist position of interfering person p and the head, body, and arms of interfered person q, and determining whether interfering person p exhibits abnormal behavior.
The actual distance E_q between the wrist position of interfering person p and the arm of interfered person q is: the distance d_c between the coordinates of right wrist bone node 4 and left wrist bone node 7 of interfering person p in the in-vehicle spatial coordinate system of the i-th camera and the line connecting adjacent skeleton nodes k−1 and k (k = 3, 4, 6, 7) on the arm of interfered person q, where (x, y, z) denotes any point on the line between the adjacent arm skeleton nodes of interfered person q.
the head and the body of the interfered person q are set to be in the same axis. Implementing an actual distance D from the wrist position of the interfering person p to the head of the interfered person q q Actual distance from body F q The method comprises the following steps: coordinates of right wrist skeleton node 4 and left wrist skeleton node 7 of implementation interference person p in space coordinate system in ith camera middle vehicleThe node of the head and the neck of the person q to be interfered>Distance between connecting linesFrom d ab . Wherein, the interfered person q is at the bone node of the head and neckThe connecting line between the two lines is as follows,
wherein (x) 1 ,y 1 ,z 1 ) Representing the bone nodes at the head and neck of an interfered person qAny point on the inter-connecting line;
the formula for judging the wrist position of the interfering person p to be in the head or body of the interfered person q is as follows,
when the position of the wrist of the person implementing the interference p is positioned at the head of the person q to be interfered, delta>0; when the position of the wrist of the person implementing the interference p is at the body of the person q to be interfered, delta<0. Implementing the actual distance D between the wrist position of the interfering person p and the head of the interfered person q q Or the actual distance F from the body of the person q to be disturbed q In order to realize the purpose of the method,
Finally, whether an occupant exhibits abnormal behavior is judged. When the actual distances E_q and F_q from the wrist of interfering person p to the arm and body of interfered person q stay continuously below the danger distances C_q and B_q for more than 10 s, interfering person p is judged to exhibit abnormal behavior. When the actual distance D_q from the wrist of interfering person p to the head of interfered person q stays below the danger distance A_q for a cumulative time exceeding 1 s within 5 s, interfering person p is judged to exhibit abnormal behavior.
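The two timing rules can be implemented as per-frame counters at the camera scan rate. A sketch (the frame rate and class name are our assumptions): the arm/body rule requires the distance to stay below threshold continuously for more than 10 s, and the head rule a cumulative 1 s within a sliding 5 s window.

```python
from collections import deque

class AbnormalBehaviorTimer:
    """Per-frame evaluation of the two timing rules.

    fps: camera scan rate in frames/second (the patent requires > 5).
    Arm/body rule: distance below threshold continuously for > 10 s.
    Head rule: distance below threshold cumulatively > 1 s within 5 s.
    """
    def __init__(self, fps):
        self.fps = fps
        self.continuous = 0                       # consecutive hit frames (arm/body)
        self.head_window = deque(maxlen=5 * fps)  # sliding 5 s window of head hits

    def update(self, body_hit, head_hit):
        """Feed one frame; returns True when abnormal behavior is detected."""
        self.continuous = self.continuous + 1 if body_hit else 0
        self.head_window.append(1 if head_hit else 0)
        body_abnormal = self.continuous > 10 * self.fps
        head_abnormal = sum(self.head_window) > 1 * self.fps
        return body_abnormal or head_abnormal

timer = AbnormalBehaviorTimer(fps=20)
alarm = False
for _ in range(25):                # 1.25 s of head contact at 20 fps
    alarm = timer.update(body_hit=False, head_hit=True)
print(alarm)   # True
```

In a deployment, `body_hit`/`head_hit` would come from comparing each frame's point-to-segment distances against the thresholds B_q, C_q, and A_q.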
The beneficial effects of the invention: the method is suitable for detecting abnormal behavior of personnel in operating vehicles, provides a key basis for warning and timely alarming, and is of great significance for safeguarding the life and property of vehicle occupants.
Drawings
Fig. 1 is a schematic view of a mounting position of a camera in a vehicle.
Fig. 2 is a sequence number diagram of each skeleton node of a person.
Fig. 3 is an image captured within a service vehicle.
In the figures: 0 nasal bone node; 1 neck bone node; 2 right shoulder bone node; 3 right elbow bone node; 4 right wrist bone node; 5 left shoulder bone node; 6 left elbow bone node; 7 left wrist bone node; 8 right hip bone node; 9 right knee bone node; 10 right ankle bone node; 11 left hip bone node; 12 left knee bone node; 13 left ankle bone node; 14 right eye node; 15 left eye node; 16 right temple bone node; 17 left temple bone node; 18 first camera; 19 second camera; 20 driving seat; 21 operating vehicle; 22 person No. 1; 23 person No. 2.
FIG. 4 is a flow chart of the present invention.
Detailed Description
The embodiments of the invention are described in detail below with reference to the accompanying drawings and the technical scheme.
First, the camera is installed and calibrated
First, 2 cameras are arranged on either side of the front part of the roof of the operating vehicle 21. The first camera 18 and the second camera 19 are disposed at the vehicle rearview mirror; the installation positions of the cameras in the vehicle are shown in fig. 1. The shooting angles of the first camera 18 and the second camera 19 are adjusted so that image information covering the driving seat 20 and its surroundings can be acquired directly. Both cameras are manufactured by BASLER, use CMOS chips, and scan at 20 frames/second.
Then, the first camera 18 and the second camera 19 are combined into a binocular vision camera I. And the first camera 18 and the second camera 19 are calibrated independently by using an opencv database. Similarly, binocular calibration is performed on the binocular vision camera I by using the opencv database.
Second, the attitude of the person in the operating vehicle 21 is extracted in real time
First, the persons in the collected images are numbered according to the seat distribution in the vehicle. The image captured inside the operating vehicle 21 is shown in fig. 3. There are 2 persons in the vehicle: the person in the driver's seat is person No. 1 (22), and the other is person No. 2 (23). The number assigned to the same person must be consistent across cameras. The person numbered j is denoted person j.
Then the skeleton nodes of person No. 1 (22) and person No. 2 (23) in the collected images are marked using the deep learning algorithm in the OpenPose library. The skeleton nodes of person No. 1 (22) acquired by the first camera 18 form one skeleton node set, and the skeleton nodes of person No. 2 (23) acquired by the second camera 19 form another. The serial numbers of the skeleton nodes of person No. 1 (22) and person No. 2 (23) are shown in fig. 2. Finally, from the skeleton node sets of person No. 1 (22) and person No. 2 (23) collected by the first camera 18 and the second camera 19, the coordinates of the skeleton nodes of person No. 1 (22) and of person No. 2 (23) in the in-vehicle spatial coordinate system of the first camera 18 are obtained with the binocular-vision spatial coordinate calculation method in the opencv database.
third, calculation of dangerous distance threshold
First, based on the ECU protocol, the running speed V = 30 of the vehicle is read through the vehicle's OBD interface. Since |V| > 0, the driver is defined as the interfered person q, i.e. q = 1, and the other person is the interfering person p, i.e. p = 2.
Then the danger distance thresholds of the key limb parts of interfered person q are calculated. The arm diameter is set to c_q = 20 mm. Shoulder width b_q, the distance between the coordinates of right shoulder bone node 2 and left shoulder bone node 5 of person q in the in-vehicle spatial coordinate system of the first camera 18, is b_q = 370.6 mm. Head width a_q, the distance between the coordinates of right temple bone node 16 and left temple bone node 17 of person q in the first camera 18, is a_q = 163.9 mm. Finally, the danger distance thresholds (in mm) A_q, B_q, C_q of the head, body, and arms of interfered person q are: A_q = 163.9/2 + 30 = 111.95 mm, B_q = 370.6/2 + 50 = 235.3 mm, C_q = 20/2 + 50 = 60 mm.
Fourthly, judging abnormal behaviors of the personnel
The actual distance E_q between the wrist position of interfering person p and the arm of interfered person q is the distance d_c = 19.37 mm between the coordinates of left wrist bone node 7 of person p in the in-vehicle spatial coordinate system of the first camera 18 and the line connecting adjacent skeleton nodes on the arm of person q.
The head and body of interfered person q are taken to share the same axis. The actual distance D_q from the wrist of interfering person p to the head of person q, and the actual distance F_q to the body, are both given by the distance d_ab = 245.5 mm between the coordinates of left wrist bone node 7 of person p in the in-vehicle spatial coordinate system of the first camera 18 and the line connecting the head–neck skeleton nodes of person q. The discriminant for whether the wrist of person p lies at the head or body of person q gives Δ = −16930 < 0, so the wrist lies at the body and F_q = 245.5 mm.
Finally, the actual distance F_q = 245.5 mm between the wrist of person p and the body of person q is greater than the danger distance B_q = 235.3 mm. The actual distance E_q = 19.37 mm between the wrist of person p and the arm of person q is less than the danger distance C_q = 60 mm, and this condition persists for more than 10 s; therefore interfering person p is judged to exhibit abnormal behavior.
The vision-based method described above for detecting abnormal behavior of personnel in the operating vehicle 21 is only a preferred embodiment of the invention; all equivalent changes or modifications made according to the features and principles described in the scope of the patent application are included within that scope.
Claims (1)
1. A vision-based method for detecting abnormal behaviors of operating vehicle personnel is characterized by comprising the following steps:
first, the camera is installed and calibrated
First, two cameras i are arranged at the rearview mirror of the operating vehicle (21) and numbered i = 1, 2; the shooting angles of the two cameras are adjusted so that image information covering the driving seat (20) and its surrounding area can be acquired directly; the scan rate of each camera must exceed 5 frames/second;
then, forming a binocular vision camera I by the two cameras; the opencv database is utilized to independently calibrate the two cameras, and meanwhile, the opencv database is utilized to perform binocular calibration on the binocular vision camera I;
second, the real-time extraction of the posture of the person in the operating vehicle (21)
Firstly, numbering the personnel in the vehicle in the collected image according to the distribution condition of seats in the operating vehicle (21); setting m persons in the vehicle, marking the number of the person on the driving seat as 1, and sequentially numbering the rest persons as 2-m; the numbers of the same person collected by different cameras need to be consistent; setting the person with the serial number j as the person j;
then, marking and collecting skeleton nodes of all the persons in the image by using a deep learning algorithm in an openposition database; collecting a bone node k of a person j in an ith camera, wherein i =1,2; j =1,2, …, m; k =0,2, …,17; form a person j skeleton node setWherein it is present>For the coordinate of the bone node k of the person j acquired in the ith camera in the image coordinate system->
Finally, from the computed skeleton node set S_j^i of the occupants acquired by the i-th camera, the coordinates (x_{j,k}, y_{j,k}, z_{j,k}) of skeleton node k of person j in the in-vehicle spatial coordinate system of the i-th camera are obtained using the binocular-vision spatial coordinate calculation method in the opencv database;
Third, calculation of a dangerous distance threshold
Firstly, setting the number of an interfering person in the vehicle as p and the number of an interfered person as q; extracting a running speed V of the vehicle through an OBD interface of the operating vehicle (21) based on an ECU protocol; when the operating vehicle (21) is running, namely | V | >0, defining the driver as an interfered person q, namely q =1; the other personnel are implementing interference personnel p, namely p is not equal to 1; when the vehicle is not driven, namely V =0, defining that a driver is an interfered person q =1, and the rest persons are interfering persons p ≠ 1, or the driver is an interfering person p =1 and the rest persons are interfered persons q ≠ 1;
then, calculating a dangerous distance threshold value of each key limb part of the interfered person q; the key limb parts comprise the head, the body and the arms; each node in the key limb part comprises a nasal skeleton node (0), a neck skeleton node (1), a right shoulder skeleton node (2), a right elbow skeleton node (3), a right wrist skeleton node (4), a left shoulder skeleton node (5), a left elbow skeleton node (6), a left wrist skeleton node (7), a right cross-section skeleton node (8), a right knee skeleton node (9), a right ankle skeleton node (10), a left cross-section skeleton node (11), a left knee skeleton node (12), a left ankle skeleton node (13), a right eye node (14), a left eye node (15), a right temple skeleton node (16) and a left temple skeleton node (17);
setting the q arm of the interfered person as an axis by using a connecting line of a wrist bone node, an elbow bone node and a shoulder bone node k and k-1, wherein k =3,4,6,7; diameter c q A cylindrical body of (a); the body takes a nasal bone node (0) and a neck bone node (1) as axes and has a shoulder width b q Is a cylinder with a diameter; the head takes a nasal bone node (0) and a neck bone node (1) as axes, and the distance a between temples q Is a cylinder with a diameter;
The arm diameter c_q is set manually according to the actual situation;
Shoulder width b_q is: the distance between the coordinates (x_{q,2}, y_{q,2}, z_{q,2}) of right shoulder bone node (2) and (x_{q,5}, y_{q,5}, z_{q,5}) of left shoulder bone node (5) of interfered person q in the in-vehicle spatial coordinate system of the i-th camera:

b_q = √[(x_{q,2} − x_{q,5})² + (y_{q,2} − y_{q,5})² + (z_{q,2} − z_{q,5})²];
Head width a_q of interfered person q is: the distance between the coordinates (x_{q,16}, y_{q,16}, z_{q,16}) of right temple bone node (16) and (x_{q,17}, y_{q,17}, z_{q,17}) of left temple bone node (17) in the in-vehicle spatial coordinate system of the i-th camera:

a_q = √[(x_{q,16} − x_{q,17})² + (y_{q,16} − y_{q,17})² + (z_{q,16} − z_{q,17})²];
Finally, the danger distance thresholds of the head, body, and arms of interfered person q are set as:

A_q = a_q/2 + 30,  B_q = b_q/2 + 50,  C_q = c_q/2 + 50

where A_q, B_q, and C_q are the danger distance thresholds of the head, body, and arms of interfered person q, in mm;
fourthly, judging abnormal behaviors of the personnel
The judgment of the abnormal behavior of the personnel comprises the following steps: calculating the actual distance between the wrist position of the interfering person p and the head, body and arm of the interfered person q, and judging whether the interfering person p has abnormal behavior;
The actual distance E_q between the wrist position of interfering person p and the arm of interfered person q is: the distance d_c between the coordinates of right wrist bone node (4) and left wrist bone node (7) of interfering person p in the in-vehicle spatial coordinate system of the i-th camera and the line connecting adjacent skeleton nodes on the arm of interfered person q, where (x, y, z) denotes any point on the line between the adjacent arm skeleton nodes of interfered person q;
The head and body of the interfered person q are taken to lie on the same axis. The actual distance D_q between the wrist position of the interfering person p and the head of q, and the actual distance F_q between the wrist position and the body of q, are given by the distance d_ab from the coordinates of the right wrist skeleton node (4) or the left wrist skeleton node (7) of p, in the in-vehicle space coordinate system of the i-th camera, to the line connecting the nasal skeleton node (0) and the neck skeleton node (1) of q. Writing (X_0, Y_0, Z_0) and (X_1, Y_1, Z_1) for the coordinates of nodes (0) and (1), that line is

(x_1 − X_0) / (X_1 − X_0) = (y_1 − Y_0) / (Y_1 − Y_0) = (z_1 − Z_0) / (Z_1 − Z_0)

where (x_1, y_1, z_1) denotes any point on the line connecting the head and neck skeleton nodes of the interfered person q.
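Both d_c and d_ab are point-to-line distances in 3D; a standard way to compute them is ‖(P − A) × (B − A)‖ / ‖B − A‖ for a point P and a line through A and B. A sketch with made-up coordinates:

```python
import math

def cross(u, v):
    """Cross product of two 3D vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def norm(u):
    return math.sqrt(sum(c * c for c in u))

def point_to_line(p, a, b):
    """Distance from point p to the infinite line through a and b."""
    ab = tuple(bi - ai for ai, bi in zip(a, b))
    ap = tuple(pi - ai for ai, pi in zip(a, p))
    return norm(cross(ap, ab)) / norm(ab)

# Wrist of the interfering person p vs. an elbow-shoulder segment of q.
wrist = (100.0, 0.0, 50.0)
elbow, shoulder = (0.0, 0.0, 0.0), (0.0, 0.0, 200.0)
d_c = point_to_line(wrist, elbow, shoulder)  # line runs along z, so d_c = 100.0
```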
Whether the wrist position of the interfering person p lies at the head or at the body of the interfered person q is judged by the sign of a discriminant Δ, taken as the signed projection of the wrist position onto the shared nose–neck axis relative to the neck node (1):

Δ = (P_w − P_1) · (P_0 − P_1)

where P_w is the wrist position and P_0, P_1 are the positions of the nasal node (0) and the neck node (1). When Δ > 0, the wrist position of p is at the head of q; when Δ < 0, it is at the body of q. Accordingly, the distance d_ab is interpreted as the head distance D_q when Δ > 0 and as the body distance F_q when Δ < 0.
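Assuming Δ is the signed projection of the wrist onto the shared nose–neck axis (an interpretation, since the patent's exact formula is not reproduced here), the head-versus-body test can be sketched as:

```python
def head_or_body(wrist, nose, neck):
    """Return 'head' if the wrist projects onto the head side of the
    neck node along the nose-neck axis (delta > 0), else 'body'."""
    axis = tuple(n - k for k, n in zip(neck, nose))   # neck -> nose direction
    rel = tuple(w - k for k, w in zip(neck, wrist))   # neck -> wrist vector
    delta = sum(a * r for a, r in zip(axis, rel))     # dot product
    return "head" if delta > 0 else "body"

# Nose above neck: a wrist above the neck node counts as a head contact.
nose, neck = (0.0, 0.0, 1650.0), (0.0, 0.0, 1500.0)
```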
Finally, whether a person in the vehicle exhibits abnormal behavior is judged as follows: if the actual distances E_q and F_q between the wrist position of the interfering person p and the arm and body of the interfered person q remain continuously below the danger distance thresholds C_q and B_q for more than 10 s, the interfering person p is judged to exhibit abnormal behavior; if the actual distance D_q between the wrist position of p and the head of q is below the danger distance threshold A_q for a cumulative time exceeding 1 s within a 5 s window, p is likewise judged to exhibit abnormal behavior.
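The two temporal rules, continuous violation for 10 s (arm/body) and cumulative violation of 1 s within a 5 s window (head), can be sketched over per-frame distance samples; the frame rate and helper names are assumptions, not part of the patent:

```python
FPS = 25  # assumed camera frame rate

def continuous_violation(distances, threshold, seconds=10):
    """True if the last `seconds` worth of samples are all below threshold."""
    n = seconds * FPS
    recent = distances[-n:]
    return len(recent) == n and all(d < threshold for d in recent)

def cumulative_violation(distances, threshold, window=5, total=1):
    """True if, within the last `window` seconds, the distance was below
    threshold for at least `total` seconds in total."""
    recent = distances[-window * FPS:]
    return sum(1 for d in recent if d < threshold) >= total * FPS

# E_q below C_q for the full 10 s -> abnormal behavior.
E_q_samples = [30.0] * (10 * FPS)
abnormal_arm = continuous_violation(E_q_samples, threshold=45.0)

# D_q below A_q for 1.2 s out of the last 5 s -> abnormal behavior.
D_q_samples = [200.0] * (4 * FPS) + [40.0] * int(1.2 * FPS)
abnormal_head = cumulative_violation(D_q_samples, threshold=70.0)
```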
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910229246.9A CN109919134B (en) | 2019-03-25 | 2019-03-25 | Vision-based method for detecting abnormal behaviors of operating vehicle personnel |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109919134A CN109919134A (en) | 2019-06-21 |
CN109919134B true CN109919134B (en) | 2023-04-18 |
Family
ID=66966749
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910229246.9A Active CN109919134B (en) | 2019-03-25 | 2019-03-25 | Vision-based method for detecting abnormal behaviors of operating vehicle personnel |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109919134B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110751100A (en) * | 2019-10-22 | 2020-02-04 | 北京理工大学 | Auxiliary training method and system for stadium |
CN112434564B (en) * | 2020-11-04 | 2023-06-27 | 北方工业大学 | Detection system for abnormal aggregation behavior in bus |
CN112686090B (en) * | 2020-11-04 | 2024-02-06 | 北方工业大学 | Intelligent monitoring system for abnormal behavior in bus |
CN116363632A (en) * | 2021-12-23 | 2023-06-30 | 比亚迪股份有限公司 | Method and device for monitoring abnormal behavior in vehicle and computer storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104442566A (en) * | 2014-11-13 | 2015-03-25 | 长安大学 | Vehicle inside passenger dangerous state alarming device and alarming method |
CN105551182A (en) * | 2015-11-26 | 2016-05-04 | 吉林大学 | Driving state monitoring system based on Kinect human body posture recognition |
JP2016200910A (en) * | 2015-04-08 | 2016-12-01 | 日野自動車株式会社 | Driver state determination device |
CN107665326A (en) * | 2016-07-29 | 2018-02-06 | 奥的斯电梯公司 | Monitoring system, passenger transporter and its monitoring method of passenger transporter |
CN108446600A (en) * | 2018-02-27 | 2018-08-24 | 上海汽车集团股份有限公司 | A kind of vehicle driver's fatigue monitoring early warning system and method |
CN108986400A (en) * | 2018-09-03 | 2018-12-11 | 深圳市尼欧科技有限公司 | A kind of third party based on image procossing, which multiplies, drives safety automatic-alarming method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109919134B (en) | Vision-based method for detecting abnormal behaviors of operating vehicle personnel | |
CN208344074U (en) | A kind of comprehensive DAS (Driver Assistant System) of the automobile based on machine vision | |
US10369926B2 (en) | Driver state sensing system, driver state sensing method, and vehicle including the same | |
CN104442566B (en) | A kind of passenger's precarious position warning device and alarm method | |
US8593519B2 (en) | Field watch apparatus | |
CN107697069A (en) | Fatigue of automobile driver driving intelligent control method | |
JP2020114377A (en) | System and method detecting problematic health situation | |
US11514688B2 (en) | Drowsiness detection system | |
US20180012090A1 (en) | Visual learning system and method for determining a driver's state | |
CN110826369A (en) | Driver attention detection method and system during driving | |
CN106184220B (en) | Abnormal driving detection method in a kind of track based on vehicle location track | |
CN111645694B (en) | Driver driving state monitoring system and method based on attitude estimation | |
CN104616438A (en) | Yawning action detection method for detecting fatigue driving | |
CN102555982A (en) | Safety belt wearing identification method and device based on machine vision | |
CN105252973B (en) | For the temperature monitoring method of automobile, device and equipment | |
CN212484555U (en) | Fatigue driving multi-source information detection system | |
CN103700220A (en) | Fatigue driving monitoring device | |
Pech et al. | Head tracking based glance area estimation for driver behaviour modelling during lane change execution | |
CN106650635A (en) | Method and system for detecting rearview mirror viewing behavior of driver | |
CN107571735A (en) | A kind of vehicle drivers status monitoring system and monitoring method | |
CN114005088A (en) | Safety rope wearing state monitoring method and system | |
CN107170190B (en) | A kind of dangerous driving warning system | |
CN108304745A (en) | A kind of driver's driving behavior detection method, device | |
CN109383516A (en) | A kind of anomaly analysis behavioral value system based on user behavior analysis | |
CN108256487A (en) | A kind of driving state detection device and method based on reversed binocular |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||