CN113705542A - Pedestrian behavior state identification method and system

Info

Publication number: CN113705542A
Application number: CN202111251133.2A
Authority: CN (China)
Prior art keywords: pedestrian, skeleton, node data, behavior state, features
Legal status: Pending (the status listed is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 吕超, 张哲雨, 肖峣, 龚建伟, 臧政
Current assignee: Beilihuidong Beijing Education Technology Co ltd; Beijing Institute of Technology BIT
Original assignee: Beilihuidong Beijing Education Technology Co ltd; Beijing Institute of Technology BIT
Filing date: 2021-10-27
Priority date: 2021-10-27
Publication date: 2021-11-26
Application filed by Beilihuidong Beijing Education Technology Co ltd and Beijing Institute of Technology BIT

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering


Abstract

The invention relates to a pedestrian behavior state identification method and system. The method comprises: collecting pedestrian image information with a vehicle-mounted camera; determining pedestrian skeleton node data from the pedestrian image information by a skeleton recognition method; extracting pedestrian skeleton features from the pedestrian skeleton node data; and determining the behavior state of the pedestrian with a pedestrian behavior classifier according to the pedestrian skeleton features, the classifier taking the pedestrian skeleton features as input and the pedestrian behavior state as output. The invention can improve the accuracy of pedestrian state identification.

Description

Pedestrian behavior state identification method and system
Technical Field
The invention relates to the field of image processing, in particular to a pedestrian behavior state identification method and system.
Background
As automobiles become increasingly intelligent, people's requirements for automobile safety keep rising. In an actual road traffic environment, pedestrians are among the most complex traffic elements and have a high likelihood of being involved in accidents with vehicles, so accurately identifying the behavior state of pedestrians and helping the vehicle or the driver respond better is very important.
At present, methods for identifying the behavior state of pedestrians mainly make the judgment directly from raw image data. Such methods have clear limitations: the data volume is large, the computation is complex, and the images contain considerable interference information, so the behavior state of the pedestrian cannot be identified accurately and the practicality is limited.
Disclosure of Invention
The invention aims to provide a pedestrian behavior state identification method and system, which can improve the accuracy of identifying the pedestrian state.
In order to achieve the purpose, the invention provides the following scheme:
a pedestrian behavior state recognition method, comprising:
acquiring pedestrian image information by using a vehicle-mounted camera; the pedestrian image information includes: a continuous image sequence within a set time period; the image sequence comprises image information and timestamp information;
determining pedestrian skeleton node data by adopting a skeleton identification method according to the pedestrian image information; the pedestrian skeleton node data includes: upper and lower spine end points, shoulder joints, elbow joints, hip joints, wrist joints, knee joints and ankle joints;
extracting pedestrian skeleton characteristics according to the pedestrian skeleton node data; the pedestrian skeleton features include: distance features, angle numerical features, and angle quantity features;
determining the behavior state of the pedestrian by adopting a pedestrian behavior classifier according to the skeleton characteristics of the pedestrian; the pedestrian behavior classifier takes the pedestrian skeleton characteristics as input and takes the pedestrian behavior state as output; the pedestrian behavior state includes: still, walking or running.
Optionally, the extracting pedestrian skeleton features according to the pedestrian skeleton node data specifically includes:
preprocessing the pedestrian skeleton node data; the preprocessing comprises: repairing missing skeletons and repairing missing nodes;
and extracting the pedestrian skeleton characteristics by utilizing the preprocessed pedestrian skeleton node data.
Optionally, the extracting of the pedestrian skeleton features from the preprocessed pedestrian skeleton node data specifically includes:
determining the distance features, the angle numerical features and the angle quantity features according to corresponding formulas (the formulas are reproduced only as images in the original publication);
wherein the two distance features are the distance between the left and right knee nodes and the distance between the left and right ankle nodes, each divided by the spine height of the pedestrian; the angle numerical features are the angles formed at the two knee joints of the human body; the remaining features are the angle quantity features; and n denotes the node index.
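The feature formulas themselves are reproduced in the original publication only as images. Purely as an illustrative reconstruction from the verbal description above and from the adjustment factors given later in the detailed description (correction factors 10 and 5 for the two distances, 90 for the angles, a 175° threshold and a 10-frame window for the counts), the six features could take the following form, where p_n denotes the pixel coordinates of skeleton node n of Fig. 2; the pairing of the factors 10 and 5 with the knee and ankle distances, their use as multipliers, and the definition of the knee angle as the angle at the knee between the thigh and shank segments are all assumptions, not the patent's own formulas:

% Hedged reconstruction of the skeleton features (not the original formulas).
% h is the spine height, taken between node 1 (spine top) and node 8 (spine bottom).
\begin{aligned}
  h     &= \lVert p_{1}-p_{8}\rVert, \\
  d_{1} &= 10\,\frac{\lVert p_{10}-p_{13}\rVert}{h}, &
  d_{2} &= 5\,\frac{\lVert p_{11}-p_{14}\rVert}{h}, \\
  a_{1} &= \frac{\theta_{10}}{90^{\circ}}, &
  a_{2} &= \frac{\theta_{13}}{90^{\circ}}, \\
  c_{1} &= \tfrac{1}{10}\,\#\bigl\{\, i \le 10 : \theta_{10}^{(i)} > 175^{\circ} \,\bigr\}, &
  c_{2} &= \tfrac{1}{10}\,\#\bigl\{\, i \le 10 : \theta_{13}^{(i)} > 175^{\circ} \,\bigr\},
\end{aligned}

where θ10 and θ13 are the angles at knee nodes 10 and 13 and the superscript (i) indexes the frames of the 10-frame window.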
Optionally, the determining of the behavior state of the pedestrian by using a pedestrian behavior classifier according to the pedestrian skeleton features further includes:
clustering pedestrian image information according to the pedestrian skeleton features to establish the pedestrian behavior classifier; the clustering method comprises: a K-means++ clustering algorithm or a Gaussian mixture model clustering algorithm.
A pedestrian behavior state recognition system comprising:
the pedestrian image information acquisition module is used for acquiring pedestrian image information by utilizing the vehicle-mounted camera; the pedestrian image information includes: a continuous image sequence within a set time period; the image sequence comprises image information and timestamp information;
the pedestrian skeleton node data determining module is used for determining pedestrian skeleton node data by adopting a skeleton recognition method according to the pedestrian image information; the pedestrian skeleton node data includes: upper and lower spine end points, shoulder joints, elbow joints, hip joints, wrist joints, knee joints and ankle joints;
the pedestrian skeleton feature extraction module is used for extracting pedestrian skeleton features according to the pedestrian skeleton node data; the pedestrian skeleton features include: distance features, angle numerical features, and angle quantity features;
the behavior state determining module is used for determining the behavior state of the pedestrian by adopting a pedestrian behavior classifier according to the skeleton characteristics of the pedestrian; the pedestrian behavior classifier takes the pedestrian skeleton characteristics as input and takes the pedestrian behavior state as output; the pedestrian behavior state includes: still, walking or running.
Optionally, the human skeleton feature extraction module specifically includes:
the pedestrian skeleton node data preprocessing unit is used for preprocessing the pedestrian skeleton node data; the preprocessing comprises: repairing missing skeletons and repairing missing nodes;
and the human skeleton feature extraction unit is used for extracting the pedestrian skeleton features by utilizing the preprocessed pedestrian skeleton node data.
Optionally, the human skeleton feature extraction unit specifically includes:
a distance feature determination subunit, configured to determine the distance features according to corresponding formulas (the formulas are reproduced only as images in the original publication);
an angle numerical feature determination subunit, configured to determine the angle numerical features according to corresponding formulas;
an angle quantity feature determination subunit, configured to determine the angle quantity features according to corresponding formulas;
wherein the two distance features are the distance between the left and right knee nodes and the distance between the left and right ankle nodes, each divided by the spine height of the pedestrian; the angle numerical features are the angles formed at the two knee joints of the human body; the remaining features are the angle quantity features; and n denotes the node index.
Optionally, the system further comprises:
the pedestrian behavior classifier establishing module, used for clustering pedestrian image information according to the pedestrian skeleton features to establish a pedestrian behavior classifier; the clustering method comprises: a K-means++ clustering algorithm or a Gaussian mixture model clustering algorithm.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
according to the pedestrian behavior state identification method and system provided by the invention, the pedestrian skeleton information is extracted from the image information collected by the camera, the required features are extracted from the pedestrian skeleton information, and the difficulty of feature extraction is reduced. The pedestrian behavior state classifier is obtained by training through the clustering algorithm, the cost of obtaining the label through clustering calculation is reduced, and the training time of the system is shortened. Meanwhile, the data volume of the pedestrian skeleton nodes is far smaller than that of the original image, and the data transmission pressure in the system is reduced. The pedestrian behavior state can be recognized, and the recognition accuracy is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. The drawings in the following description are obviously only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of a pedestrian behavior state identification method provided by the invention;
FIG. 2 is a schematic view of a human skeleton node;
fig. 3 is a schematic structural diagram of a pedestrian behavior state recognition system provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide a pedestrian behavior state identification method and system, which can improve the accuracy of identifying the pedestrian state.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a schematic flow chart of a pedestrian behavior state identification method provided by the present invention, and as shown in fig. 1, the pedestrian behavior state identification method provided by the present invention includes:
s101, acquiring pedestrian image information by using a vehicle-mounted camera; the pedestrian image information includes: setting a continuous image sequence in a time period; the image sequence comprises image information and timestamp information;
s102, determining pedestrian skeleton node data by adopting a skeleton identification method according to the pedestrian image information; as shown in fig. 2, the pedestrian skeleton node data includes: the system comprises spinal upper and lower dead points (1 node and 8 node), shoulder joints (2 node and 5 node), elbow joints (3 node and 6 node), hip joints (9 node and 12 node), wrist joints (4 node and 7 node), knee joints (10 node and 13 node) and ankle joints (11 node and 14 node), wherein the node data is composed of two-dimensional coordinate information of the corresponding nodes in an image;
s103, extracting pedestrian skeleton features according to the pedestrian skeleton node data; the pedestrian skeleton features include: distance features, angle numerical features, and angle quantity features;
s103 specifically comprises the following steps:
preprocessing the pedestrian skeleton node data; the preprocessing comprises: repairing missing skeletons and repairing missing nodes;
The specific preprocessing process is as follows:
for the partial frame image skeleton missing case:
firstly, judging whether the front j frame and the back k frame of the missing frame are missing or not, and if so, judging that the missing frame can not be repaired. If not missing, selecting the two most recent frames as the data source of the patch
Figure DEST_PATH_IMAGE019
. The node data of the missing frame i can be calculated by the following formula:
Figure 930613DEST_PATH_IMAGE020
in the formula
Figure DEST_PATH_IMAGE021
Respectively are the horizontal and vertical pixel coordinates of the node, and n is the node label.
For the case where some nodes are missing in an image:
Firstly, whether the corresponding node exists in the skeletons of both the previous frame and the next frame is checked; if it exists in both, the node is repaired by linear interpolation; if it is missing in either, no repair is performed, because interpolation in that case would introduce larger errors. The repair formula for each node (reproduced only as an image in the original) interpolates the horizontal and vertical pixel coordinates of the node, where n denotes the node index and i denotes the skeleton node data of the i-th frame.
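The repair logic described above can be sketched as follows. This is a minimal illustration under stated assumptions: the exact repair formulas appear only as images in the original, so plain linear interpolation between the two nearest valid frames is assumed for the missing-frame case as well as for the missing-node case, and frames are represented here as plain dictionaries mapping node index to pixel coordinates (or None for undetected nodes), in the spirit of the record sketched earlier.

from typing import Dict, List, Optional, Tuple

Point = Optional[Tuple[float, float]]

def _lerp(a: Tuple[float, float], b: Tuple[float, float], t: float) -> Tuple[float, float]:
    """Linear interpolation of pixel coordinates."""
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def repair_missing_frames(frames: List[Optional[Dict[int, Point]]], j: int = 1, k: int = 1):
    """frames[i] is the node dict of frame i, or None if the whole skeleton is missing.
    A missing frame is repaired only when a valid frame exists within the j frames
    before and the k frames after it; otherwise it is judged unrepairable."""
    for i, frame in enumerate(frames):
        if frame is not None:
            continue
        prev = next((p for p in range(i - 1, max(i - j, 0) - 1, -1) if frames[p] is not None), None)
        nxt = next((q for q in range(i + 1, min(i + k, len(frames) - 1) + 1) if frames[q] is not None), None)
        if prev is None or nxt is None:
            continue  # judged unrepairable
        t = (i - prev) / (nxt - prev)
        frames[i] = {n: _lerp(frames[prev][n], frames[nxt][n], t)
                     for n in frames[prev]
                     if frames[prev].get(n) and frames[nxt].get(n)}
    return frames

def repair_missing_nodes(frames: List[Optional[Dict[int, Point]]]):
    """Repair a node missing in frame i only when the same node exists in both
    frame i-1 and frame i+1 (midpoint interpolation); otherwise skip it, since
    interpolating in that case would introduce larger errors."""
    for i in range(1, len(frames) - 1):
        cur, before, after = frames[i], frames[i - 1], frames[i + 1]
        if cur is None or before is None or after is None:
            continue
        for n, p in cur.items():
            if p is None and before.get(n) and after.get(n):
                cur[n] = _lerp(before[n], after[n], 0.5)
    return frames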
The pedestrian skeleton features are then extracted from the preprocessed pedestrian skeleton node data.
The distance features, the angle numerical features and the angle quantity features are determined according to corresponding formulas (the formulas are reproduced only as images in the original publication):
The two distance features are the distance between the left and right knee nodes and the distance between the left and right ankle nodes, each divided by the spine height of the pedestrian; each distance is the Euclidean distance between the two nodes in the pixel coordinate system, and correction factors of 10 and 5 are applied so that these features remain on the same order of magnitude as the other clustering features.
The two angle numerical features are the angles formed at the two knee joints of the human body, each divided by an adjustment factor of 90 so that the features of every dimension remain consistent in magnitude.
The two angle quantity features are the numbers of frames, within a window of 10 frames, in which the corresponding knee angle exceeds 175°, each divided by an adjustment factor of 10.
Here n denotes the node index.
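As a concrete, non-authoritative illustration of the feature extraction just described (paralleling the reconstructed formulas sketched in the summary), the following Python sketch computes the six features from a window of 10 repaired skeletons. The knee angle is assumed to be the angle at the knee between the hip-knee and knee-ankle segments, nodes 9-10-11 and 12-13-14 are assumed to belong to the same leg, the distance and angle numerical features are taken from the last frame of the window, and the factors 10, 5, 90 and 10 are applied as plain scaling factors; all of these are assumptions, since the original formulas appear only as images.

import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]
Skeleton = Dict[int, Point]  # node index (1-14) -> pixel coordinates

def _dist(a: Point, b: Point) -> float:
    return math.hypot(a[0] - b[0], a[1] - b[1])

def _angle_at(knee: Point, hip: Point, ankle: Point) -> float:
    """Angle (degrees) at the knee between the knee->hip and knee->ankle segments."""
    v1 = (hip[0] - knee[0], hip[1] - knee[1])
    v2 = (ankle[0] - knee[0], ankle[1] - knee[1])
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2) + 1e-9)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def extract_features(window: List[Skeleton]) -> List[float]:
    """window: 10 consecutive repaired skeletons. Node numbering follows Fig. 2:
    spine 1/8, hips 9/12, knees 10/13, ankles 11/14 (same-side grouping assumed)."""
    last = window[-1]
    spine = _dist(last[1], last[8])                      # spine height in pixels
    d_knee = 10 * _dist(last[10], last[13]) / spine      # distance features (factors 10 and 5
    d_ankle = 5 * _dist(last[11], last[14]) / spine      # assumed to be multiplicative)
    ang_a = _angle_at(last[10], last[9], last[11])       # knee angles of the last frame
    ang_b = _angle_at(last[13], last[12], last[14])
    a1, a2 = ang_a / 90.0, ang_b / 90.0                  # angle numerical features
    c1 = sum(_angle_at(s[10], s[9], s[11]) > 175 for s in window) / 10.0   # angle quantity
    c2 = sum(_angle_at(s[13], s[12], s[14]) > 175 for s in window) / 10.0  # features
    return [d_knee, d_ankle, a1, a2, c1, c2]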
S104, determining the behavior state of the pedestrian by adopting a pedestrian behavior classifier according to the skeleton characteristics of the pedestrian; the pedestrian behavior classifier takes the pedestrian skeleton characteristics as input and takes the pedestrian behavior state as output; the pedestrian behavior state includes: still, walking or running.
In S104, the pedestrian behavior classifier is established by clustering the pedestrian image information according to the pedestrian skeleton features; the clustering method comprises: a K-means++ clustering algorithm or a Gaussian mixture model clustering algorithm.
Clustering the pedestrian image information yields three clusters. The feature value ranges corresponding to the different clusters serve as the basis for judging the pedestrian behavior state: the cluster to which a pedestrian skeleton feature belongs is determined from the range its feature values fall into, and the corresponding behavior state is then obtained. The pedestrian behavior states are still, walking and running.
The behavior state corresponding to each cluster is assigned according to practical experience, for example the cluster with small distance features corresponds to the still state, while the cluster with large angle numerical and angle quantity features corresponds to the running state; the cluster to which an input pedestrian skeleton feature belongs is then judged from the range of its feature values, which in turn gives the corresponding behavior state.
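The classifier construction itself can be sketched as follows, assuming scikit-learn is available: KMeans with k-means++ initialisation clusters the six-dimensional feature vectors into three groups, and each cluster is then mapped to a behavior state by inspecting its centre along the lines of the rule of thumb above (smallest distance features: still; largest angle features: running). The cluster-to-state heuristic and the choice of scikit-learn are illustrative, not part of the patent's disclosure.

import numpy as np
from sklearn.cluster import KMeans

def build_classifier(feature_matrix: np.ndarray):
    """feature_matrix: one row of six skeleton features per pedestrian sample
    (see the feature-extraction sketch). Returns (kmeans, cluster -> state map)."""
    kmeans = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=0)
    kmeans.fit(feature_matrix)

    centres = kmeans.cluster_centers_
    dist_score = centres[:, :2].sum(axis=1)     # distance features
    angle_score = centres[:, 2:].sum(axis=1)    # angle numerical + quantity features
    still = int(np.argmin(dist_score))          # smallest distances -> still
    running = int(np.argmax(angle_score))       # largest angle features -> running
    if running == still:                        # degenerate case: take next-largest cluster
        running = int(np.argsort(angle_score)[-2])
    walking = ({0, 1, 2} - {still, running}).pop()
    label_map = {still: "still", walking: "walking", running: "running"}
    return kmeans, label_map

def predict_state(kmeans, label_map, features):
    """Assign a behavior state to one six-dimensional feature vector."""
    return label_map[int(kmeans.predict(np.asarray([features]))[0])]

To obtain the Gaussian mixture model variant mentioned above, sklearn.mixture.GaussianMixture(n_components=3) can be fitted in place of KMeans, with predict() used in the same way.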
Fig. 3 is a schematic structural diagram of a pedestrian behavior state recognition system provided by the present invention, and as shown in fig. 3, the pedestrian behavior state recognition system provided by the present invention includes:
a pedestrian image information acquisition module 301, configured to acquire pedestrian image information by using a vehicle-mounted camera; the pedestrian image information includes: a continuous image sequence within a set time period; the image sequence comprises image information and timestamp information;
a pedestrian skeleton node data determining module 302, configured to determine pedestrian skeleton node data by using a skeleton recognition method according to the pedestrian image information; the pedestrian skeleton node data includes: upper and lower spine end points, shoulder joints, elbow joints, hip joints, wrist joints, knee joints and ankle joints;
a human skeleton feature extraction module 303, configured to extract a pedestrian skeleton feature according to the pedestrian skeleton node data; the pedestrian skeleton features include: distance features, angle numerical features, and angle quantity features;
a behavior state determination module 304, configured to determine a behavior state of a pedestrian by using a pedestrian behavior classifier according to the skeleton feature of the pedestrian; the pedestrian behavior classifier takes the pedestrian skeleton characteristics as input and takes the pedestrian behavior state as output; the pedestrian behavior state includes: still, walking or running.
The human skeleton feature extraction module 303 specifically includes:
the pedestrian skeleton node data preprocessing unit is used for preprocessing the pedestrian skeleton node data; the preprocessing comprises: repairing missing skeletons and repairing missing nodes;
and the human skeleton feature extraction unit is used for extracting the pedestrian skeleton features by utilizing the preprocessed pedestrian skeleton node data.
The human skeleton feature extraction unit specifically comprises:
a distance feature determination subunit, configured to determine the distance features according to corresponding formulas (the formulas are reproduced only as images in the original publication);
an angle numerical feature determination subunit, configured to determine the angle numerical features according to corresponding formulas;
an angle quantity feature determination subunit, configured to determine the angle quantity features according to corresponding formulas;
wherein the two distance features are the distance between the left and right knee nodes and the distance between the left and right ankle nodes, each divided by the spine height of the pedestrian; the angle numerical features are the angles formed at the two knee joints of the human body; the remaining features are the angle quantity features; and n is the node index.
The invention provides a pedestrian behavior state recognition system, which further comprises:
the pedestrian behavior classifier establishing module is used for clustering pedestrian image information according to the pedestrian skeleton features to establish a pedestrian behavior classifier; the clustering method comprises: a K-means++ clustering algorithm or a Gaussian mixture model clustering algorithm.
The invention has the following advantages:
1. Short training time and small transmission data volume
Unlike conventional methods that directly use pedestrian image information as the data source for pedestrian behavior recognition, the method first extracts the key pedestrian skeleton nodes from the original image information and then extracts features from those key nodes. This reduces the difficulty of feature extraction, simplifies the subsequent clustering system, lowers the cost of obtaining labels through clustering, and shortens the training time of the system. Meanwhile, the data volume of the pedestrian skeleton nodes is far smaller than that of the original images, which reduces the data transmission pressure within the system.
2. Stronger robustness to interference
The data preprocessing method used by the method can repair missing or incomplete skeleton data, which improves the integrity and continuity of the data and reduces the influence of occasional missing data on the recognition result.
3. Strong intuitiveness and easy adjustment to specific conditions
The method uses the key pedestrian skeleton nodes as the basis for behavior recognition, which is highly intuitive: the relevant feature parameters can easily be given practical meaning, the interpretability is strong, the difficulty of feature selection is reduced, and the relevant features or parameters can be conveniently adjusted according to specific conditions.
4. No need to manually define parameter ranges
The method uses cluster analysis to group pedestrians with different feature parameters into k classes, and pedestrians of different classes exhibit different behavior states in local regions. A pedestrian behavior recognizer can be trained from the clustering result; the recognizer identifies the behavior state of a pedestrian from the feature parameters, avoiding the weak generalization and uncertainty caused by manually and subjectively dividing parameter ranges in conventional pedestrian behavior recognition methods.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (8)

1. A pedestrian behavior state recognition method is characterized by comprising the following steps:
acquiring pedestrian image information by using a vehicle-mounted camera; the pedestrian image information includes: a continuous image sequence within a set time period; the image sequence comprises image information and timestamp information;
determining pedestrian skeleton node data by adopting a skeleton identification method according to the pedestrian image information; the pedestrian skeleton node data includes: upper and lower spine end points, shoulder joints, elbow joints, hip joints, wrist joints, knee joints and ankle joints;
extracting pedestrian skeleton characteristics according to the pedestrian skeleton node data; the pedestrian skeleton features include: distance features, angle numerical features, and angle quantity features;
determining the behavior state of the pedestrian by adopting a pedestrian behavior classifier according to the skeleton characteristics of the pedestrian; the pedestrian behavior classifier takes the pedestrian skeleton characteristics as input and takes the pedestrian behavior state as output; the pedestrian behavior state includes: still, walking or running.
2. The method according to claim 1, wherein the extracting pedestrian skeleton features according to the pedestrian skeleton node data specifically comprises:
preprocessing the pedestrian skeleton node data; the preprocessing comprises: repairing missing skeletons and repairing missing nodes;
and extracting the pedestrian skeleton characteristics by utilizing the preprocessed pedestrian skeleton node data.
3. The method according to claim 2, wherein the extracting of the pedestrian skeleton features from the preprocessed pedestrian skeleton node data specifically comprises:
determining the distance features, the angle numerical features and the angle quantity features according to corresponding formulas (the formulas are reproduced only as images in the original publication);
wherein the two distance features are the distance between the left and right knee nodes and the distance between the left and right ankle nodes, each divided by the spine height of the pedestrian; the angle numerical features are the angles formed at the two knee joints of the human body; the remaining features are the angle quantity features; and n is the node index.
4. The method according to claim 1, wherein the determining of the behavior state of the pedestrian by using a pedestrian behavior classifier according to the pedestrian skeleton features further comprises:
clustering pedestrian image information according to the pedestrian skeleton features to establish the pedestrian behavior classifier; the clustering method comprises: a K-means++ clustering algorithm or a Gaussian mixture model clustering algorithm.
5. A pedestrian behavior state recognition system, comprising:
the pedestrian image information acquisition module is used for acquiring pedestrian image information by utilizing the vehicle-mounted camera; the pedestrian image information includes: a continuous image sequence within a set time period; the image sequence comprises image information and timestamp information;
the pedestrian skeleton node data determining module is used for determining pedestrian skeleton node data by adopting a skeleton recognition method according to the pedestrian image information; the pedestrian skeleton node data includes: upper and lower spine end points, shoulder joints, elbow joints, hip joints, wrist joints, knee joints and ankle joints;
the pedestrian skeleton feature extraction module is used for extracting pedestrian skeleton features according to the pedestrian skeleton node data; the pedestrian skeleton features include: distance features, angle numerical features, and angle quantity features;
the behavior state determining module is used for determining the behavior state of the pedestrian by adopting a pedestrian behavior classifier according to the skeleton characteristics of the pedestrian; the pedestrian behavior classifier takes the pedestrian skeleton characteristics as input and takes the pedestrian behavior state as output; the pedestrian behavior state includes: still, walking or running.
6. The pedestrian behavior state recognition system according to claim 5, wherein the human skeleton feature extraction module specifically comprises:
the pedestrian skeleton node data preprocessing unit is used for preprocessing the pedestrian skeleton node data; the preprocessing comprises: repairing missing skeletons and repairing missing nodes;
and the human skeleton feature extraction unit is used for extracting the pedestrian skeleton features by utilizing the preprocessed pedestrian skeleton node data.
7. The pedestrian behavior state recognition system according to claim 6, wherein the human skeleton feature extraction unit specifically comprises:
a distance feature determination subunit, configured to determine the distance features according to corresponding formulas (the formulas are reproduced only as images in the original publication);
an angle numerical feature determination subunit, configured to determine the angle numerical features according to corresponding formulas;
an angle quantity feature determination subunit, configured to determine the angle quantity features according to corresponding formulas;
wherein the two distance features are the distance between the left and right knee nodes and the distance between the left and right ankle nodes, each divided by the spine height of the pedestrian; the angle numerical features are the angles formed at the two knee joints of the human body; the remaining features are the angle quantity features; and n is the node index.
8. The pedestrian behavior state recognition system according to claim 5, further comprising:
the pedestrian behavior classifier establishing module is used for clustering pedestrian image information according to the pedestrian skeleton features to establish a pedestrian behavior classifier; the clustering method comprises: a K-means++ clustering algorithm or a Gaussian mixture model clustering algorithm.
CN202111251133.2A 2021-10-27 2021-10-27 Pedestrian behavior state identification method and system Pending CN113705542A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111251133.2A CN113705542A (en) 2021-10-27 2021-10-27 Pedestrian behavior state identification method and system


Publications (1)

Publication Number Publication Date
CN113705542A true CN113705542A (en) 2021-11-26

Family

ID=78647054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111251133.2A Pending CN113705542A (en) 2021-10-27 2021-10-27 Pedestrian behavior state identification method and system

Country Status (1)

Country Link
CN (1) CN113705542A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103810496A (en) * 2014-01-09 2014-05-21 江南大学 3D (three-dimensional) Gaussian space human behavior identifying method based on image depth information
US20140341439A1 (en) * 2013-05-17 2014-11-20 Tata Consultancy Services Limited Identification of People Using Multiple Skeleton Recording Devices
CN105930767A (en) * 2016-04-06 2016-09-07 南京华捷艾米软件科技有限公司 Human body skeleton-based action recognition method
CN109376663A (en) * 2018-10-29 2019-02-22 广东工业大学 A kind of human posture recognition method and relevant apparatus
CN110796110A (en) * 2019-11-05 2020-02-14 西安电子科技大学 Human behavior identification method and system based on graph convolution network



Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20211126)