CN112084980A - Pedestrian step state identification method and device - Google Patents

Pedestrian step state identification method and device

Info

Publication number
CN112084980A
CN112084980A
Authority
CN
China
Prior art keywords
connected domain
sensor data
frame
data
nth frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010965318.9A
Other languages
Chinese (zh)
Inventor
袁克亚
姚东星
杨伟清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Data Driven Technology Co ltd
Original Assignee
Beijing Data Driven Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Data Driven Technology Co ltd filed Critical Beijing Data Driven Technology Co ltd
Priority to CN202010965318.9A priority Critical patent/CN112084980A/en
Publication of CN112084980A publication Critical patent/CN112084980A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • G06V40/25Recognition of walking or running movements, e.g. gait recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Abstract

The invention provides a pedestrian step state identification method and device, the method comprising: collecting an Nth frame data matrix; performing image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix; performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result; performing image feature extraction on the connected domain step detection result to obtain Nth-frame connected domain feature data; and comparing the Nth-frame connected domain feature data with the (N-1)th-frame connected domain feature data and determining the step state of the object to be detected from the comparison result. The method analyzes the walking state of a pedestrian so that the pedestrian's behavior can be monitored.

Description

Pedestrian step state identification method and device
Technical Field
The invention relates to the technical field of image processing, in particular to a pedestrian step state identification method and device.
Background
The smart floor is a lattice sensor network laid on the ground surface to sense the positions of pedestrian footsteps. When a pedestrian walks on the floor, each footstep lands on a corresponding lattice position, exciting the sensor at that position to generate a signal; the signal is sampled and transmitted to a back-end signal processing system, which detects the pedestrian's step position. The step state of the pedestrian should then be analyzed over the course of walking, but the prior art provides no method for doing so.
No effective solution to the above problem has been proposed.
Disclosure of Invention
In view of the above, the present invention provides a method and an apparatus for identifying a step state of a pedestrian, which can analyze a walking state of the pedestrian so as to monitor a behavior of the pedestrian.
In a first aspect, an embodiment of the present invention provides a method for identifying a step state of a pedestrian, where the method includes:
collecting an Nth frame data matrix;
carrying out image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix;
performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result;
performing image feature extraction on the step detection result of the connected domain to obtain feature data of the N-th frame of connected domain;
and comparing the characteristic data of the connected domain of the Nth frame with the characteristic data of the connected domain of the (N-1) th frame, and determining the step state of the object to be detected according to the comparison result.
Further, the Nth frame data matrix includes a plurality of sensor data, and performing image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix includes:
comparing the amplitude value corresponding to each sensor data with a set threshold value;
if the amplitude value is larger than the set threshold value, retaining the amplitude value corresponding to the sensor data;
if the amplitude value is smaller than the set threshold value, setting the amplitude value corresponding to the sensor data to be 0;
and forming the signal amplitude data matrix by using the sensor data with the reserved amplitude value and the sensor data set to be 0.
Further, the performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result includes repeatedly performing the following processing until the sensor data in the signal amplitude data matrix are traversed:
selecting any sensor data from the signal amplitude data matrix as current sensor data, and marking the current sensor data;
selecting sensor data adjacent to the current sensor data position from the sensor data which are not marked, and marking the selected sensor data;
and forming the connected domain by the current sensor data and the selected sensor data, and taking the connected domain as the step detection result of the connected domain.
Further, the step detection result of the connected domain includes at least one connected domain, and the image feature extraction is performed on the step detection result of the connected domain to obtain the feature data of the N-th frame of connected domain, including:
determining the number of sensor data in the connected domain;
obtaining the pixel area of the Nth-frame connected domain according to the number of the sensor data;
and obtaining the average signal intensity of the Nth-frame connected domain according to the amplitude values corresponding to the sensor data.
Further, the comparing the feature data of the connected domain of the nth frame with the feature data of the connected domain of the N-1 th frame, and determining the step state of the object to be detected according to the comparison result includes:
comparing the pixel area and the average signal intensity of the connected domain of the Nth frame with the pixel area and the average signal intensity of the connected domain at the same position of the N-1 th frame;
if the pixel area of the connected domain of the Nth frame is not smaller than the pixel area of the connected domain of the N-1 th frame, and the average signal intensity of the connected domain of the Nth frame is larger than the average signal intensity of the connected domain of the N-1 th frame, determining that the step of the object to be detected is in a falling state;
if the pixel area of the connected domain of the Nth frame is equal to the pixel area of the connected domain of the N-1 th frame, and the average signal intensity of the connected domain of the Nth frame is equal to the average signal intensity of the connected domain of the N-1 th frame, determining that the step of the object to be detected is in a staying state;
and if the pixel area of the connected domain of the Nth frame is not larger than the pixel area of the connected domain of the N-1 th frame, and the average signal intensity of the connected domain of the Nth frame is smaller than the average signal intensity of the connected domain of the N-1 th frame, determining that the step of the object to be detected is in the lifting state.
In a second aspect, an embodiment of the present invention provides a device for identifying a step state of a pedestrian, the device including:
the acquisition unit is used for acquiring the Nth frame data matrix;
a binarization processing unit, configured to perform image binarization processing on the nth frame data matrix to obtain a signal amplitude data matrix;
the connected domain processing unit is used for carrying out connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result;
the extraction unit is used for extracting image characteristics of the step detection result of the connected domain to obtain the characteristic data of the N-th frame of connected domain;
and the comparison unit is used for comparing the characteristic data of the connected domain of the Nth frame with the characteristic data of the connected domain of the (N-1) th frame and determining the step state of the object to be detected according to the comparison result.
Further, the nth frame data matrix includes a plurality of sensor data, and the binarization processing unit is specifically configured to:
comparing the amplitude value corresponding to each sensor data with a set threshold value;
if the amplitude value is larger than the set threshold value, retaining the amplitude value corresponding to the sensor data;
if the amplitude value is smaller than the set threshold value, setting the amplitude value corresponding to the sensor data to be 0;
and forming the signal amplitude data matrix by using the sensor data with the reserved amplitude value and the sensor data set to be 0.
Further, the connected component processing unit repeatedly executes the following processing until the sensor data in the signal amplitude data matrix are traversed:
selecting any sensor data from the signal amplitude data matrix as current sensor data, and marking the current sensor data;
selecting sensor data adjacent to the current sensor data position from the sensor data which are not marked, and marking the selected sensor data;
and forming the connected domain by the current sensor data and the selected sensor data, and taking the connected domain as the step detection result of the connected domain.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor, where the memory stores a computer program operable on the processor, and the processor implements the method described above when executing the computer program.
In a fourth aspect, embodiments of the invention provide a computer readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the method as described above.
The embodiment of the invention provides a method and a device for identifying the step state of a pedestrian, the method comprising: collecting an Nth frame data matrix; performing image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix; performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result; performing image feature extraction on the connected domain step detection result to obtain Nth-frame connected domain feature data; and comparing the Nth-frame connected domain feature data with the (N-1)th-frame connected domain feature data and determining the step state of the object to be detected from the comparison result. The method analyzes the walking state of a pedestrian so that the pedestrian's behavior can be monitored.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for identifying a step status of a pedestrian according to an embodiment of the present invention;
fig. 2 is a schematic diagram of the 67th frame data matrix according to a second embodiment of the present invention;
fig. 3 is a schematic diagram of a 16 × 8 signal amplitude data matrix according to a second embodiment of the present invention;
fig. 4 is a schematic diagram of a connected domain of the 67th frame data matrix according to a second embodiment of the present invention;
fig. 5 is a schematic diagram of a connected domain of the 66th frame data matrix according to a second embodiment of the present invention;
fig. 6 is a schematic view of a pedestrian step status recognition device according to a third embodiment of the present invention.
Icon:
1-a collection unit; 2-a binarization processing unit; 3-connected domain processing unit; 4-an extraction unit; 5-a comparison unit.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The smart floor is a lattice sensor network laid on the ground surface to sense the positions of pedestrian footsteps. When a pedestrian walks on the floor, each footstep lands on a corresponding lattice position, exciting the sensor at that position to generate a signal; the signal is sampled and transmitted to a back-end signal processing system, which detects the pedestrian's step position.
The signals of the lattice sensors on the smart floor at different moments are equivalent to a camera continuously photographing a specific area: as time passes, the ground-sensing conditions in the area are recorded in sequence. When a pedestrian walks on the floor, the footsteps are recorded by the lattice sensors, yielding the state of the pedestrian's steps. From the step states, the pedestrian's walking state can be predicted, which provides a basis for subsequent pedestrian behavior analysis.
For the understanding of the present embodiment, the following detailed description will be given of the embodiment of the present invention.
Example one:
fig. 1 is a flowchart of a method for identifying a step status of a pedestrian according to an embodiment of the present invention.
Referring to fig. 1, the method includes the steps of:
step S101, collecting an Nth frame data matrix;
here, the Nth frame data matrix is an M N intelligent ground lattice, where M and N are positive integers, M > 16, and N > 4.
Step S102, carrying out image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix;
here, the image binarization processing is also referred to as a 0-1 detection method, and compares the sensor data in the nth frame data matrix with a set threshold value to obtain a signal amplitude data matrix.
Step S103, performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result;
here, connected component analysis is performed on the sensor data in the signal amplitude data matrix, sensor data at adjacent positions are classified into the same connected component, and the formed connected component is used as a connected component step detection result.
Step S104, performing image feature extraction on the connected domain step detection result to obtain the Nth-frame connected domain feature data, where the feature data comprises the pixel area of the connected domain and the average signal intensity of the connected domain.
And S105, comparing the characteristic data of the connected domain of the Nth frame with the characteristic data of the connected domain of the (N-1) th frame, and determining the step state of the object to be detected according to the comparison result.
By repeating the above steps, the step state of the object to be detected, which changes with time, can be obtained. Wherein the object to be detected includes, but is not limited to, a pedestrian.
Further, the nth frame data matrix includes a plurality of sensor data, and the step S102 includes the steps of:
step S201, comparing the amplitude value corresponding to each sensor data with a set threshold value;
step S202, if the amplitude value is larger than a set threshold value, retaining the amplitude value corresponding to the sensor data;
step S203, if the amplitude value is smaller than the set threshold value, setting the amplitude value corresponding to the sensor data to be 0;
in step S204, the sensor data with the reserved amplitude value and the sensor data set to 0 form a signal amplitude data matrix.
Specifically, the Nth frame data matrix comprises a plurality of sensor data, each corresponding to an amplitude value. The amplitude value of each sensor data is compared with the set threshold value: if the amplitude value is larger than the set threshold value, it is retained; if the amplitude value is smaller than the set threshold value, it is replaced with 0. The sensor data with retained amplitude values and the sensor data set to 0 then form the signal amplitude data matrix.
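The patent provides no reference code; a minimal NumPy sketch of the thresholding in steps S201 to S204 could look like the following (the function name and example values are illustrative, and amplitudes exactly equal to the threshold, a case the patent leaves unspecified, are here treated as below-threshold):

```python
import numpy as np

def binarize(frame, threshold):
    """Zero out sensor amplitudes at or below the threshold; keep the rest.

    frame: 2-D array of per-sensor amplitude values (the Nth frame data matrix).
    Returns the signal amplitude data matrix.
    """
    frame = np.asarray(frame, dtype=float)
    # Amplitudes above the threshold are retained; all others are set to 0.
    return np.where(frame > threshold, frame, 0.0)
```

For example, `binarize([[10, 50], [0, 25]], 20)` retains the amplitudes 50 and 25 and zeroes the rest.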
Further, step S103 includes the following steps, which are repeatedly executed until the sensor data in the signal amplitude data matrix are traversed:
step S201, selecting any sensor data from the signal amplitude data matrix as current sensor data, and marking the current sensor data;
step S202, selecting sensor data adjacent to the current sensor data position from the sensor data which are not marked, and marking the selected sensor data;
step S203, forming a connected domain by the current sensor data and the selected sensor data, and taking the connected domain as a connected domain step detection result.
Specifically, any sensor data is selected from the signal amplitude data matrix, and the selected sensor data is used as a current detection point, namely the current sensor data. Since the current sensor data has already been selected, it needs to be marked in order to avoid being selected again.
Sensor data adjacent to the position of the current sensor data are then selected from the unmarked sensor data, and the selected sensor data are marked. "Adjacent" refers to the eight neighbors of the current sensor data: above, below, left, right, upper left, lower left, upper right, and lower right. The current sensor data and the selected sensor data form a connected domain, which is taken as the connected domain step detection result.
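The traversal described above can be realized as a standard 8-connected flood fill; the sketch below (illustrative, not from the patent) extends the neighbor marking transitively so that every chain of adjacent nonzero sensor data ends up in one connected domain:

```python
def connected_domains(matrix):
    """Group nonzero, 8-adjacent sensor cells into connected domains.

    matrix: 2-D list/array after binarization (zeros are background).
    Returns a list of domains, each a list of (row, col) positions.
    """
    rows, cols = len(matrix), len(matrix[0])
    marked = [[False] * cols for _ in range(rows)]
    domains = []
    # The eight neighbor offsets: above, below, left, right and the four diagonals.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for r in range(rows):
        for c in range(cols):
            if matrix[r][c] == 0 or marked[r][c]:
                continue
            # Flood fill from this seed, marking cells so they are not revisited.
            stack, domain = [(r, c)], []
            marked[r][c] = True
            while stack:
                cr, cc = stack.pop()
                domain.append((cr, cc))
                for dr, dc in offsets:
                    nr, nc = cr + dr, cc + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and matrix[nr][nc] != 0 and not marked[nr][nc]):
                        marked[nr][nc] = True
                        stack.append((nr, nc))
            domains.append(domain)
    return domains
```

Marking each cell as soon as it is pushed mirrors the patent's rule that already-selected sensor data must not be selected again.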
Further, the connected component step detection result includes at least one connected component, and step S104 includes the following steps:
step S301, determining the number of sensor data in a connected domain;
step S302, obtaining the pixel area of the N frame connected domain according to the number of the sensor data;
step S303, obtaining the average signal intensity of the N frame connected domain according to the amplitude value corresponding to the sensor data.
Specifically, the connected domain contains a plurality of sensor data; counting them gives the number of sensor data, which is used as the pixel area of the Nth-frame connected domain.
Each sensor data in the connected domain corresponds to an amplitude value; the amplitude values are summed and the sum is divided by the number of sensor data, giving the average signal intensity of the Nth-frame connected domain.
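Under the definitions above, the two features of a connected domain reduce to a count and a mean; an illustrative sketch (function name assumed, not from the patent):

```python
def domain_features(domain, matrix):
    """Pixel area and average signal intensity of one connected domain.

    domain: list of (row, col) positions; matrix: the signal amplitude matrix.
    """
    area = len(domain)                      # steps S301/S302: count of sensor cells
    total = sum(matrix[r][c] for r, c in domain)
    return area, total / area               # step S303: mean amplitude
```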
Further, step S105 includes the steps of:
step S401, comparing the pixel area and the average signal intensity of the connected domain of the Nth frame with the pixel area and the average signal intensity of the connected domain at the same position of the Nth-1 frame;
specifically, the calculation process of the connected component of the N-1 th frame is the same as the calculation process of the connected component of the N-th frame. After acquiring the connected domain of the N frame, according to the position of the connected domain, the position of the connected domain of the N-1 frame is obtained, that is, the position of the connected domain of the N frame is the same as that of the connected domain of the N-1 frame, and then the pixel area and the average signal intensity of the connected domain of the N frame are compared with the pixel area and the average signal intensity of the connected domain of the N-1 frame.
Step S402, if the pixel area of the connected domain of the Nth frame is not smaller than the pixel area of the connected domain of the N-1 th frame, and the average signal intensity of the connected domain of the Nth frame is larger than the average signal intensity of the connected domain of the N-1 th frame, determining that the footstep of the object to be detected is in a falling state;
step S403, if the pixel area of the connected domain of the Nth frame is equal to the pixel area of the connected domain of the (N-1) th frame, and the average signal intensity of the connected domain of the Nth frame is equal to the average signal intensity of the connected domain of the (N-1) th frame, determining that the step of the object to be detected is in a staying state;
step S404, if the pixel area of the connected domain of the Nth frame is not larger than the pixel area of the connected domain of the N-1 th frame, and the average signal intensity of the connected domain of the Nth frame is smaller than the average signal intensity of the connected domain of the N-1 th frame, determining that the step of the object to be detected is in the lifting state.
The embodiment of the invention provides a method for identifying the step state of a pedestrian, comprising: collecting an Nth frame data matrix; performing image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix; performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result; performing image feature extraction on the connected domain step detection result to obtain Nth-frame connected domain feature data; and comparing the Nth-frame connected domain feature data with the (N-1)th-frame connected domain feature data and determining the step state of the object to be detected from the comparison result. The method analyzes the walking state of a pedestrian so that the pedestrian's behavior can be monitored.
Example two:
fig. 2 is a schematic diagram of the 67th frame data matrix according to a second embodiment of the present invention.
Referring to fig. 2, the smart floor lattice is 16 × 8: 16 rows and 8 columns. Image binarization processing is performed on the 67th frame data matrix to obtain a signal amplitude data matrix, and the sensor data whose amplitude values are retained in that matrix are taken as the detection result of the 67th frame data matrix; see fig. 3.
Connected domain processing is then performed on the detection result of the 67th frame data matrix: sensor data at adjacent positions are classified into the same connected domain, and the resulting connected domain is taken as the connected domain step detection result. Image feature extraction on this result gives a pixel area of 2 and an average signal intensity of 82.5450 for the 67th-frame connected domain. The two sensor data are adjacent and therefore form one connected domain, which is why its pixel area is 2. Referring to fig. 4, one sensor data has [X, Y]: [4, 3], where X is the row number and Y is the column number, so it lies in row 4, column 3; Index denotes the signal strength, which is 134.6. The other sensor data has [X, Y]: [5, 3], i.e. row 5, column 3, with signal strength 30.49. The average signal intensity 82.5450 of the connected domain is the sum of 134.6 and 30.49 divided by 2.
Referring to fig. 5, the connected domain step detection result of the 66th frame data matrix is obtained by the same calculation method as for the 67th frame data matrix. Image feature extraction on this result gives a pixel area of 2 and an average signal intensity of 135.86 for the 66th-frame connected domain.
The position of the connected domain in the connected domain step detection result of the 66th frame data matrix is the same as in that of the 67th frame data matrix. The result comprises one connected domain composed of two sensor data. One sensor data has [X, Y]: [4, 3], i.e. row 4, column 3; Index denotes the signal strength, which is 191.1. The other sensor data has [X, Y]: [5, 3], i.e. row 5, column 3, with signal strength 80.62. The average signal intensity 135.86 of the connected domain is the sum of 191.1 and 80.62 divided by 2.
As can be seen from the above, the pixel area of the 67th-frame connected domain is the same as that of the 66th-frame connected domain, while the average signal intensity of the 67th-frame connected domain is smaller than that of the 66th-frame connected domain; therefore, the step state of the object to be detected in the 67th frame data matrix is determined to be the lifting state.
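The numbers of embodiment two can be checked directly; the dictionaries below encode only the nonzero sensor data of frames 66 and 67 at the positions given in figs. 4 and 5 (the data layout is illustrative):

```python
# Nonzero sensor data of the connected domain, keyed by (row, column)
# in the 1-indexed [X, Y] convention of figs. 4 and 5.
frame66 = {(4, 3): 191.1, (5, 3): 80.62}   # 66th frame data matrix
frame67 = {(4, 3): 134.6, (5, 3): 30.49}   # 67th frame data matrix

area66, mean66 = len(frame66), sum(frame66.values()) / len(frame66)
area67, mean67 = len(frame67), sum(frame67.values()) / len(frame67)

assert area66 == 2 and abs(mean66 - 135.86) < 1e-9
assert area67 == 2 and abs(mean67 - 82.5450) < 1e-9
# Equal pixel area but lower average signal intensity:
# by step S404 the 67th-frame step is in the lifting state.
assert area67 == area66 and mean67 < mean66
```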
Example three:
fig. 6 is a schematic view of a pedestrian step status recognition device according to a third embodiment of the present invention.
Referring to fig. 6, the apparatus includes: the device comprises an acquisition unit 1, a binarization processing unit 2, a connected domain processing unit 3, an extraction unit 4 and a comparison unit 5.
The acquisition unit 1 is used for acquiring an Nth frame data matrix;
a binarization processing unit 2, configured to perform image binarization processing on the nth frame data matrix to obtain a signal amplitude data matrix;
the connected domain processing unit 3 is used for carrying out connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result;
the extraction unit 4 is used for extracting image features of the step detection result of the connected domain to obtain feature data of the N-th frame of connected domain;
and the comparison unit 5 is used for comparing the characteristic data of the connected domain of the Nth frame with the characteristic data of the connected domain of the (N-1) th frame and determining the step state of the object to be detected according to the comparison result.
Further, the nth frame data matrix includes a plurality of sensor data, and the binarization processing unit 2 is specifically configured to:
comparing the amplitude value corresponding to each sensor data with a set threshold value;
if the amplitude value is larger than the set threshold value, retaining the amplitude value corresponding to the sensor data;
if the amplitude value is smaller than the set threshold value, setting the amplitude value corresponding to the sensor data to be 0;
the sensor data with the reserved amplitude values and the sensor data set to 0 are formed into a signal amplitude data matrix.
Further, the connected component processing unit 3 repeatedly executes the following processing until the sensor data in the signal amplitude data matrix are traversed:
selecting any sensor data from the signal amplitude data matrix as current sensor data, and marking the current sensor data;
selecting sensor data adjacent to the current sensor data position from the sensor data which are not marked, and marking the selected sensor data;
and forming a connected domain by the current sensor data and the selected sensor data, and taking the connected domain as a connected domain step detection result.
Further, the connected component step detection result includes at least one connected component, and the extraction unit 4 is specifically configured to:
determining the number of sensor data in a connected domain;
obtaining the pixel area of the Nth-frame connected domain according to the number of the sensor data;
and obtaining the average signal intensity of the Nth-frame connected domain according to the amplitude values corresponding to the sensor data.
Further, the comparing unit 5 is specifically configured to:
comparing the pixel area and the average signal intensity of the connected domain of the Nth frame with the pixel area and the average signal intensity of the connected domain at the same position in the (N-1)th frame;
if the pixel area of the connected domain of the Nth frame is not smaller than the pixel area of the connected domain of the (N-1)th frame, and the average signal intensity of the connected domain of the Nth frame is larger than the average signal intensity of the connected domain of the (N-1)th frame, determining that the step of the object to be detected is in a falling state;
if the pixel area of the connected domain of the Nth frame is equal to the pixel area of the connected domain of the (N-1)th frame, and the average signal intensity of the connected domain of the Nth frame is equal to the average signal intensity of the connected domain of the (N-1)th frame, determining that the step of the object to be detected is in a staying state;
and if the pixel area of the connected domain of the Nth frame is not larger than the pixel area of the connected domain of the (N-1)th frame, and the average signal intensity of the connected domain of the Nth frame is smaller than the average signal intensity of the connected domain of the (N-1)th frame, determining that the step of the object to be detected is in a lifting state.
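The three comparison rules above translate directly into code. The sketch below is illustrative; the `"undetermined"` fallback for combinations the patent does not enumerate (for example, area growing while intensity drops) is an assumption, not part of the described method:

```python
def step_state(area_n, mean_n, area_prev, mean_prev):
    """Classify a connected domain's step phase by comparing frame N with frame N-1."""
    if area_n >= area_prev and mean_n > mean_prev:
        return "falling"    # foot pressing down: area non-decreasing, intensity rising
    if area_n == area_prev and mean_n == mean_prev:
        return "staying"    # no change between frames
    if area_n <= area_prev and mean_n < mean_prev:
        return "lifting"    # foot lifting off: area non-increasing, intensity dropping
    return "undetermined"   # combinations the patent leaves open (assumption)
```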
The embodiment of the invention provides a pedestrian step state identification device which: collects an Nth frame data matrix; performs image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix; performs connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result; performs image feature extraction on the connected domain step detection result to obtain feature data of the Nth frame connected domain; and compares the feature data of the connected domain of the Nth frame with the feature data of the connected domain of the (N-1)th frame, determining the step state of the object to be detected according to the comparison result. The walking state of a pedestrian can thereby be analyzed so as to monitor pedestrian behavior.
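Putting the summarized steps together, a self-contained end-to-end sketch over two consecutive frames might look like this. It is illustrative only: it assumes 4-adjacency, compares only the largest connected domain of each frame rather than matching domains at the same position, and uses hypothetical names throughout:

```python
from collections import deque

def classify_step(prev_frame, cur_frame, threshold):
    """Binarize, label connected domains, extract features, and compare frames."""
    def largest_domain(frame):
        # Return (pixel_area, mean_intensity) of the largest connected domain
        # after thresholding, using flood fill with 4-adjacency (assumption).
        h, w = len(frame), len(frame[0])
        amp = [[v if v > threshold else 0 for v in row] for row in frame]
        seen = [[False] * w for _ in range(h)]
        best = (0, 0.0)
        for r in range(h):
            for c in range(w):
                if amp[r][c] == 0 or seen[r][c]:
                    continue
                q, cells = deque([(r, c)]), []
                seen[r][c] = True
                while q:
                    cr, cc = q.popleft()
                    cells.append((cr, cc))
                    for nr, nc in ((cr - 1, cc), (cr + 1, cc), (cr, cc - 1), (cr, cc + 1)):
                        if 0 <= nr < h and 0 <= nc < w and amp[nr][nc] and not seen[nr][nc]:
                            seen[nr][nc] = True
                            q.append((nr, nc))
                feat = (len(cells), sum(amp[i][j] for i, j in cells) / len(cells))
                if feat[0] > best[0]:
                    best = feat
        return best

    (area0, mean0) = largest_domain(prev_frame)   # frame N-1
    (area1, mean1) = largest_domain(cur_frame)    # frame N
    if area1 >= area0 and mean1 > mean0:
        return "falling"
    if area1 == area0 and mean1 == mean0:
        return "staying"
    if area1 <= area0 and mean1 < mean0:
        return "lifting"
    return "undetermined"   # fallback for cases the patent leaves open (assumption)
```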
The embodiment of the present invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the steps of the pedestrian step state identification method provided in the above embodiment are implemented.
The embodiment of the present invention further provides a computer-readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the steps of the pedestrian step state identification method according to the above embodiment.
The computer program product provided in the embodiment of the present invention includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the method embodiment, which is not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical or an electrical connection; as a direct connection, an indirect connection through an intervening medium, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific case.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present invention, used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be covered thereby. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. A method for identifying a step state of a pedestrian, the method comprising:
collecting an Nth frame data matrix;
carrying out image binarization processing on the Nth frame data matrix to obtain a signal amplitude data matrix;
performing connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result;
performing image feature extraction on the step detection result of the connected domain to obtain feature data of the N-th frame of connected domain;
and comparing the characteristic data of the connected domain of the Nth frame with the characteristic data of the connected domain of the (N-1) th frame, and determining the step state of the object to be detected according to the comparison result.
2. The method for identifying the step status of the pedestrian according to claim 1, wherein the data matrix of the nth frame includes a plurality of sensor data, and the performing the image binarization processing on the data matrix of the nth frame to obtain the signal amplitude data matrix comprises:
comparing the amplitude value corresponding to each sensor data with a set threshold value;
if the amplitude value is larger than the set threshold value, retaining the amplitude value corresponding to the sensor data;
if the amplitude value is smaller than the set threshold value, setting the amplitude value corresponding to the sensor data to be 0;
and forming the signal amplitude data matrix by using the sensor data with the reserved amplitude value and the sensor data set to be 0.
3. The method for identifying the step status of the pedestrian according to claim 1, wherein the performing connected component processing on the signal amplitude data matrix to obtain the result of the connected component step detection comprises repeatedly performing the following processing until the sensor data in the signal amplitude data matrix are traversed:
selecting any sensor data from the signal amplitude data matrix as current sensor data, and marking the current sensor data;
selecting sensor data adjacent to the current sensor data position from the sensor data which are not marked, and marking the selected sensor data;
and forming the connected domain by the current sensor data and the selected sensor data, and taking the connected domain as the step detection result of the connected domain.
4. The method according to claim 1, wherein the connected component step detection result includes at least one connected component, and the performing image feature extraction on the connected component step detection result to obtain feature data of the N-th frame of connected components includes:
determining the number of sensor data in the connected domain;
obtaining the pixel area of the Nth frame connected domain according to the number of the sensor data;
and obtaining the average signal intensity of the Nth frame connected domain according to the amplitude values corresponding to the sensor data.
5. The pedestrian step state identification method according to claim 4, wherein the comparing the feature data of the connected domain of the Nth frame with the feature data of the connected domain of the (N-1)th frame and determining the step state of the object to be detected according to the comparison result comprises:
comparing the pixel area and the average signal intensity of the connected domain of the Nth frame with the pixel area and the average signal intensity of the connected domain at the same position in the (N-1)th frame;
if the pixel area of the connected domain of the Nth frame is not smaller than the pixel area of the connected domain of the (N-1)th frame, and the average signal intensity of the connected domain of the Nth frame is larger than the average signal intensity of the connected domain of the (N-1)th frame, determining that the step of the object to be detected is in a falling state;
if the pixel area of the connected domain of the Nth frame is equal to the pixel area of the connected domain of the (N-1)th frame, and the average signal intensity of the connected domain of the Nth frame is equal to the average signal intensity of the connected domain of the (N-1)th frame, determining that the step of the object to be detected is in a staying state;
and if the pixel area of the connected domain of the Nth frame is not larger than the pixel area of the connected domain of the (N-1)th frame, and the average signal intensity of the connected domain of the Nth frame is smaller than the average signal intensity of the connected domain of the (N-1)th frame, determining that the step of the object to be detected is in a lifting state.
6. A device for recognizing a step status of a pedestrian, the device comprising:
the acquisition unit is used for acquiring the Nth frame data matrix;
a binarization processing unit, configured to perform image binarization processing on the nth frame data matrix to obtain a signal amplitude data matrix;
the connected domain processing unit is used for carrying out connected domain processing on the signal amplitude data matrix to obtain a connected domain step detection result;
the extraction unit is used for extracting image characteristics of the step detection result of the connected domain to obtain the characteristic data of the N-th frame of connected domain;
and the comparison unit is used for comparing the characteristic data of the connected domain of the Nth frame with the characteristic data of the connected domain of the (N-1) th frame and determining the step state of the object to be detected according to the comparison result.
7. The device according to claim 6, wherein the nth frame data matrix includes a plurality of sensor data, and the binarization processing unit is specifically configured to:
comparing the amplitude value corresponding to each sensor data with a set threshold value;
if the amplitude value is larger than the set threshold value, retaining the amplitude value corresponding to the sensor data;
if the amplitude value is smaller than the set threshold value, setting the amplitude value corresponding to the sensor data to be 0;
and forming the signal amplitude data matrix by using the sensor data with the reserved amplitude value and the sensor data set to be 0.
8. The pedestrian step state recognition device according to claim 6, wherein the connected component processing unit repeatedly performs the following processing until the sensor data in the signal magnitude data matrix are traversed:
selecting any sensor data from the signal amplitude data matrix as current sensor data, and marking the current sensor data;
selecting sensor data adjacent to the current sensor data position from the sensor data which are not marked, and marking the selected sensor data;
and forming the connected domain by the current sensor data and the selected sensor data, and taking the connected domain as the step detection result of the connected domain.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-5 when executing the computer program.
10. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method of any of claims 1-5.
CN202010965318.9A 2020-09-14 2020-09-14 Pedestrian step state identification method and device Pending CN112084980A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010965318.9A CN112084980A (en) 2020-09-14 2020-09-14 Pedestrian step state identification method and device

Publications (1)

Publication Number Publication Date
CN112084980A true CN112084980A (en) 2020-12-15

Family

ID=73736282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010965318.9A Pending CN112084980A (en) 2020-09-14 2020-09-14 Pedestrian step state identification method and device

Country Status (1)

Country Link
CN (1) CN112084980A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103984040A (en) * 2014-05-20 2014-08-13 刘达 Biological recognition method based on infrared sensor array algorithm
CN110728258A (en) * 2019-10-22 2020-01-24 杭州姿感科技有限公司 Step detection method and system based on matching of connected domains of front frame and rear frame
CN110837794A (en) * 2019-11-04 2020-02-25 杭州姿感科技有限公司 Pedestrian number statistical method and device
CN111428653A (en) * 2020-03-27 2020-07-17 湘潭大学 Pedestrian congestion state determination method, device, server and storage medium

Similar Documents

Publication Publication Date Title
CN110807385B (en) Target detection method, target detection device, electronic equipment and storage medium
CN108053653B (en) Vehicle behavior prediction method and device based on LSTM
CN111813997B (en) Intrusion analysis method, device, equipment and storage medium
CN111144337B (en) Fire detection method and device and terminal equipment
CN110826429A (en) Scenic spot video-based method and system for automatically monitoring travel emergency
CN110458126B (en) Pantograph state monitoring method and device
JP4764487B2 (en) Video surveillance system
JP2009064175A (en) Object detection device and object detection method
CN109460787A (en) IDS Framework method for building up, device and data processing equipment
CN109313806A (en) Image processing apparatus, image processing system, image processing method and program
JP5388291B2 (en) Discriminator generation method, computer program, discriminator generation device, and predetermined object detection device
CN110674680A (en) Living body identification method, living body identification device and storage medium
CN111899470B (en) Human body falling detection method, device, equipment and storage medium
CN113658192A (en) Multi-target pedestrian track acquisition method, system, device and medium
CN111523416A (en) Vehicle early warning method and device based on highway ETC portal
CN112307994A (en) Obstacle identification method based on sweeper, electronic device and storage medium
CN108629310B (en) Engineering management supervision method and device
CN112257546B (en) Event early warning method and device, electronic equipment and storage medium
CN112084980A (en) Pedestrian step state identification method and device
CN115661131A (en) Image identification method and device, electronic equipment and storage medium
CN113050063A (en) Obstacle detection method, device and equipment based on laser sensor and storage medium
CN115512306A (en) Method for early warning of violence events in elevator based on image processing
CN114943720A (en) Electric power image processing method and device
CN111814764A (en) Lost article determining system
CN112102362A (en) Pedestrian step track determination method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination