CN112836667B - Method for judging falling and reverse running of passengers on an ascending escalator - Google Patents

Method for judging falling and reverse running of passengers on an ascending escalator

Info

Publication number
CN112836667B
CN112836667B (application CN202110193028.1A)
Authority
CN
China
Prior art keywords
escalator
displacement
pixel
height
passenger
Prior art date
Legal status
Active
Application number
CN202110193028.1A
Other languages
Chinese (zh)
Other versions
CN112836667A (en)
Inventor
周耀华
左宗鹏
Current Assignee
Shanghai Jisheng Network Technology Co ltd
Original Assignee
Shanghai Jisheng Network Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Jisheng Network Technology Co ltd
Priority to CN202110193028.1A
Publication of CN112836667A
Application granted
Publication of CN112836667B
Legal status: Active

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V 20/47 Detecting features for summarising video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/53 Recognition of crowd images, e.g. recognition of crowd congestion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/44 Event detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for judging whether a passenger on an ascending escalator has fallen or is running in reverse, comprising the following steps: step 1, labeling acquired historical event videos and performing feature extraction and classifier training to obtain a model file; and step 2, extracting the real-time video and predicting events using the features and the model file from step 1. The method judges through image analysis whether passengers on the escalator are riding irregularly or whether a dangerous event has occurred; at low computational cost it distinguishes falls from reverse running in real-time monitoring video, reports to the management-center software, and enables timely intervention and guidance for dangerous behavior.

Description

Method for judging falling and reverse running of a passenger on an ascending escalator
Technical Field
The invention relates to the application of image analysis in the technical field of escalator safety, and in particular to a method for judging whether a passenger on an ascending escalator has fallen or is moving in reverse.
Background
With the development of image monitoring equipment, video has become a primary means of acquiring information, and deep-learning-based extraction of image information provides a powerful driving force for intelligent systems, with many applications deployed in the security field.
In the elevator field, image analysis is still at an early stage; a growing number of elevator manufacturers use image monitoring to assist manual management in maintenance, elevator safety management, and similar areas.
In escalator safety management, irregular riding behavior, and the serious consequences it can cause, call for timely reminders or assistance. Under purely manual management it is difficult to notice irregular riding in time to warn passengers, or to provide prompt rescue when serious consequences occur.
With image monitoring equipment analyzing the escalator in real time, irregular riding behavior and events requiring rescue can be detected automatically, replacing manual observation to remind passengers or to notify staff to provide timely rescue.
The main behaviors requiring reminders or timely rescue on an escalator are: reverse running, for which a voice device can promptly ask the passenger to turn around, and falls, for which staff can be notified in time to assist.
Using monitoring equipment to detect and distinguish these two behaviors is therefore highly desirable. Current approaches mainly fall into two categories:
1. Deep-learning-based extraction of human-body key-point positions and postures to judge whether a person has fallen. In real applications, when the escalator is crowded or the image resolution is low, it is usually difficult to extract passengers' joint points to judge their posture; moreover, such methods have a large computational load and a high deployment cost.
2. The second category of methods has a small computational load, but it is difficult to extract features that distinguish reverse running from falling and thus give staff accurate guidance.
Disclosure of Invention
The invention aims to provide a novel feature-extraction method that solves the above problems and applies image analysis in the technical field of escalator safety to distinguish falling and reverse-running events of passengers on an ascending escalator.
To achieve the above object, the present invention provides a method for determining falls and reverse running of ascending-escalator passengers, comprising: step 1, labeling acquired historical event videos and performing feature extraction and classifier training to obtain a model file; and step 2, extracting the real-time video and predicting events using the features and the model file from step 1.
In the above method for determining that an ascending-escalator passenger has fallen or is moving in reverse, the feature extraction in step 1 comprises:
step 1.1, using a given video of the fixed scene, determining a quadrilateral frame through 4 points to fix the coordinate position of the escalator in the image;
step 1.2, calibrating the horizontal angle θ_v and vertical angle θ_h between the camera and the escalator, the vertical coordinate Y_s of the upper part of the escalator, the pixel height H_s of one step at the upper part, the vertical coordinate Y_e of the lower part, and the pixel height H_e of one step at the lower part;
step 1.3, treating the change of step pixel height in the vertical direction as linear, and computing the step pixel height as a function of the vertical coordinate;
step 1.4, saving the parameters obtained in steps 1.1, 1.2 and 1.3 as an xml configuration file;
step 1.5, reading a frame of the video, cropping the escalator target area according to the frame drawn in step 1.1, and converting the target-area image to grayscale and adjusting its resolution; the resolution adjustment scales height and width together, with scale factor r = 10/H_e;
step 1.6, computing the dense optical-flow field of adjacent frames to obtain the horizontal and vertical displacement of every pixel;
step 1.7, computing the downward displacement of every pixel along the escalator direction from the optical-flow field and the calibrated horizontal angle, forming displacement map I_1, so that the extracted features adapt to different scenes;
step 1.8, normalizing displacement map I_1 by the calibrated step heights of the upper and lower parts of the escalator to form normalized displacement map I_2, using 1/100 of a step height as the displacement unit: I_2 = 100 * I_1 / f_step(i);
step 1.9, computing the displaced-object mask map: pixels of displacement map I_2 whose displacement direction is not downward are set to 0, and all others to 255, forming mask map I_3:
I_3(i,j) = 255 if I_2(i,j) is downward, otherwise 0;
step 1.10, obtaining the pixel area of the displaced object by counting the non-0 pixels of I_3, forming area feature F_n0 as the 1st feature:
F_n0 = Σ_{i,j} I_3(i,j)/255, where n is the frame number;
step 1.11, AND-ing mask map I_3 with I_2 to generate the downward-object displacement map I_4, I_4 = I_3 & I_2;
step 1.12, computing the mean and variance of the displacements of the non-0 points of I_4, forming mean and variance features F_n1 and F_n2 as the 2nd and 3rd features;
step 1.13, computing the minimum bounding rectangle of the non-0 region of I_4, with height H and width W;
step 1.14, dividing the minimum bounding rectangle into 11 equal parts in the vertical direction, each of height H/11 and width W, and recording the median of all downward displacements in each part as Z_k;
step 1.15, forming the differences Z_{k+1} - Z_k as feature values F_n3 to F_n12, the 4th to 13th features;
step 1.16, recording the vertical coordinate y of the center of the minimum bounding rectangle in the image;
step 1.17, computing the variance of F_n1 over the latest 24 frames as the 14th feature F_n13, a time-series variation feature;
step 1.18, from the recorded vertical coordinates of the bounding-rectangle center, computing the relative change of the center point over the latest 24 frames, subtracting the previous frame's center from the next frame's center, as the 15th feature;
and step 1.19, combining the 15 features of the latest 24 frames into a 24×15 feature matrix as the extracted features.
In the above method, in step 1.3, the height correspondence is:
f_step(i) = H_s + (i - Y_s) * (H_e - H_s) / (Y_e - Y_s)
where i is the vertical coordinate in the image and f_step(i) is the pixel height of a step at coordinate i.
In the above method, in step 1.6, v_{i,j} denotes the horizontal displacement and h_{i,j} the vertical displacement of the corresponding pixel, both obtained from the dense optical-flow field (the formula is given only as an image in the original publication).
In the above method, in step 1.7, the displacement along the escalator direction is obtained by projecting each pixel's flow vector (v_{i,j}, h_{i,j}) onto the escalator axis using the calibrated horizontal angle θ_v (the transformation formula is given only as an image in the original publication).
In the above method, in step 1.12, the mean and variance features F_n1 and F_n2 are computed as:
F_n1 = Σ_{i,j} I_4(i,j) / F_n0
F_n2 = Σ_{(i,j): I_4(i,j) ≠ 0} (I_4(i,j) - F_n1)^2 / F_n0
the method for judging whether the passenger falls down or goes backwards in the ascending escalator, wherein in the step 1.17, the 14 th characteristic F n13 Calculated by the following formula:
Figure BDA0002945899010000062
the method for judging whether the passenger on the ascending escalator falls down to move in the reverse direction comprises the following steps of 1: step 1.20, collecting a retrograde motion tumbling video of the previous escalator passengers, and marking the video as retrograde motion or tumbling; step 1.21, extracting characteristics of each video according to a characteristic extraction method in the text; step 1.22, inputting the extracted characteristics and calibration results of each video into an SVM classifier for training; and step 1.23, storing the obtained classification model file after training is finished.
In the above method, in step 2, predicting on the real-time video comprises: step 2.1, acquiring images from the monitoring camera in real time via the RTSP protocol; step 2.2, initializing an SVM classifier with the model file obtained during classifier training; step 2.3, obtaining the feature matrix by the extraction procedure of step 1; step 2.4, inputting the feature matrix into the SVM classifier to obtain the predicted classification result; step 2.5, notifying the monitoring-center management system of the result via HTTP; and step 2.6, the management system responding and prompting according to the received event and state.
In the above method, in step 2.6, the corresponding processing and prompting comprise: a. passenger running in reverse: the on-site voice device plays a warning asking the passenger to turn back; b. passenger falling: the management software raises a pop-up alarm and staff intervene.
The method for judging falls and reverse running of ascending-escalator passengers provided by the invention has the following advantages:
it judges through image analysis whether passengers on the escalator are riding irregularly or whether a dangerous event has occurred; at low computational cost it distinguishes falls from reverse running in the real-time monitoring video, reports to the management-center software, and enables timely intervention and guidance for dangerous behavior.
Drawings
Fig. 1 is the feature-extraction flowchart of the method for judging falls and reverse running of ascending-escalator passengers according to the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings.
The invention provides a method for judging whether a passenger on an ascending escalator has fallen or is moving in reverse, comprising the following steps:
step 1, marking according to an acquired historical event video, and performing feature extraction and classifier training to acquire a model file; and 2, extracting the real-time video, and predicting the event by using the characteristics and the model file extracted in the step 1.
As shown in fig. 1, the step 1 of performing feature extraction includes:
Step 1.1, using a given video of the fixed scene, determine a quadrilateral frame through 4 points to fix the coordinate position of the escalator in the image.
Step 1.2, calibrate the horizontal angle θ_v and vertical angle θ_h between the camera and the escalator, the vertical coordinate Y_s of the upper part of the escalator, the pixel height H_s of one step at the upper part, the vertical coordinate Y_e of the lower part, and the pixel height H_e of one step at the lower part.
Step 1.3, treat the change of step pixel height in the vertical direction as linear, and compute the step pixel height as a function of the vertical coordinate; the height correspondence is:
f_step(i) = H_s + (i - Y_s) * (H_e - H_s) / (Y_e - Y_s)
where i is the vertical coordinate in the image and f_step(i) is the pixel height of a step at coordinate i.
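The linear height correspondence of step 1.3 can be sketched as follows (a minimal Python sketch; the calibration values are hypothetical, not taken from the patent):

```python
def f_step(i, y_s, h_s, y_e, h_e):
    """Pixel height of an escalator step at vertical image coordinate i,
    linearly interpolated between the calibrated top (y_s, h_s) and
    bottom (y_e, h_e) of the escalator."""
    return h_s + (i - y_s) * (h_e - h_s) / (y_e - y_s)

# Illustrative calibration: top of the escalator at row 100 with 8-px steps,
# bottom at row 500 with 24-px steps (values are hypothetical).
print(f_step(100, 100, 8.0, 500, 24.0))  # 8.0 at the top
print(f_step(500, 100, 8.0, 500, 24.0))  # 24.0 at the bottom
print(f_step(300, 100, 8.0, 500, 24.0))  # 16.0 halfway down
```

The interpolation reproduces the calibrated heights at both ends and varies linearly in between, which is exactly the assumption stated in step 1.3.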
Step 1.4, save the parameters obtained in steps 1.1, 1.2 and 1.3 as an xml configuration file, so that later runs do not need to repeat the calibration and calculation.
Step 1.5, read a frame of the video, crop the escalator target area according to the frame drawn in step 1.1, and convert the target-area image to grayscale and adjust its resolution; the resolution adjustment scales height and width together, with scale factor r = 10/H_e.
Step 1.6, compute the dense optical-flow field of adjacent frames to obtain the horizontal and vertical displacement of every pixel; v_{i,j} denotes the horizontal displacement and h_{i,j} the vertical displacement of the corresponding pixel, both obtained from the optical-flow computation (the formula is given only as an image in the original publication).
Step 1.7, compute the downward displacement of every pixel along the escalator direction from the optical-flow field and the calibrated horizontal angle, forming displacement map I_1, so that the extracted features adapt to different scenes; the displacement is obtained by projecting each pixel's flow vector onto the escalator axis using θ_v (the transformation formula is given only as an image in the original publication).
step 1.8, according to the calibrated height normalized displacement chart I of the steps of the upper escalator and the lower escalator 1 Forming a normalized displacement map I 2 Calculating I by using 1/100 step height as a displacement unit 2 ,I 2 =100*I 1 /f step (i)。
Step 1.9, compute the displaced-object mask map: pixels of displacement map I_2 whose displacement direction is not downward are set to 0, and all others to 255, forming mask map I_3:
I_3(i,j) = 255 if I_2(i,j) is downward, otherwise 0.
Step 1.10, obtain the pixel area of the displaced object by counting the non-0 pixels of I_3, forming area feature F_n0 as the 1st feature; F_n0 = Σ_{i,j} I_3(i,j)/255, where n is the frame number.
Step 1.11, AND mask map I_3 with I_2 to generate the downward-object displacement map I_4, I_4 = I_3 & I_2.
Step 1.12, compute the mean and variance of the displacements of the non-0 points of I_4, forming mean and variance features F_n1 and F_n2 as the 2nd and 3rd features, computed by:
F_n1 = Σ_{i,j} I_4(i,j) / F_n0
F_n2 = Σ_{(i,j): I_4(i,j) ≠ 0} (I_4(i,j) - F_n1)^2 / F_n0
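Steps 1.9 through 1.12 can be sketched on a tiny displacement map. The sign convention (positive values mean downward motion) is an assumption, as is representing the maps as nested Python lists:

```python
def downward_features(i2):
    """Steps 1.9-1.12 on a small displacement map I2 (list of rows).
    Assumption: positive values mean downward motion."""
    i3 = [[255 if d > 0 else 0 for d in row] for row in i2]   # mask map I3
    i4 = [[d if m else 0 for d, m in zip(r2, r3)]             # I4 = I3 & I2
          for r2, r3 in zip(i2, i3)]
    vals = [d for row in i4 for d in row if d != 0]
    f_n0 = len(vals)                                          # area feature
    f_n1 = sum(vals) / f_n0 if f_n0 else 0.0                  # mean displacement
    f_n2 = (sum((d - f_n1) ** 2 for d in vals) / f_n0         # displacement variance
            if f_n0 else 0.0)
    return f_n0, f_n1, f_n2

i2 = [[0, 10, 20],
      [0, 30, -5],   # -5 is upward motion and is masked out
      [0, 0, 0]]
print(downward_features(i2))  # (3, 20.0, 66.666...)
```

Only pixels moving in the escalator's downward direction contribute to the area, mean, and variance features, which is why upward-moving passengers on a correctly used ascending escalator leave these features near zero.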
Step 1.13, compute the minimum bounding rectangle of the non-0 region of I_4, obtaining a rectangle of height H and width W.
Step 1.14, divide the minimum bounding rectangle into 11 equal parts in the vertical direction, each of height H/11 and width W, and record the median of all downward displacements in each part as Z_k.
Step 1.15, form the differences Z_{k+1} - Z_k as feature values F_n3 to F_n12, the 4th to 13th features.
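Steps 1.13 to 1.15 can be sketched as follows; the banding of the bounding box and the median differences follow the text, while the handling of empty bands (median 0.0) is an assumption:

```python
from statistics import median

def block_median_features(i4, n_blocks=11):
    """Steps 1.13-1.15: bounding box of the non-zero rows of I4, split into
    n_blocks vertical bands, per-band median of downward displacement, and
    the n_blocks - 1 differences Z[k+1] - Z[k] as features F_n3..F_n12."""
    rows = [r for r, row in enumerate(i4) if any(row)]
    if not rows:
        return [0.0] * (n_blocks - 1)
    top, bottom = rows[0], rows[-1]
    h = bottom - top + 1
    z = []
    for k in range(n_blocks):
        r0 = top + k * h // n_blocks
        r1 = top + (k + 1) * h // n_blocks
        band = [d for row in i4[r0:max(r1, r0 + 1)] for d in row if d != 0]
        z.append(median(band) if band else 0.0)
    return [z[k + 1] - z[k] for k in range(n_blocks - 1)]

# Synthetic I4: 22 rows whose displacement grows by 1 per row, so every
# band-to-band median difference is 2.0.
i4 = [[k + 1] * 2 for k in range(22)]
print(block_median_features(i4))  # ten differences, each 2.0
```

The ten differences describe how the downward displacement varies along the body of the moving object, which is what lets the classifier separate a rigid reverse-runner from the tumbling profile of a fall.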
Step 1.16, record the vertical coordinate y of the center of the minimum bounding rectangle in the image.
Step 1.17, compute the variance of F_n1 over the latest 24 frames as the 14th feature F_n13, a time-series variation feature, computed by:
F_n13 = (1/24) Σ_{m=n-23}^{n} (F_{m,1} - mean)^2, where the mean is taken over F_{m,1} of the same 24 frames.
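The time-series feature of step 1.17 reduces to the variance of F_n1 over a 24-frame window, which can be sketched as:

```python
def temporal_variance(f_n1_history, window=24):
    """Step 1.17: variance of the mean-displacement feature F_n1 over the
    most recent `window` frames, used as the 14th feature F_n13."""
    recent = f_n1_history[-window:]
    mean = sum(recent) / len(recent)
    return sum((x - mean) ** 2 for x in recent) / len(recent)

# A steady escalator gives near-zero variance; a sudden change (as in a
# fall) produces a spike. Values here are illustrative.
print(temporal_variance([5.0] * 24))           # 0.0
print(temporal_variance([5.0] * 23 + [29.0]))  # 23.0
```

A smooth reverse run keeps F_n1 roughly constant, while a fall makes it jump, so this single variance captures much of the temporal difference between the two events.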
step 1.18, calculating the relative change of the central point of the nearest 24 frames according to the recorded vertical coordinate of the central point of the minimum circumscribed rectangle, and subtracting the central point of the circumscribed rectangle of the previous frame from the central point of the circumscribed rectangle of the next frame to be used as the 15 th feature.
Step 1.19, combine the 15 features of the latest 24 frames into a 24×15 feature matrix as the extracted features.
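The sliding 24-frame window of step 1.19 (and S18 of the embodiment below) can be sketched with a bounded deque; the class name is illustrative:

```python
from collections import deque

class FeatureWindow:
    """Step 1.19: keep the per-frame 15-feature vectors for the most recent
    24 frames and expose them as a 24x15 matrix once the window is full."""
    def __init__(self, window=24, n_features=15):
        self.window = window
        self.n_features = n_features
        self.frames = deque(maxlen=window)  # old frames drop off automatically

    def push(self, features):
        assert len(features) == self.n_features
        self.frames.append(list(features))

    def matrix(self):
        if len(self.frames) < self.window:
            return None  # not enough history yet (compare S18 of the embodiment)
        return [list(f) for f in self.frames]

w = FeatureWindow()
for n in range(30):                 # simulate 30 frames of features
    w.push([float(n)] * 15)
m = w.matrix()
print(len(m), len(m[0]))            # 24 15
print(m[0][0], m[-1][0])            # 6.0 29.0 (only the last 24 frames remain)
```

Returning None until 24 frames have accumulated mirrors the embodiment's rule of looping back to optical-flow computation while fewer than 24 frames of history exist.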
In step 1, the classifier training comprises: step 1.20, collecting historical videos of escalator passengers running in reverse or falling, and labeling each video as reverse running or falling; step 1.21, extracting the features of each video by the feature-extraction method above; step 1.22, inputting the extracted features and labels of each video into an SVM classifier for training; and step 1.23, saving the resulting classification model file once training is finished.
The SVM (Support Vector Machine) is among the most robust and accurate of the well-known data-mining algorithms. It is a binary classification algorithm that supports both linear and nonlinear classification. Its basic model is the linear classifier with the maximum margin in feature space; with the kernel trick, an SVM can also perform nonlinear classification.
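The patent trains an SVM on the extracted features. As a stdlib-only illustration of the linear decision rule an SVM ultimately applies (the weights and threshold here are hypothetical, not a trained model):

```python
def svm_decision(x, w, b):
    """Linear SVM decision rule: sign(w . x + b). A trained SVM chooses w
    and b to maximize the margin between the two classes; here they are
    hypothetical values chosen for illustration only."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "fall" if score > 0 else "reverse-run"

# Hypothetical 2-feature example where a large displacement variance
# suggests a fall: w weights only the variance feature, threshold 10.
w, b = [0.0, 1.0], -10.0
print(svm_decision([3.0, 25.0], w, b))  # fall
print(svm_decision([3.0, 4.0], w, b))   # reverse-run
```

In practice the classifier operates on the full 24×15 feature matrix (flattened or kernelized), but the decision principle is the same signed distance to a separating surface.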
In step 2, predicting on the real-time video comprises: step 2.1, acquiring images from the monitoring camera in real time via the RTSP protocol; step 2.2, initializing an SVM classifier with the model file obtained during classifier training; step 2.3, obtaining the feature matrix by the extraction procedure of step 1; step 2.4, inputting the feature matrix into the SVM classifier to obtain the predicted classification result; step 2.5, notifying the monitoring-center management system software of the result via HTTP; and step 2.6, the management system software responding and prompting according to the received event and state. The specific processing and prompting comprise: a. passenger running in reverse: the on-site voice device plays a warning asking the passenger to turn back; b. passenger falling: the management software raises a pop-up alarm and staff intervene.
RTSP (Real Time Streaming Protocol, RFC 2326) is an application-layer protocol in the TCP/IP protocol suite. It defines how one-to-many applications can efficiently transmit multimedia data over IP networks.
The method for judging falls and reverse running of ascending-escalator passengers provided by the invention is further described with reference to the following embodiment.
Example 1
A method for determining a fall and a reverse of an escalator passenger, comprising:
s1, a camera is installed below the escalator at a distance of 3 meters from the horizontal position of the front edge plate, the left-right deviation angle is smaller than 25 degrees, and the vertical height is smaller than 4 meters.
S2, an X86 server is deployed, running Windows Server 2012 or Windows Server 2016.
And S3, the server accesses the camera through a TCP/IP network, fills the RTSP address of the camera in the configuration interface and acquires the real-time video stream of the camera.
And S4, decoding the obtained video image into a YUV420 format.
S5, framing the positions of the left handrail belt, the right handrail belt, the upper edge and the lower edge of the escalator on the image to obtain the position of the escalator in the image.
S6, the horizontal angle between the escalator and the camera, the step height at the topmost part of the escalator, and the step height at the bottommost part are filled in.
And S7, storing the parameters into an xml file and taking the parameters as initialized parameters.
S8, the acquired image is scaled; the scale factor is the fixed normalized step height (10 pixels in this embodiment) divided by the actual average step height.
And S9, judging whether the currently acquired image frame is a first frame or not, if so, caching the image frame without processing.
And S10, if the current frame is not the first frame, acquiring the image cached in the previous frame and calculating the optical flow field image together with the current frame.
S11, the direction and distance of motion of each pixel in the acquired optical-flow image are normalized as follows:
A. The displacement of the point along the escalator direction is computed from the horizontal angle between the escalator and the camera.
B. Taking one hundredth of the normalized step height as unit 1, the displacement of each pixel is discretized according to I_2 = 100 * I_1 / f_step(i).
S12, after normalizing the optical-flow map, the direction of each pixel is checked one by one: if it moves upward, its displacement value is set to 0; otherwise its displacement along the escalator direction is computed from the horizontal angle of the escalator and used as the pixel's value, producing the final displacement map.
S13, a black-and-white mask image is generated from the displacement map, in which 0 marks pixels not moving downward and 255 marks pixels moving downward.
S14, the number of non-zero pixels in the displacement map is counted as the pixel-area feature F_n0 of the downward-moving object. The sum of all non-zero pixels in the displacement map is computed to obtain the mean displacement feature F_n1 of those points, and then all non-0 pixels are traversed to compute the displacement variance as feature F_n2.
S15, the minimum bounding rectangle of the non-0 region of the displacement map is computed and divided into 11 equal parts in the vertical direction; the displacement median of each sub-region is computed, and each lower sub-rectangle's median minus the adjacent upper one gives the features [F_n3, F_n4, F_n5, F_n6, F_n7, F_n8, F_n9, F_n10, F_n11, F_n12].
S16, the current frame's features F_n0 to F_n12 are cached.
S17, the center point of the minimum bounding rectangle is recorded.
S18, the features of the 23 frames before the current frame are retrieved. If fewer than 23 frames are available, the process returns to S10; otherwise the current frame's features are combined with those of the previous 23 frames into a 24 × 13 feature matrix.
S19, the mean of F_n1 over the 24 frames is computed, and each frame's squared deviation from that mean is taken as the frame's 14th feature value.
S20, within the 24 frames, the difference between the bounding-rectangle center points of each adjacent pair of frames (next minus previous) is computed as the 15th feature.
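The 15th feature of S20 (and step 1.18) is the frame-to-frame difference of the bounding-box center's vertical coordinate, which can be sketched as:

```python
def center_point_diffs(centers):
    """S20 / step 1.18: vertical movement of the bounding-box center between
    consecutive frames (next minus previous), the 15th per-frame feature."""
    return [b - a for a, b in zip(centers, centers[1:])]

# A passenger carried upward has a steadily decreasing row coordinate;
# a fall shows a sudden positive (downward) jump. Values are illustrative.
print(center_point_diffs([200, 195, 190, 230]))  # [-5, -5, 40]
```

The sign and size of these differences let the classifier separate the smooth downward drift of a reverse runner from the abrupt jump of a fall.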
S21, the 24 × 15 feature matrix is input into the SVM classifier for classification.
S22, the judged event type is pushed to the management software via HTTP, and the management software responds according to the event type.
The method provided by the invention is implemented by deploying an image-processing server running image monitoring and analysis software, with the monitoring camera, image-processing server, and monitoring-center management software communicating over Ethernet. At low computational cost it distinguishes falls from reverse running in the real-time monitoring video, reports to the management-center software, and intervenes in and guides dangerous behavior in time.
While the present invention has been described in detail with reference to the preferred embodiments, it should be understood that the above description should not be taken as limiting the invention. Various modifications and alterations to this invention will become apparent to those skilled in the art upon reading the foregoing description. Accordingly, the scope of the invention should be determined from the following claims.

Claims (9)

1. A method for determining falls and reverse running of ascending-escalator passengers, the method comprising:
step 1, marking according to the obtained historical event video, and performing feature extraction and classifier training to obtain a model file;
step 2, extracting a real-time video, and predicting an event by using the characteristics and the model file extracted in the step 1;
in the step 1, the feature extraction includes:
step 1.1, for a given video of the determined scene, determining a quadrilateral frame through 4 points to fix the coordinate position of the escalator in the image;
step 1.2, calibrating the horizontal angle θ_v and the vertical angle θ_h between the camera and the escalator, the vertical coordinate Y_s of the upper part of the escalator, the pixel height H_s of one step at the upper part, the vertical coordinate Y_e of the lower part of the escalator, and the pixel height H_e of one step at the lower part;
step 1.3, treating the change of step pixel height in the vertical direction as a linear relation, and calculating the step pixel height from the vertical coordinate;
step 1.4, saving the parameters obtained in steps 1.1 to 1.3 as an xml configuration file;
step 1.5, reading one frame of image, cropping the escalator target region according to the frame drawn in step 1.1, converting the region image to grayscale and adjusting its resolution, the adjustment scaling height and width by the same ratio; the scaling ratio is calculated as r = 10/H_e;
step 1.6, calculating the dense optical flow field between adjacent frames to obtain the displacement of each pixel in the horizontal and vertical directions;
step 1.7, calculating the downward displacement distance of every pixel along the escalator direction from the optical flow field map and the calibrated horizontal angle to form a displacement map I_1, which adapts the extracted features to different scenes;
step 1.8, normalizing the displacement map I_1 by the calibrated step heights of the upper and lower parts of the escalator to form a normalized displacement map I_2, taking 1/100 of a step height as the displacement unit: I_2 = 100·I_1/f_step(i);
step 1.9, calculating a displacement object mask map: in the displacement map I_2, marking each pixel whose displacement direction is not downward as 0 and each other pixel as 255, forming a displacement mask map I_3:
I_3(i, j) = 255 if the displacement at (i, j) is downward, and I_3(i, j) = 0 otherwise;
step 1.10, obtaining the pixel area of the displacement object by counting the non-0 pixels of I_3 to form an area feature F_n0 as the 1st feature:
F_n0 = Σ_{i,j} I_3(i, j)/255, where n is the frame number;
step 1.11, performing an AND operation between the mask map I_3 and I_2 to generate a downward object displacement map I_4: I_4 = I_3 & I_2;
step 1.12, calculating the mean and variance of the displacements at the non-0 points of the displacement map I_4 to form mean and variance features F_n1 and F_n2 as the 2nd and 3rd features;
step 1.13, calculating the minimal circumscribed rectangle of the non-0 region of I_4, the obtained rectangle having height H and width W;
step 1.14, dividing the minimal circumscribed rectangle into 11 equal parts in the vertical direction, each block having height H/11 and width W, and taking the median of all downward displacements in each block, denoted Z_k;
step 1.15, taking Z_{k+1} - Z_k to form feature values F_n3 to F_n12 as the 4th to 13th features;
step 1.16, recording the vertical coordinate y of the center of the minimal circumscribed rectangle in the image;
step 1.17, calculating the variance of F_n1 over the latest 24 frames as the 14th feature F_n13, a time-series variation feature;
step 1.18, calculating the relative change of the rectangle center over the latest 24 frames from the recorded vertical coordinates, subtracting the previous frame's center from the following frame's center, as the 15th feature;
step 1.19, combining the 15 features of the latest 24 frames into a 24×15 feature matrix to obtain the extracted feature matrix.
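As an illustrative, non-authoritative sketch, steps 1.9 to 1.15 above can be expressed with NumPy; the downward-positive sign convention and all function names are assumptions, not part of the claim:

```python
import numpy as np

def block_features(I2):
    """Sketch of steps 1.9-1.15 (downward displacement assumed positive).
    Returns the area feature F_n0, the masked map I_4, and the ten
    block-median differences Z_{k+1} - Z_k."""
    I3 = np.where(I2 > 0, 255, 0).astype(np.uint8)      # step 1.9: displacement mask map
    F_n0 = int(np.sum(I3, dtype=np.int64) // 255)       # step 1.10: non-0 pixel area
    I4 = np.where(I3 == 255, I2, 0.0)                   # step 1.11: downward object displacement map
    ys, xs = np.nonzero(I3)
    if ys.size == 0:
        return F_n0, I4, [0.0] * 10
    top, bottom = ys.min(), ys.max()                    # step 1.13: minimal circumscribed rectangle
    left, right = xs.min(), xs.max()
    H = bottom - top + 1
    Zs = []
    for k in range(11):                                 # step 1.14: 11 equal vertical blocks
        r0 = top + k * H // 11
        r1 = top + (k + 1) * H // 11
        block = I4[r0:max(r1, r0 + 1), left:right + 1]
        vals = block[block != 0]
        Zs.append(float(np.median(vals)) if vals.size else 0.0)
    diffs = [Zs[k + 1] - Zs[k] for k in range(10)]      # step 1.15: F_n3 .. F_n12
    return F_n0, I4, diffs
```

A uniformly moving object yields near-zero block-median differences, while a tumbling passenger produces uneven per-block medians, which is what features F_n3 to F_n12 capture.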
2. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein in step 1.3 the height correspondence is:
f_step(i) = H_s + (i - Y_s)·(H_e - H_s)/(Y_e - Y_s),
where i is the vertical coordinate in the image and f_step(i) is the pixel height of a step at coordinate i.
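A minimal sketch of the linear step-height correspondence of step 1.3, assuming simple linear interpolation between the calibrated upper point (Y_s, H_s) and lower point (Y_e, H_e) of step 1.2:

```python
def f_step(i, Ys, Hs, Ye, He):
    """Pixel height of an escalator step at vertical image coordinate i,
    linearly interpolated between the calibrated upper point (Ys, Hs)
    and lower point (Ye, He)."""
    return Hs + (i - Ys) * (He - Hs) / (Ye - Ys)
```

For example, with Y_s = 100, H_s = 10, Y_e = 300, H_e = 30, the step height grows linearly from 10 px at the top of the escalator to 30 px at the bottom, reflecting perspective in the camera image.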
3. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein in step 1.6, v_{i,j} represents the horizontal displacement of the corresponding pixel and h_{i,j} represents its vertical displacement; the displacements are obtained from the dense optical flow field between adjacent frames (the formula is given as image FDA0003837572150000041 in the original and is not reproduced here).
4. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein in step 1.7, the displacement transformation projects each pixel's optical flow displacement onto the escalator direction using the calibrated horizontal angle (the formula is given as image FDA0003837572150000042 in the original and is not reproduced here).
5. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein in step 1.12 the mean and variance features F_n1, F_n2 are calculated by:
F_n1 = Σ_{i,j} I_4(i, j)/F_n0,
F_n2 = Σ_{I_4(i,j)≠0} (I_4(i, j) - F_n1)²/F_n0.
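Assuming the normalization by the non-0 pixel count F_n0 described in claim 1, the mean and variance features could be computed as:

```python
import numpy as np

def mean_var_features(I4, F_n0):
    """Mean F_n1 and variance F_n2 of the non-0 displacements of I_4,
    normalized by the non-0 pixel count F_n0 (step 1.12)."""
    F_n1 = float(np.sum(I4)) / F_n0      # zeros do not contribute to the sum
    nz = I4[I4 != 0]
    F_n2 = float(np.sum((nz - F_n1) ** 2)) / F_n0
    return F_n1, F_n2
```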
6. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein in step 1.17 the 14th feature F_n13 is the variance of F_n1 over the latest 24 frames:
F_n13 = (1/24)·Σ_{m=n-23..n} (F_m1 - mean)², where the mean of F_m1 is taken over the same 24 frames.
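A sketch of the 24-frame time-series variance, assuming a simple population variance over the stored F_n1 history:

```python
import numpy as np

def time_series_variance(f_n1_history, window=24):
    """14th feature F_n13: variance of F_n1 over the latest `window`
    frames (step 1.17); population variance is assumed."""
    recent = np.asarray(f_n1_history[-window:], dtype=float)
    return float(np.var(recent))
```

A steady displacement mean yields a near-zero F_n13, while the abrupt changes of a fall inflate it, which is why it serves as a time-series variation feature.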
7. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein the classifier training of step 1 comprises:
step 1.20, collecting historical videos of escalator passengers falling or running in reverse, and labelling each video as reverse running or falling;
step 1.21, extracting the features of each video by the feature extraction method described above;
step 1.22, inputting the extracted features and labels of each video into an SVM classifier for training;
step 1.23, saving the obtained classification model file after training is finished.
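Steps 1.20 to 1.23 could be sketched with scikit-learn's SVC; the RBF kernel, the 0/1 label encoding, and the pickle model format are assumptions, since the claim only specifies an SVM classifier and a saved model file:

```python
import pickle
import numpy as np
from sklearn.svm import SVC

def train_classifier(feature_mats, labels, model_path):
    """Train an SVM on flattened 24x15 feature matrices and save the
    model file (steps 1.20-1.23 sketch). Label encoding assumed:
    0 = reverse running, 1 = fall."""
    X = np.array([m.reshape(-1) for m in feature_mats])  # each row: 24*15 = 360 values
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    with open(model_path, "wb") as f:                    # step 1.23: save model file
        pickle.dump(clf, f)
    return clf
```

At prediction time (step 2.2), the saved file is loaded with `pickle.load` to initialize the classifier.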
8. The method for judging falling and reverse running of an escalator passenger according to claim 1, wherein the prediction on the real-time video in step 2 comprises:
step 2.1, acquiring images from the monitoring camera in real time over RTSP (Real Time Streaming Protocol);
step 2.2, initializing an SVM classifier with the model file obtained during classifier training;
step 2.3, obtaining a feature matrix using the extraction method of step 1;
step 2.4, inputting the feature matrix into the SVM classifier to obtain a predicted classification result;
step 2.5, notifying the management system of the monitoring center of the obtained result over HTTP;
step 2.6, the management system performing corresponding processing and prompting according to the received event and state.
9. The method for judging falling and reverse running of an escalator passenger according to claim 8, wherein the corresponding processing and prompting in step 2.6 comprise:
a. the passenger runs in reverse: an on-site voice warning asks the passenger to turn back;
b. the passenger falls: the management software raises a pop-up alarm and staff intervene.
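The dispatch of claim 9 could be sketched as a simple event-to-action mapping; the event keys and action strings here are illustrative, not part of the claim:

```python
def handle_event(event):
    """Map a predicted event to the management system's response
    (claim 9 sketch)."""
    actions = {
        "reverse_running": ["play_onsite_voice_turn_back_warning"],
        "fall": ["popup_alarm", "staff_intervention"],
    }
    return actions.get(event, [])  # unknown events get no action
```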
CN202110193028.1A 2021-02-20 2021-02-20 Method for judging falling and reverse running of passengers going upstairs escalator Active CN112836667B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110193028.1A CN112836667B (en) 2021-02-20 2021-02-20 Method for judging falling and reverse running of passengers going upstairs escalator

Publications (2)

Publication Number Publication Date
CN112836667A CN112836667A (en) 2021-05-25
CN112836667B true CN112836667B (en) 2022-11-15

Family

ID=75934019


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113723372B (en) * 2021-11-01 2022-01-18 北京卓建智菡科技有限公司 Prompting method and device, computer equipment and computer readable storage medium
CN114882393B (en) * 2022-03-29 2023-04-07 华南理工大学 Road reverse running and traffic accident event detection method based on target detection

Citations (8)

Publication number Priority date Publication date Assignee Title
WO2009152766A1 (en) * 2008-06-17 2009-12-23 上海阿艾依智控系统有限公司 Embedded computer vision warning device for monitoring the flow of passengers on escalator
JP2012120647A (en) * 2010-12-07 2012-06-28 Alpha Co Posture detection system
CN104200490A (en) * 2014-08-14 2014-12-10 华南理工大学 Rapid retrograde detecting and tracking monitoring method under complex environment
CN109165552A (en) * 2018-07-14 2019-01-08 深圳神目信息技术有限公司 A kind of gesture recognition method based on human body key point, system and memory
CN109492575A (en) * 2018-11-06 2019-03-19 东北大学 A kind of staircase safety monitoring method based on YOLOv3
CN109726750A (en) * 2018-12-21 2019-05-07 上海三菱电梯有限公司 A kind of passenger falls down detection device and its detection method and passenger conveying appliance
CN110188644A (en) * 2019-05-22 2019-08-30 广东寰球智能科技有限公司 A kind of staircase passenger's hazardous act monitoring system and method for view-based access control model analysis
CN111144247A (en) * 2019-12-16 2020-05-12 浙江大学 Escalator passenger reverse-running detection method based on deep learning

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US11977364B2 (en) * 2018-08-24 2024-05-07 Inventio Ag Modernization method of an existing passenger transport system


Non-Patent Citations (1)

Title
Application research of video analysis technology based on the optical flow method in metro passenger flow monitoring; Chen Liang; Railway Technology Innovation; 30 Nov. 2012; full text *


Similar Documents

Publication Publication Date Title
CN112836667B (en) Method for judging falling and reverse running of passengers going upstairs escalator
Kim Real time object tracking based on dynamic feature grouping with background subtraction
WO2020087743A1 (en) Non-motor vehicle traffic violation supervision method and apparatus and electronic device
TWI452540B (en) Image based detecting system and method for traffic parameters and computer program product thereof
US8041115B2 (en) Method and apparatus for determining a classification boundary for an object classifier
Giannakeris et al. Speed estimation and abnormality detection from surveillance cameras
GB2556942A (en) Transport passenger monitoring systems
CN111046832B (en) Retrograde judgment method, device, equipment and storage medium based on image recognition
CN111797803A (en) Road guardrail abnormity detection method based on artificial intelligence and image processing
CN111368749B (en) Automatic identification method and system for stair area
CN112488042B (en) Pedestrian traffic bottleneck discrimination method and system based on video analysis
EP3156972A1 (en) Counting apparatus and method for moving objects
CN107483894A (en) Judge to realize the high ferro station video monitoring system of passenger transportation management based on scene
CN111680613A (en) Method for detecting falling behavior of escalator passengers in real time
AU2021364403A1 (en) Rail feature identification system
JP2020013206A (en) Device for detecting two-wheeled vehicle from moving image/camera, program, and system
CN117456482B (en) Abnormal event identification method and system for traffic monitoring scene
CN113380021A (en) Vehicle state detection method, device, server and computer-readable storage medium
CN114708532A (en) Monitoring video quality evaluation method, system and storage medium
CN112381031B (en) Real-time online pantograph and horn detection method based on convolutional neural network
CN112766046B (en) Target detection method and related device
CN114529979A (en) Human body posture identification system, human body posture identification method and non-transitory computer readable storage medium
JP5917303B2 (en) MOBILE BODY DETECTING DEVICE, MOBILE BODY DETECTING SYSTEM, AND MOBILE BODY DETECTING METHOD
CN113838094B (en) Safety early warning method based on intelligent video identification
CN111627224A (en) Vehicle speed abnormality detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant