CN112084987A - Subway ticket evasion behavior detection method and system based on artificial intelligence - Google Patents

Subway ticket evasion behavior detection method and system based on artificial intelligence

Info

Publication number
CN112084987A
CN112084987A (application CN202010971990.9A)
Authority
CN
China
Prior art keywords
module
ticket evasion
behavior
ticket
evasion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010971990.9A
Other languages
Chinese (zh)
Inventor
杨晓敏
范会笑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202010971990.9A
Publication of CN112084987A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53 Recognition of crowd images, e.g. recognition of crowd congestion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/20 Individual registration on entry or exit involving the use of a pass
    • G07C9/29 Individual registration on entry or exit involving the use of a pass the pass containing active electronic elements, e.g. smartcards

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, and in particular to a method and a system for detecting subway ticket evasion behavior. The system comprises a real-time imaging module, a pose estimation module, a parallel ticket evasion detection module and a data analysis module. The real-time imaging module stitches the RGB images of the sub-regions acquired by a plurality of cameras and projects the result onto the BIM ground plane; the pose estimation module extracts human-body key points from the acquired RGB images and detects forced-passage ticket evasion behaviors, such as climbing over or crawling through the gate, through deep learning; the parallel ticket evasion detection module uses the biped key-point heat maps extracted from those key points to judge, after post-processing, whether parallel ticket evasion exists; and the data analysis module counts the occurrences of each type of ticket evasion behavior over a fixed period and analyzes the trend of ticket evasion and the locations where it frequently occurs. The system markedly improves the subway ticket-checking rate and the detection rate of subway ticket evasion behaviors.

Description

Subway ticket evasion behavior detection method and system based on artificial intelligence
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a subway ticket evasion behavior detection method and system based on artificial intelligence.
Background
Subway ticket evasion takes many forms, but the most common are the following. The first is parallel ticket evasion, in which several people pass through the gate on a single card swipe; this mostly occurs at the gate. The second is ticket evasion by climbing over, which occurs at both the gate and the guardrail. The third is ticket evasion by crawling through the gate.
The patent application with publication number CN110378179A, "Subway fare evasion behavior detection method and system based on infrared imaging", proposes a fare evasion detection method and system based on connected domains.
Patent document CN111064925A, "Subway passenger ticket evasion behavior detection method and system", proposes determining that a card-swiping behavior exists when a person's trajectory overlaps the card-swiping area.
The patent application with publication number CN103605967A, "Subway anti-ticket-evasion system based on image recognition and working method thereof", proposes counting the number of people actually passing through a gate by using projection-curve features for target segmentation.
In practice, the inventors found that the above prior art has the following disadvantages:
the detection rate of ticket evasion behavior is not high, ticket evasion performed close against another person's body is very likely to be detected incorrectly, and passengers who swipe a card and pass through quickly may be falsely detected as evading the fare.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a method and a system for detecting subway ticket evasion behavior based on artificial intelligence, so as to overcome the above drawbacks of the prior art. The adopted technical solution is as follows:
In a first aspect, an embodiment of the invention provides a subway ticket evasion behavior detection method based on artificial intelligence, which specifically includes: obtaining an RGB image of the region of interest (ROI) and performing feature extraction to obtain human-body key points and a human biped key-point heat map; judging the dwell time from the stacked heat maps, and judging that a card-swiping behavior exists when the dwell time exceeds a preset threshold; judging that the card swipe is successful when the card-swiping behavior is consistent with the card-swiping information in the temporary card-swiping database; counting the number of human biped key points within the gate opening time to judge the number of people actually passing through; and judging that parallel ticket evasion behavior exists when that number is greater than 1.
In a second aspect, another embodiment of the present invention further provides an artificial-intelligence-based subway ticket evasion behavior detection system, which specifically includes a pose estimation module, a parallel ticket evasion detection module and a data analysis module.
The pose estimation module comprises a key point detection module and a key point prediction module.
The key point detection module is used for acquiring RGB images of the ROI and extracting features to obtain human-body key points; the ROI includes the area where a gate opening is located.
The key point prediction module is used for predicting, based on a Temporal Convolutional Network (TCN), the two-dimensional motion sequence formed by the human-body key points over consecutive frames and judging the ticket evasion behavior; the ticket evasion behaviors include climbing over and crawling through the gate.
The parallel ticket evasion detection module comprises a key point acquisition module, a comparison module and a judgment module.
The key point acquisition module is used for extracting the human biped key-point heat maps from the human-body key points.
The comparison module is used for judging the dwell time of the stacked heat maps; a card-swiping behavior is determined to exist when the dwell time exceeds a preset threshold, and the card swipe is judged to be successful when the card-swiping behavior is consistent with the card-swiping information in the temporary card-swiping database.
The judging module is used for counting the number of human biped key points within the gate opening time and judging the number of people actually passing through; when that number is greater than 1, parallel ticket evasion behavior is judged to exist.
The data analysis module is used for counting the occurrences of the different types of ticket evasion behavior over a fixed period, and analyzing the trend of ticket evasion and the locations where it frequently occurs; the ticket evasion behaviors include climbing over, crawling through the gate, and parallel ticket evasion.
The invention has at least the following beneficial effects:
Only the ROI is processed; parallel detection judges whether a card-swiping behavior exists and whether the swipe succeeded, and the number of biped key points within the gate opening time then determines whether ticket evasion has occurred. This multi-stage detection markedly improves the ticket-checking rate and the detection rate of ticket evasion while reducing the load on hardware. Meanwhile, the different types of ticket evasion behavior within a fixed period are counted, and the common patterns of ticket evasion and the locations where it occurs are analyzed, so that corresponding management and decision-making measures can be formulated; this plays an important role in preventing ticket evasion and further improves its detection rate.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a diagram of a subway ticket evasion behavior detection system based on artificial intelligence according to an embodiment of the present invention;
Fig. 2 is a flowchart of detecting climbing-over and crawling-through-the-gate ticket evasion behaviors in a subway based on artificial intelligence according to an embodiment of the present invention;
Fig. 3 is a flowchart of a method for detecting parallel subway ticket evasion behavior according to an embodiment of the present invention;
Fig. 4 is a flowchart of a subway ticket evasion behavior data analysis method according to an embodiment of the present invention.
Detailed Description
In order to further explain the technical means adopted by the present invention to achieve its intended purpose and their effects, the artificial-intelligence-based subway ticket evasion behavior detection method and system, together with their specific implementation, structure, features and effects, are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics described may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the subway fare evasion behavior detection method and system based on artificial intelligence in detail with reference to the accompanying drawings.
Referring to fig. 1, a diagram of a subway fare evasion behavior detection system based on artificial intelligence according to an embodiment of the present invention is shown;
before the system is implemented, an Information processing and data exchange platform based on a Building Information Model (BIM) needs to be constructed, wherein the BIM comprises a BIM model of a subway guardrail and a gate port area, an Information exchange module of the BIM model, geographical position coordinate Information of a current area and camera perception Information. And dividing the guardrail and the interested (ROI) area of each gate in the BIM model.
Specifically, the BIM in the subway rail and gate area receives data sensed by all sensors in the current area, such as data sensed by a camera and a camera, simultaneously, stores corresponding sensor information into a central storage server according to a set rule, and periodically updates the information in a covering manner according to the capacity of the server so as to inquire historical data.
Specifically, the geographic position coordinate information specifically includes ground coordinate origin points of the established BIM models of the subway guardrail and the gate opening area, offset of the respective area coordinate origin points and the BIM ground coordinate origin points, and division of the guardrail and the ROI area of each gate opening.
The camera perception information specifically comprises a fixed pose, a shooting area and an acquired image when the camera is set.
The ROI area of the guardrail and each gate opening is divided in the BIM, and subsequent operation only processes images in the ROI area, so that the burden of hardware can be reduced, and the detection accuracy can be improved. For each ROI area in the image plane, because the initial pose of the camera is known, the camera projection matrix of the camera of each sub-area can be easily obtained, the ROI area is divided on the ground plane of the BIM model, and each ROI area of the image plane can be obtained through the projection matrix.
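By way of a non-limiting illustration (not part of the original disclosure), the mapping from a ground-plane ROI to the image plane can be sketched as follows in Python. It assumes the usual pinhole camera model with known intrinsics K and world-to-camera extrinsics R, t for one fixed camera; all function and variable names are illustrative.

```python
import numpy as np

def ground_roi_to_image(roi_ground, K, R, t):
    """Project a ground-plane (z = 0) ROI polygon, given in BIM ground
    coordinates, into the image plane of one fixed camera.

    roi_ground : (N, 2) array of (x, y) ground coordinates of the ROI corners
    K          : (3, 3) camera intrinsic matrix
    R, t       : (3, 3) rotation and (3,) translation of the world-to-camera transform
    """
    # For points on the z = 0 plane the full projection reduces to a homography
    # H = K [r1 r2 t], where r1 and r2 are the first two columns of R.
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))

    pts = np.hstack([roi_ground, np.ones((len(roi_ground), 1))])  # homogeneous coords
    proj = (H @ pts.T).T
    return proj[:, :2] / proj[:, 2:3]  # pixel coordinates of the ROI corners

# Example: a 1 m x 2 m rectangle in front of one gate opening
# roi_px = ground_roi_to_image(np.array([[0, 0], [1, 0], [1, 2], [0, 2]]), K, R, t)
```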
This embodiment provides a subway ticket evasion behavior detection system based on artificial intelligence, which comprises a real-time imaging module, a pose estimation module, a parallel ticket evasion detection module and a data analysis module.
The real-time imaging module is used for stitching the RGB images of the sub-regions acquired by the plurality of cameras and projecting the result onto the BIM ground plane.
The pose estimation module is used for extracting human-body key points from the acquired RGB images and detecting forced-passage ticket evasion behaviors, such as climbing over or crawling through the gate, through deep learning.
The parallel ticket evasion detection module is used for judging, through post-processing, whether parallel ticket evasion behavior exists, using the biped key-point heat maps extracted from the human-body key points obtained in the pose estimation module.
The data analysis module is used for counting the occurrences of the different types of ticket evasion behavior over a fixed period and analyzing the trend of ticket evasion and the locations where it frequently occurs.
Specifically, the real-time imaging module comprises a preprocessing module, a feature extraction module and an image stitching and fusion module.
The preprocessing module is used for preprocessing each sub-region image collected by the cameras and applying a projective transformation to the preprocessed images, so that all images are converted to the same viewpoint and stitching is facilitated.
In this embodiment, the preprocessing includes image filtering and denoising, distortion correction, and the like. In other embodiments, those skilled in the art can adopt suitable image preprocessing operations as needed to improve the accuracy of image stitching.
The feature extraction module extracts feature points from the images captured by the cameras. Many feature-point extraction methods are available, such as SIFT, SURF, Harris, SUSAN corner and Kitchen-Rosenfeld detectors, and they can be chosen flexibly according to the site environment.
After obtaining the feature-point information of the images, the image stitching and fusion module matches the feature points of adjacent images and, using a RANSAC feature-point matching and filtering strategy, finds the positions in the reference image that correspond to the feature points of the image to be stitched, thereby determining the transformation between the two images. The RANSAC matching and filtering step removes spurious matches and yields the homography matrix of the match. Finally, a fusion method such as feathering, pyramid blending or gradient blending is selected according to the actual situation to fuse the images.
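As a non-authoritative sketch of the stitching step just described, the following Python code registers one sub-region image onto a reference image with SIFT features, a Lowe ratio test and a RANSAC homography, then blends them by simple overwriting (feathering or pyramid blending could be substituted). It assumes OpenCV 4.4 or later, where cv2.SIFT_create is available; all names are illustrative.

```python
import cv2
import numpy as np

def stitch_pair(ref_img, img, ratio=0.75):
    """Warp `img` into the viewpoint of `ref_img` and paste the two together."""
    gray_ref = cv2.cvtColor(ref_img, cv2.COLOR_BGR2GRAY)
    gray_img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(gray_ref, None)
    kp_img, des_img = sift.detectAndCompute(gray_img, None)

    # Lowe ratio test keeps only distinctive matches before RANSAC filtering
    matcher = cv2.BFMatcher()
    good = []
    for pair in matcher.knnMatch(des_img, des_ref, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])

    src = np.float32([kp_img[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # RANSAC drops outliers

    h, w = ref_img.shape[:2]
    canvas = cv2.warpPerspective(img, H, (w * 2, h))       # warp into reference view
    canvas[0:h, 0:w] = ref_img                             # naive overlay blend
    return canvas
```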
Further, the pose estimation module includes a keypoint detection module and a keypoint prediction module.
The key point detection module is used for inputting the RGB image within the ROI into a human-body key point detection network with an Encoder-Decoder structure, and obtaining 15 key points for each person: the head, the left and right shoulders, the left and right elbows, the left and right hands, the spine center, the neck center, the left and right hips, the left and right knees, and the centers of the left and right feet. While the key points are obtained, an associative embedding mechanism is introduced to guide key point matching, so that the two-dimensional skeleton key points of each individual person are obtained. Operating only on the ROI reduces the hardware load and improves the detection rate.
Specifically, the human-body key point detection network has an Encoder-Decoder structure. The associative embedding mechanism distinguishes the key points of different people: each detected key point is given a vector tag, the tag values of key points belonging to the same person differ only slightly, the tag values of different people differ greatly, and the difference is measured by the Euclidean distance.
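A minimal Python sketch of this grouping idea is given below: key points are assigned greedily to the person whose mean tag value is closest, using the distance in tag space (1-D tags here for simplicity). The threshold and all names are illustrative assumptions, not values from the patent.

```python
import numpy as np

def group_by_tags(detections, tag_threshold=1.0):
    """Group detected key points into persons by associative-embedding tags.

    detections : list of (joint_id, x, y, tag) tuples
    Returns a list of persons, each a dict joint_id -> (x, y).
    """
    persons, person_tags = [], []
    for joint, x, y, tag in detections:
        best, best_dist = None, None
        for i, tags in enumerate(person_tags):
            d = abs(tag - np.mean(tags))          # distance to this person's mean tag
            if best_dist is None or d < best_dist:
                best, best_dist = i, d
        # same person: tag is close AND this joint is not already taken
        if best is not None and best_dist < tag_threshold and joint not in persons[best]:
            persons[best][joint] = (x, y)
            person_tags[best].append(tag)
        else:                                      # otherwise start a new person
            persons.append({joint: (x, y)})
            person_tags.append([tag])
    return persons
```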
The key point prediction module is configured to input the two-dimensional motion sequence formed by the two-dimensional skeleton key points over consecutive frames into an artificial neural network; preferably, the artificial neural network is a Temporal Convolutional Network (TCN), which extracts the information in the motion sequence for prediction. The decision on the ticket evasion behavior is output through a fully connected (FC) layer of the TCN and falls into three classes: normal passage, climbing over, and crawling through the gate.
Specifically, to train the network, the RGB images of the ROI acquired by the cameras are normalized; to help the model converge, the value range of the image matrix is rescaled to floating-point numbers in [0, 1], and the labels are normalized in the same way. The Encoder and Decoder are trained end to end on the collected images and the annotated label data. The Encoder extracts image features: it takes the normalized image data as input and outputs a feature map. The Decoder then upsamples the feature map and finally generates the human skeleton key-point heat maps and the associative embedding maps. Each channel of the output heat map represents hot spots obeying a Gaussian distribution; when the training set is annotated, a hot spot centered on each key point is generated with a Gaussian kernel and used as the label, and training with a mean-square-error loss yields the predicted hot spots. Because the label data are normalized, the output hot-spot values also lie in [0, 1].
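The Gaussian label generation and the [0, 1] normalization described above can be sketched in a few lines of Python; the sigma value is an illustrative assumption.

```python
import numpy as np

def gaussian_heatmap(height, width, center, sigma=2.0):
    """Ground-truth heat map for one key point: a hot spot obeying a Gaussian
    distribution centered on the annotated location, with values in [0, 1]."""
    xs = np.arange(width, dtype=np.float32)
    ys = np.arange(height, dtype=np.float32)[:, None]
    cx, cy = center
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))

def normalize_image(image_uint8):
    """Rescale an 8-bit RGB image to floating-point values in [0, 1]."""
    return image_uint8.astype(np.float32) / 255.0
```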
After the human skeleton key-point heat maps and the associative embedding maps are obtained, a loss function is used to compare the predicted values with the ground truth. The loss function of the human skeleton key-point detection network is the weighted sum of a heat-map loss (Heatmaps Loss) and a grouping loss (Grouping Loss).
The Heatmaps Loss pulls the predicted key-point heat maps as close as possible to the ground-truth values, while the Grouping Loss groups the key points of each person and separates the groups from one another: it pulls the tag values of key points belonging to the same person together and pushes the tag values of different people as far apart as possible.
The mathematical formula of the Heatmaps Loss is:

$$L_{heat} = -\frac{1}{N}\sum_{c}\sum_{i}\sum_{j}
\begin{cases}
(1-P_{cij})^{\alpha}\,\log(P_{cij}), & Y_{cij}=1 \\
(1-Y_{cij})^{\beta}\,(P_{cij})^{\alpha}\,\log(1-P_{cij}), & \text{otherwise}
\end{cases}$$

where $P_{cij}$ is the score of a key point of class $c$ at position $(i, j)$, and the higher the score, the more likely a key point is present; $Y_{cij}$ is the ground-truth heat map; $N$ is the number of key points in the ground truth; and $\alpha$ and $\beta$ are hyper-parameters that need to be set manually.
The mathematical formula of the Grouping Loss is:

$$L_{group} = \frac{1}{N}\sum_{n}\sum_{k}\bigl(\bar{h}_{n}-h_{k}(x_{nk})\bigr)^{2}
+ \frac{1}{N^{2}}\sum_{n}\sum_{n'\neq n}\exp\!\left(-\frac{1}{2\sigma^{2}}\bigl(\bar{h}_{n}-\bar{h}_{n'}\bigr)^{2}\right)$$

where $N$ is the number of people in the ground truth, $n$ denotes the $n$-th person, $k$ denotes the $k$-th key point, $x_{nk}$ is the pixel position of the ground-truth key point, $h_{k}(x_{nk})$ is the predicted tag value at that position, $\bar{h}_{n}=\frac{1}{K}\sum_{k}h_{k}(x_{nk})$ (with $K$ key points per person) is the reference tag value of the $n$-th person, and $n'$ ranges over all people other than the $n$-th.
This yields the total loss function:

$$L = \gamma\,L_{heat} + \psi\,L_{group}$$

where $\gamma$ and $\psi$ are set manually so that the two loss terms take comparable values, which makes the convergence of the model easier to judge.
After the key-point heat maps and associative embedding maps trained with this loss function are obtained, the skeleton key points of every person in the ROI image can be recovered from them. Over consecutive frames these skeleton key points form a two-dimensional motion sequence; the sequence is fed into the TCN, which extracts the information in it and outputs the result through its FC layer.
Further, in the TCN training stage, the two-dimensional key-point information is the input and the classification result is the output; the supervision is the one-hot code converted from the manually annotated class label of each two-dimensional action sequence, and the network parameters are updated with a cross-entropy loss to finally obtain the ticket evasion decision. In the embodiment of the invention, a multi-frame 2D joint-point sequence is fed into the trained TCN; the shape of the input sequence is [number of frames, 30], where 30 is the number of x and y coordinates of the 15 key points. The FC output is used for classification, and applying argmax to it gives the decision on the ticket evasion behavior, which covers two forms of forced passage: climbing over and crawling through the gate.
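A minimal PyTorch sketch of such a classifier is shown below: dilated 1-D convolutions over the [frames, 30] key-point sequence, temporal pooling, and an FC layer whose argmax gives the class. The layer sizes and architecture details are illustrative assumptions; only the input width (30) and the three output classes follow the text above.

```python
import torch
import torch.nn as nn

class TCNClassifier(nn.Module):
    """Temporal convolutions over a 2-D key-point sequence, FC classification head."""
    def __init__(self, in_features=30, channels=64, num_classes=3):
        super().__init__()
        self.tcn = nn.Sequential(
            nn.Conv1d(in_features, channels, kernel_size=3, padding=1, dilation=1),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=4, dilation=4),
            nn.ReLU(),
        )
        self.fc = nn.Linear(channels, num_classes)   # normal / climb over / crawl through

    def forward(self, x):                 # x: (batch, frames, 30)
        x = x.transpose(1, 2)             # Conv1d expects (batch, channels, frames)
        x = self.tcn(x).mean(dim=2)       # average pooling over the time axis
        return self.fc(x)                 # logits; argmax(dim=1) gives the class

# Training: nn.CrossEntropyLoss()(model(seq_batch), label_batch)
# Inference: behavior = model(seq_batch).argmax(dim=1)
```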
Specifically, the parallel ticket evasion detection module comprises a key point acquisition module, a comparison module and a judgment module.
The parallel ticket evasion detection module needs a temporary card-swiping information database, which records the current gate-opening information and the time and number of card swipes.
The key point acquisition module extracts the biped key-point heat maps within the ROI from the human-body key points obtained in the pose estimation module.
The comparison module stacks the hot spots of adjacently sampled images; when the overlapping area of two hot spots exceeds 80% of their total area, a stay state is judged to exist, and the dwell time is obtained in combination with the sampling frame rate of the camera. When the dwell time exceeds a threshold T, which is set empirically, a card-swiping behavior is judged to exist. Once a card-swiping behavior exists, the temporary card-swiping database starts to work: the visually detected card-swiping behavior is compared with the information in the temporary database in the parallel ticket evasion module; if they are inconsistent, the swipe is judged to have failed, and if they are consistent, the swipe is judged to be successful.
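A small Python sketch of this dwell-time check is given below. The overlap test is interpreted here as intersection over union of the binary hot-spot masks, and the sampling step and threshold T are illustrative assumptions.

```python
import numpy as np

def stacked_dwell_frames(hotspot_masks, overlap_ratio=0.8):
    """Count consecutive sampled frames in which the biped hot spot stays put.
    hotspot_masks: boolean arrays (one per sampled frame) marking the hot spot."""
    frames = 1
    for prev, cur in zip(hotspot_masks, hotspot_masks[1:]):
        union = np.logical_or(prev, cur).sum()
        overlap = np.logical_and(prev, cur).sum()
        if union and overlap / union > overlap_ratio:   # >80% of the total hot-spot area
            frames += 1
        else:
            frames = 1                                  # the person moved on; restart
    return frames

def has_card_swipe(hotspot_masks, fps=30.0, sample_step=2, threshold_T=1.5):
    """Dwell time = stacked frames / effective sampling rate; exceeding the
    empirically set threshold T is treated as a card-swiping behavior."""
    dwell_seconds = stacked_dwell_frames(hotspot_masks) * sample_step / fps
    return dwell_seconds > threshold_T
```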
After a successful swipe, the judging module determines the specific gate opening from the position information of the biped key points and counts the number of biped key points inside that gate opening's ROI during the gate opening time; if the count is N, the actual number of people is N/2. Because the count from a single frame can be erroneous, simple post-processing is applied.
As an example of the post-processing, with a camera frame rate of 30 frames per second and the number of people detected every two frames, 15 counts are obtained within one second; if at least 10 of them agree, that value is taken as the final count. The number of people passing through during the gate opening time is then used to judge parallel ticket evasion: when more than one person actually passes through, parallel ticket evasion behavior is judged to exist; otherwise the passage is judged to be normal.
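This majority-vote post-processing and the final decision can be sketched as follows; the agreement threshold of 10 out of 15 counts comes from the example above, while the function names are illustrative.

```python
from collections import Counter

def people_through_gate(biped_counts, min_agreement=10):
    """biped_counts: the 15 per-sample key-point counts collected in one second
    (30 fps, sampled every second frame) inside one gate-opening ROI.
    Returns the number of people, or None if no stable reading was obtained."""
    if not biped_counts:
        return None
    count, votes = Counter(biped_counts).most_common(1)[0]
    if votes < min_agreement:
        return None               # the per-frame counts disagree too much
    return count // 2             # two foot key points per person

def is_parallel_evasion(people):
    """One successful swipe but more than one person passed through."""
    return people is not None and people > 1
```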
When the system detects a ticket evasion behavior, it can promptly notify the nearest staff member through the information exchange module of the BIM so that the evader can be dealt with and educated.
Further, the data analysis module keeps statistics on the different types of ticket evasion behavior over a fixed period. When the behavior is climbing over or crawling through the gate, the count of the corresponding type is increased by one and the position where the behavior occurred is recorded at the same time; after parallel ticket evasion occurs, its count is increased by one but its position is not recorded. At the end of every fixed period, for example one month, the trend of ticket evasion and the locations where it frequently occurs are analyzed.
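As an illustrative sketch of this bookkeeping (the class and category names are assumptions, not part of the disclosure), the statistics can be kept in a small Python structure:

```python
from collections import Counter, defaultdict

class EvasionStats:
    """Per-period tally of ticket evasion events; climbing-over and crawling-through
    events also keep their gate or guardrail position for display in the BIM model."""
    def __init__(self):
        self.counts = Counter()
        self.locations = defaultdict(list)

    def record(self, kind, position=None):
        self.counts[kind] += 1                       # e.g. 'climb', 'crawl', 'parallel'
        if kind != 'parallel' and position is not None:
            self.locations[kind].append(position)    # parallel evasion: count only

    def summary(self):
        """Return the per-type totals plus the most frequent evasion positions."""
        hotspots = {k: Counter(v).most_common(3) for k, v in self.locations.items()}
        return dict(self.counts), hotspots
```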
In practice, management measures can then be formulated reasonably according to the actual situation, such as assigning staff or raising the height of guardrails that are frequently climbed over, which helps prevent ticket evasion behavior.
Referring to fig. 2, a flowchart of a method for detecting climbing-over and crawling-through-the-gate ticket evasion behaviors in a subway based on artificial intelligence according to another embodiment of the present invention is shown.
In step a, a BIM model is constructed and the region of interest (ROI) is divided.
In step b, an image of the ROI is acquired and the human-body key points are extracted.
In step c, the two-dimensional action sequence is predicted based on the TCN network.
In step d, the category of the ticket evasion behavior is output.
Specifically, in step a, the BIM models of the subway guardrails and gate openings are constructed, and the ROIs of the guardrails and of each gate opening are divided. In step b, the RGB image within the ROI is obtained and processing is restricted to this area to reduce the hardware load; the image is passed through the human-body key point detection network to obtain the key points, and the associative embedding mechanism is introduced to guide the key point matching and obtain the two-dimensional skeleton key points of each person. In step c, the two-dimensional skeleton key points form a two-dimensional motion sequence over consecutive frames, which is input into the Temporal Convolutional Network (TCN); the TCN extracts the information in the motion sequence and makes a prediction. In step d, the prediction of the TCN is output through the fully connected (FC) layer, and argmax is applied to the FC output to obtain the decision on the ticket evasion behavior. There are three possible results: normal passage, climbing over, and crawling through the gate.
Referring to fig. 3, a flowchart of a method for detecting parallel subway ticket evasion behavior according to another embodiment of the present invention is shown.
In step S1, the human biped key points and their heat maps are obtained.
In step S2, the dwell time is judged from the stacked heat maps, and whether a card-swiping behavior exists is judged from the dwell time.
In step S3, the card-swiping behavior is compared with the information in the temporary card-swiping database to judge whether the swipe is successful.
In step S4, the number of people actually passing through the gate is judged from the number of biped key points.
In step S5, whether parallel ticket evasion behavior exists is judged from the actual number of people.
Specifically, in step S1, the biped key-point heat maps are obtained from the human-body key points. In step S2, the dwell time is judged by stacking the biped key-point heat maps: when the overlapping area of two hot spots exceeds 80% of their total area, a stay is judged to exist, and the dwell time is obtained in combination with the camera sampling frame rate; the dwell time is compared with a preset threshold T, which can be set empirically, and when the dwell time exceeds T a card-swiping behavior is judged to exist, otherwise there is none. In step S3, a temporary card-swiping database is established; when a card-swiping behavior exists, the gate-opening information and the time and number of swipes are recorded, and the information in the database is compared with the detected card-swiping behavior to verify whether the swipe succeeded: if they are consistent, the swipe is judged successful, otherwise it is judged to have failed. In step S4, the specific gate opening is determined from the position information of the biped key points, the number of biped key points inside that gate opening's ROI is counted, and the number of people actually passing through is judged from it; when the peak count of biped key points is N, the number of people is N/2. In step S5, the number of biped key points during the gate opening time is counted for the ROI of the specific gate to judge whether parallel ticket evasion exists: when more than one person passes through the gate opening during the opening time, parallel ticket evasion behavior is judged to exist.
Referring to fig. 4, a flowchart of a subway fare evasion behavior data analysis method according to another embodiment of the present invention is shown.
In step S11, the category of the ticket evasion behavior is determined.
In step S12, the count of the corresponding ticket evasion category is increased by one.
In step S21, the position of a non-parallel ticket evasion behavior is projected into the BIM model.
In step S13, data analysis of the ticket evasion behaviors is performed.
Specifically, in step S11, the ticket evasion behaviors are divided into three categories: climbing over and crawling through the gate, as predicted by the TCN, and parallel ticket evasion, as determined from the stacked biped key-point heat maps. In step S12, after the category is determined, the recorded count of the corresponding type is increased by one. In step S21, when parallel ticket evasion occurs, its position does not need to be projected into the BIM model of the subway guardrail or gate; only its count is increased. For the other ticket evasion behaviors, namely climbing over and crawling through the gate, the position where the behavior occurred is also recorded and displayed in the BIM model after the corresponding count is increased. In step S13, the ticket evasion behaviors within a fixed period are summarized and analyzed, for example the numbers of parallel ticket evasions, climbs over the guardrail and crawls through the gate, the common patterns of ticket evasion, its trend over time, and the locations where it frequently occurs.
In practice, after the data analysis is finished, management measures can be formulated reasonably, such as assigning staff or raising the height of guardrails that are frequently climbed over, which helps prevent ticket evasion behavior.
It should be noted that the order of the above embodiments of the present invention is only for description and does not represent their relative merits. Specific embodiments have been described above; other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order from that in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results; in some embodiments, multitasking and parallel processing may also be possible or advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent replacements, improvements, etc. within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A subway ticket evasion behavior detection method based on artificial intelligence is characterized by comprising the following steps:
obtaining RGB images of the region of interest (ROI) and extracting features to obtain human-body key points and a human biped key-point heat map, the ROI comprising the ROI of a gate opening;
judging the dwell time from the heat maps, and judging that a card-swiping behavior exists when the dwell time exceeds a preset threshold;
judging that the card swipe is successful when the card-swiping behavior is consistent with the card-swiping information in the temporary card-swiping database;
counting the number of human biped key points within the gate opening time and judging the number of people actually passing through; and
judging that parallel ticket evasion behavior exists when the number of people is greater than 1.
2. The method for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 1, wherein:
the human biped key-point heat map is extracted from the human-body key points.
3. The method for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 1, wherein:
the dwell time is derived from the number of stacked heat maps in combination with the camera sampling frame rate.
4. The method for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 1, wherein:
the counting of the human biped key points comprises counting the number of human biped key points at fixed frame intervals according to the camera frame rate, and taking a count as the final counting result when the number of identical counts reaches a preset number.
5. The method for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 1, further comprising:
counting the different ticket evasion behaviors within a certain time, and analyzing the common patterns of the ticket evasion behaviors and the positions where they occur; the different ticket evasion behaviors comprising the parallel ticket evasion behavior, the climbing-over ticket evasion behavior and the crawling-through-the-gate ticket evasion behavior.
6. A subway ticket evasion behavior detection system based on artificial intelligence is characterized by comprising:
a pose estimation module, a parallel ticket evasion detection module and a data analysis module;
the pose estimation module comprises a key point detection module and a key point prediction module;
the key point detection module is used for obtaining human-body key points from the acquired RGB images of the ROI based on a human-body key point detection network, the ROI comprising the area where a gate opening is located;
the key point prediction module is used for predicting, based on an artificial neural network, the two-dimensional action sequence formed by the human-body key points in consecutive frames and judging the ticket evasion behavior, the ticket evasion behaviors comprising a climbing-over ticket evasion behavior and a crawling-through-the-gate ticket evasion behavior;
the parallel ticket evasion detection module comprises a key point acquisition module, a comparison module and a judgment module;
the key point acquisition module is used for extracting a human biped key-point heat map from the human-body key points;
the comparison module is used for judging the dwell time of the stacked heat maps, judging that a card-swiping behavior exists when the dwell time exceeds a preset threshold, and judging that the card swipe is successful when the card-swiping behavior is consistent with the card-swiping information in the temporary card-swiping database;
the judgment module is used for counting the number of human biped key points within the gate opening time, judging the number of people actually passing through, and judging that parallel ticket evasion behavior exists when the number of people is greater than 1; and
the data analysis module is used for counting the occurrences of the different types of ticket evasion behavior over a fixed period, and analyzing the trend of the ticket evasion behaviors and the positions where they occur, the types of ticket evasion behaviors comprising the climbing-over behavior, the crawling-through-the-gate behavior and the parallel ticket evasion behavior.
7. The system for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 6, wherein:
the artificial neural network is a TCN network.
8. The system for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 6, wherein:
the human-body key point detection network in the key point detection module obtains the human-body key points in combination with an associative embedding mechanism.
9. The system for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 6, wherein:
the method by which the judgment module counts the human biped key points comprises counting the number of human biped key points at fixed frame intervals according to the camera frame rate, and taking a count as the final counting result when the number of identical counts reaches a preset number.
10. The system for detecting subway ticket evasion behavior based on artificial intelligence as claimed in claim 6, wherein:
the system further comprises a real-time imaging module;
the real-time imaging module comprises a preprocessing module, a feature extraction module and an image stitching and fusion module;
the preprocessing module is used for preprocessing the acquired images of the sub-regions and applying a projective transformation to the preprocessed images to obtain images to be stitched;
the feature extraction module is used for extracting feature points from the images to be stitched; and
the image stitching and fusion module is used for matching the feature points of adjacent images, determining the transformation between the two images and fusing the images.
CN202010971990.9A 2020-09-16 2020-09-16 Subway ticket evasion behavior detection method and system based on artificial intelligence Withdrawn CN112084987A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010971990.9A CN112084987A (en) 2020-09-16 2020-09-16 Subway ticket evasion behavior detection method and system based on artificial intelligence

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010971990.9A CN112084987A (en) 2020-09-16 2020-09-16 Subway ticket evasion behavior detection method and system based on artificial intelligence

Publications (1)

Publication Number Publication Date
CN112084987A true CN112084987A (en) 2020-12-15

Family

ID=73737270

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010971990.9A Withdrawn CN112084987A (en) 2020-09-16 2020-09-16 Subway ticket evasion behavior detection method and system based on artificial intelligence

Country Status (1)

Country Link
CN (1) CN112084987A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112906675A (en) * 2021-04-27 2021-06-04 南京大学 Unsupervised human body key point detection method and system in fixed scene
CN112906675B (en) * 2021-04-27 2024-03-22 南京大学 Method and system for detecting non-supervision human body key points in fixed scene
WO2023071188A1 (en) * 2021-10-29 2023-05-04 上海商汤智能科技有限公司 Abnormal-behavior detection method and apparatus, and electronic device and storage medium
CN114241051A (en) * 2021-12-21 2022-03-25 盈嘉互联(北京)科技有限公司 Object attitude estimation method for indoor complex scene


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20201215)