CN112434566B - Passenger flow statistics method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112434566B
CN112434566B (application CN202011220311.0A)
Authority
CN
China
Prior art keywords
target
passenger
identification information
database
shoulder
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011220311.0A
Other languages
Chinese (zh)
Other versions
CN112434566A (en)
Inventor
曾卓熙
张天宇
李帅杰
邹雪中
廖汉秋
高增辉
胡文泽
蔡银燕
王孝宇
覃德荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Bus Group Co ltd
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Bus Group Co ltd
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Bus Group Co Ltd and Shenzhen Intellifusion Technologies Co Ltd
Priority to CN202011220311.0A
Publication of CN112434566A
Application granted
Publication of CN112434566B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53: Recognition of crowd images, e.g. recognition of crowd congestion

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application provides a passenger flow statistics method, a passenger flow statistics device, an electronic device and a storage medium. The method comprises: acquiring image information of a preset image acquisition area during the period in which a vehicle is stopped at a station; performing target detection on the image information to obtain a head-shoulder detection frame of a passenger; obtaining a target frame to be tracked based on the head-shoulder detection frame; performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a motion trail of the passenger; and determining the numbers of boarding and alighting passengers at the station according to the motion trail and a plurality of preset tripwires. The embodiment of the application helps to improve the accuracy of passenger flow statistics.

Description

Passenger flow statistics method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer vision, and in particular, to a passenger flow statistics method, apparatus, electronic device, and storage medium.
Background
Passenger flow statistics are of great significance to a modern society that promotes the development of public transport. Taking bus operation as an example, traffic management departments or bus operators can dynamically plan routes based on the passenger flow at each station, realising intelligent bus operation. In current passenger flow statistics schemes, images collected at the front and rear doors of a bus are analysed: the motion direction of each person in the images is judged to determine whether that person is boarding or alighting, and the numbers of people boarding and alighting at a station are then counted. However, in some cases the images captured by the image acquisition device are not ideal, which affects subsequent detection and tracking, so the accuracy of the passenger flow statistics is low.
Disclosure of Invention
In order to solve the above problems, the application provides a passenger flow statistics method, apparatus, electronic device and storage medium, which help to improve the accuracy of passenger flow statistics.
To achieve the above object, a first aspect of an embodiment of the present application provides a passenger flow statistics method, including:
acquiring image information of a preset image acquisition area in a time period during which a vehicle is stopped at a station;
performing target detection on the image information to obtain a head-shoulder detection frame of a passenger;
obtaining a target frame to be tracked based on the head-shoulder detection frame;
performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a motion trail of the passenger;
and determining the numbers of boarding and alighting passengers at the station according to the motion trail and a plurality of preset tripwires.
A second aspect of an embodiment of the present application provides a passenger flow statistics apparatus, including:
the image acquisition module, used for acquiring image information of a preset image acquisition area in a time period during which a vehicle is stopped at a station;
the target detection module, used for performing target detection on the image information to obtain a head-shoulder detection frame of a passenger;
the target tracking module, used for obtaining a target frame to be tracked based on the head-shoulder detection frame;
the target tracking module, further used for performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a motion trail of the passenger;
and the passenger flow statistics module, used for determining the numbers of boarding and alighting passengers at the station according to the motion trail and a plurality of preset tripwires.
A third aspect of the embodiments of the present application provides an electronic device, including an input device and an output device, and further including a processor adapted to implement one or more instructions; and a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the steps of:
acquiring image information of a preset image acquisition area in a time period during which a vehicle is stopped at a station;
performing target detection on the image information to obtain a head-shoulder detection frame of a passenger;
obtaining a target frame to be tracked based on the head-shoulder detection frame;
performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a motion trail of the passenger;
and determining the numbers of boarding and alighting passengers at the station according to the motion trail and a plurality of preset tripwires.
A fourth aspect of the embodiments of the present application provides a computer storage medium storing one or more instructions adapted to be loaded by a processor and to perform the steps of:
acquiring image information of a preset image acquisition area in a time period during which a vehicle is stopped at a station;
performing target detection on the image information to obtain a head-shoulder detection frame of a passenger;
obtaining a target frame to be tracked based on the head-shoulder detection frame;
performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a motion trail of the passenger;
and determining the numbers of boarding and alighting passengers at the station according to the motion trail and a plurality of preset tripwires.
The scheme of the application has at least the following beneficial effects. Compared with the prior art, the application acquires image information of a preset image acquisition area during the period in which a vehicle is stopped at a station; performs target detection on the image information to obtain a head-shoulder detection frame of a passenger; obtains a target frame to be tracked based on the head-shoulder detection frame; performs optical flow tracking on the passenger based on the target frame to be tracked to obtain the passenger's motion trail; and determines the numbers of boarding and alighting passengers at the station according to the motion trail and a plurality of preset tripwires. Because the method detects the head and shoulders of the passenger and resets the target frame to be tracked on the basis of the head-shoulder detection frame, it avoids losing the tracked target when the camera's field of view is shallow or the target is too large. Meanwhile, several tripwires are used to judge whether a passenger boards or alights, which avoids the passenger flow misjudgment that arises when a single tripwire counts anyone detected crossing the line, thereby improving the accuracy of passenger flow statistics.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the application, and that a person skilled in the art could obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an application environment according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a passenger flow statistics method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a target detector according to an embodiment of the present application;
Fig. 4 is a schematic diagram of acquiring a target frame to be tracked according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a determination of a passenger getting on or off a vehicle according to an embodiment of the present application;
FIG. 6 is a schematic diagram of identification warehouse entry and identification matching according to an embodiment of the present application;
fig. 7 is a flow chart of another passenger flow statistics method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a passenger flow statistics device according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present application, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present application.
The terms "comprising" and "having" and any variations thereof, as used in the description, claims and drawings, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article or apparatus that comprises a list of steps or elements is not limited to the listed steps or elements, but may include other steps or elements not listed or inherent to such a process, method, article or apparatus. Furthermore, the terms "first", "second", "third" and the like are used to distinguish different objects, not to describe a particular order.
The embodiment of the application provides a passenger flow statistics method that can be implemented in the application environment shown in fig. 1. Referring to fig. 1, the application environment comprises a station where passengers board and alight and a vehicle parked at the station. A first image acquisition device and a second image acquisition device are arranged at the front and rear doors of the vehicle, respectively: the first acquires image information of passengers boarding and alighting at the front door, and the second acquires image information of passengers boarding and alighting at the rear door. The acquired image information is uploaded to a server, which detects targets in the image information and tracks the detected targets, determines the numbers of passengers boarding and alighting at the station based on preset double-tripwire passenger flow judgment logic, and can simultaneously count the number of passengers on the vehicle.
Specifically, the target detection performed by the server may be detection of the head and shoulders. For a head-shoulder target determined to be a boarding passenger, the server establishes a corresponding identifier in a first database, for example a unique ID (identity document). The first database may be a base library storing information about boarding passengers, such as snapshot images, head-shoulder features and the identifiers corresponding to them. For a head-shoulder target determined to be an alighting passenger, the server establishes a corresponding identifier in a second database, which may be a query library storing information about alighting passengers, such as the snapshot of the alighting passenger, head-shoulder features and the corresponding identifiers. The information corresponding to identifiers in the second database is matched against the information corresponding to identifiers in the first database to find the boarding passenger that matches each alighting passenger, and the two matched identifiers enter a third database as a matching pair; the third database stores the associated information of passengers boarding and alighting. For example, if alighting passenger A1 matches boarding passenger A, the pair A-A1 enters the third database, and the snapshot image, head-shoulder feature, identifier and alighting station of A1 are stored in association with the snapshot image, head-shoulder feature, identifier and boarding station of A, so that the boarding and alighting stations of a given passenger can be obtained from the matching pairs stored in the third database. In this way, the method can conduct passenger flow statistics accurately and further reveal the boarding and alighting stations of passengers.
Based on the application environment shown in fig. 1, the following describes in detail the passenger flow statistics method provided by the embodiment of the present application with reference to other drawings.
Referring to fig. 2, fig. 2 is a flow chart of a passenger flow statistics method provided by an embodiment of the present application, where the method is applied to a server, as shown in fig. 2, and includes steps S21-S25:
S21, acquiring image information of a preset image acquisition area in a time period of a vehicle stop station.
In a specific embodiment of the application, the preset image acquisition area refers to the area covered by the first image acquisition device arranged at the front door and the second image acquisition device arranged at the rear door of the vehicle. During the period in which the vehicle is stopped at a station, the first image acquisition device sends image information of passengers boarding and alighting at the front door to the server, and the second image acquisition device sends image information of passengers boarding and alighting at the rear door to the server. The image information may be a video monitoring stream or continuous snapshot images. Optionally, the two image acquisition devices may acquire image information in real time, or may start acquiring when it is detected that the vehicle begins to stop and stop acquiring when it is detected that the vehicle ends its stop.
S22, performing target detection on the image information to obtain a head-shoulder detection frame of the passenger.
In one possible implementation, performing target detection on the image information to obtain a head-shoulder detection frame of the passenger includes:
obtaining an image to be processed based on the image information;
downsampling the image to be processed;
upsampling selected target features obtained by the downsampling to obtain a feature map corresponding to the image to be processed;
and performing classification prediction based on the feature map to obtain the head-shoulder detection frame.
In a specific embodiment of the present application, the image information acquired by the first and second image acquisition devices is resized, while maintaining the aspect ratio, to obtain a 512 x 512 image to be processed. A trained neural network model is used as a detector to perform target detection on this image. As shown in fig. 3, the detector is divided into a downsampling part and an upsampling part. Downsampling is performed in five stages, each stage downsampling by a factor of 2 for a total factor of 32, and each downsampling convolution block uses separable convolutions to reduce the number of parameters and improve the operation speed. During downsampling, the target features, i.e. the features obtained in the first, third and fifth stages, are stored for use in upsampling. The upsampling part mirrors the downsampling part: the features of the fifth downsampling stage serve as the upsampling input, and at each upsampling step corresponding to the first to third downsampling stages the stored features are superimposed on the input features, so that a feature map with the same size as the image to be processed is finally output. The head-shoulder centre of each passenger is predicted on the feature map by a Gaussian-kernel-based method; the outputs are the centre of the head-shoulder detection frame, its width and height, and the offset of the centre, and the final head-shoulder detection frames are presented as a heat map.
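The two key arithmetic claims above, that separable convolutions cut the parameter count and that five halving stages give a 32x reduction of a 512 x 512 input, can be checked with a small sketch. The function names are illustrative, not from the patent:

```python
def conv_params(c_in, c_out, k):
    # standard convolution: k*k weights per (input channel, output channel) pair
    return k * k * c_in * c_out

def separable_conv_params(c_in, c_out, k):
    # depthwise (k*k per input channel) followed by a 1x1 pointwise convolution
    return k * k * c_in + c_in * c_out

def downsample_sizes(input_size=512, stages=5):
    # five stages, each halving the spatial resolution: 32x total
    sizes = [input_size]
    for _ in range(stages):
        sizes.append(sizes[-1] // 2)
    return sizes

print(downsample_sizes())                    # 512 -> 16 after five stages
print(conv_params(64, 128, 3))               # 73728 weights
print(separable_conv_params(64, 128, 3))     # 8768 weights, ~8x fewer
```

For a 3x3 layer with 64 input and 128 output channels, the separable variant needs roughly an eighth of the weights, which is the parameter saving the detector relies on.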
In a possible implementation, the head-shoulder detection frame is obtained by performing target detection on the image information with a trained neural network model. Before acquiring the image information of the preset image acquisition area, the method further includes:
setting the size of a Gaussian kernel of the neural network;
constructing a sample image and inputting it into the neural network for training, obtaining a target detection result for the sample image;
determining a target loss according to the target detection result and a marked image;
and adjusting parameters of the neural network model according to the target loss and the Gaussian kernel.
In a specific embodiment of the application, the marked image is obtained by annotating the sample image. Because Gaussian kernels overlap in dense scenes, the heads and shoulders of two people who are very close together may otherwise produce only one detection frame. To construct sample images, several training images are first cropped to obtain corresponding small images; specifically, random crops at a 16:9 ratio make the head-shoulder targets in the small images larger, simulating the image information acquired by an image acquisition device mounted low on a vehicle and improving detection of large targets. The small images are then stitched together to obtain a sample image containing multiple head-shoulder targets, simulating a dense-crowd scene, which helps improve the generalisation ability of the neural network model and alleviates missed detections in dense crowds. The target loss includes a Smooth L1 loss and a focal loss: the Smooth L1 loss is used to train the width and height of the head-shoulder detection frame and the offset of its centre, while the focal loss together with the Gaussian kernel is used to train the centre point of the head-shoulder detection frame. In scenes where target centre points are dense, fitting the centre-point prediction with the focal loss alone performs poorly, so the Gaussian kernel is used to guide the focal loss in fitting the centre-point prediction and improve the fit.
Based on this training method, the neural network model adopted in the application helps to solve a series of problems such as low mounting of the image acquisition device, oversized head-shoulder targets, and one detection frame framing several head-shoulder targets in dense scenes, so that the detection precision is higher.
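The Gaussian-kernel training target described above can be sketched as a heatmap on which each head-shoulder centre is rendered as a 2-D Gaussian. This follows the common CenterNet-style recipe (an assumption; the patent does not give the exact formula), where overlapping Gaussians are merged with an element-wise maximum so that two nearby heads keep distinct peaks:

```python
import numpy as np

def gaussian_heatmap(shape, center, sigma):
    # render one head-shoulder centre as a 2-D Gaussian on an h x w grid
    h, w = shape
    y, x = np.ogrid[:h, :w]
    cy, cx = center
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

def add_center(heatmap, center, sigma):
    # element-wise max keeps separate peaks for nearby targets instead of
    # summing them into a single blob (the dense-crowd failure mode above)
    np.maximum(heatmap, gaussian_heatmap(heatmap.shape, center, sigma), out=heatmap)
    return heatmap

hm = np.zeros((16, 16))
add_center(hm, (4, 4), sigma=1.5)
add_center(hm, (10, 10), sigma=1.5)
```

Each annotated centre contributes a peak of exactly 1.0; the kernel size (sigma) is the hyperparameter the patent says is set before training.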
S23, obtaining a target frame to be tracked based on the head and shoulder detection frame.
S24, carrying out optical flow tracking on the passenger based on the target frame to be tracked to obtain the movement track of the passenger.
In a specific embodiment of the present application, as shown in fig. 4, because the image acquisition device on the vehicle is mounted at a low angle, the detected head-shoulder targets are large, and tracking the whole head-shoulder detection frame with a conventional method is prone to losing the target. The application therefore does not track the whole head-shoulder detection frame; instead, an m x m box centred on the centre of the head-shoulder detection frame is selected as the target frame to be tracked, the m x m feature points inside the box are used as optical flow tracking points, and the intersection over union (IoU, Intersection over Union) between the target frame to be tracked and the predicted frame in the next frame (or next image) is evaluated by combining an optical flow algorithm with the Hungarian algorithm, thereby tracking the passenger and obtaining the passenger's motion trail. Here m is a positive integer not greater than 20.
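The association step can be sketched as follows. The patent pairs predicted and detected boxes with the Hungarian algorithm; the sketch below substitutes a simpler greedy IoU matcher (named plainly as such) and hypothetical box conventions, since the patent does not specify them:

```python
def iou(a, b):
    # boxes as (x1, y1, x2, y2)
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0, ix2 - ix1), max(0, iy2 - iy1)
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def track_box(center, m=20):
    # m x m target frame centred on the head-shoulder detection centre
    cx, cy = center
    half = m // 2
    return (cx - half, cy - half, cx + half, cy + half)

def match_tracks(pred_boxes, det_boxes, iou_thresh=0.3):
    # greedy IoU matching: a stand-in for the Hungarian assignment
    matches, used = [], set()
    for i, p in enumerate(pred_boxes):
        best_j, best_iou = -1, iou_thresh
        for j, d in enumerate(det_boxes):
            if j in used:
                continue
            v = iou(p, d)
            if v > best_iou:
                best_j, best_iou = j, v
        if best_j >= 0:
            used.add(best_j)
            matches.append((i, best_j))
    return matches
```

A matched (track, detection) pair extends that passenger's motion trail; unmatched detections start new trails.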
S25, determining the number of boarding and disembarking persons at the station according to the motion trail and a plurality of preset tripwires.
In one possible embodiment, the plurality of tripwires includes a first tripwire, a second tripwire, a third tripwire and a fourth tripwire, and determining the numbers of boarding and alighting passengers at the station according to the motion trail and the preset tripwires comprises:
determining passengers whose motion trails cover the first tripwire and then the second tripwire as boarding passengers, to obtain the number of boarding passengers at the station;
and determining passengers whose motion trails cover the third tripwire and then the fourth tripwire as alighting passengers, to obtain the number of alighting passengers at the station.
In a specific embodiment of the present application, as shown in fig. 5, four tripwires are preset to determine whether a passenger boards or alights. The first and second tripwires are used to judge boarding: if the motion trail of a passenger covers the first tripwire and then the second tripwire, the passenger is judged to be boarding and the boarding count is incremented by 1. Likewise, if the motion trail of a passenger covers the third tripwire and then the fourth tripwire, the passenger is judged to be alighting and the alighting count is incremented by 1. A trail is counted as boarding only when it covers the first and second tripwires in order; if it covers only one of them, it is not counted. Similarly, a trail is counted as alighting only when it covers the third and fourth tripwires in order; covering only one of them is not counted. This alleviates the problem of repeated counting caused by people loitering near the doors.
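The in-order double-tripwire rule can be sketched as below. The sketch assumes horizontal tripwires given by y-coordinates and trails of (x, y) points; real tripwires may be arbitrary segments, so treat this as illustrative only:

```python
def count_crossings(tracks, line_a, line_b):
    """Count trails that cover line_a and then line_b, in that order.

    tracks: list of trajectories, each a list of (x, y) points
    line_a, line_b: y-coordinates of two tripwires (hypothetical layout)
    """
    count = 0
    for track in tracks:
        hit_a, hit_b = None, None
        for i, (_, y) in enumerate(track):
            if hit_a is None and y >= line_a:
                hit_a = i
            if hit_a is not None and hit_b is None and y >= line_b:
                hit_b = i
        # a trail touching only one wire is ignored, so loitering near a
        # single wire does not inflate the count
        if hit_a is not None and hit_b is not None:
            count += 1
    return count
```

Boarding would use the first and second wires; alighting reuses the same logic with the third and fourth wires (possibly with the direction reversed).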
In one possible embodiment, the method further comprises:
extracting, from the image information, a first target image of the boarding passenger passing the second tripwire;
generating first identification information for the first target image in a first database, and extracting a first head-shoulder feature of the boarding passenger;
and storing the first target image, the first identification information, the first head-shoulder feature and the boarding station in association.
Specifically, the first target image is an image (or frame) of the moment at which the boarding passenger passes the second tripwire. The server establishes first identification information for the first target image in the first database and extracts features from the first target image with a trained re-identification model, obtaining the first head-shoulder feature of the boarding passenger for subsequent matching. Because a passenger's orientation can change by 180 degrees between boarding and alighting and relatively few stable person features remain visible, head-shoulder features are adopted for matching to improve the matching rate. As shown in fig. 6, the extracted first head-shoulder feature is stored in the first database in association with the first target image, the first identification information and the station (taken to be the boarding station).
Further, to extract the first head-shoulder feature, the first target image is first resized to obtain an input image, which is then fed into a ResNet neural network for feature extraction, and the extracted feature is finally normalised to obtain the first head-shoulder feature. The re-identification model is trained with a softmax loss and a hard-mining triplet loss, and the training data is augmented to improve the generalisation ability of the model.
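After L2 normalisation, the cosine-similarity matching used later between alighting and boarding features reduces to a dot product. A minimal sketch, with illustrative function names:

```python
import numpy as np

def normalize(feat):
    # L2-normalise so cosine similarity becomes a plain dot product
    return feat / np.linalg.norm(feat)

def best_match(query_feat, gallery_feats):
    """Return (index, similarity) of the closest gallery feature.

    query_feat: head-shoulder feature of an alighting passenger
    gallery_feats: head-shoulder features of boarding passengers
    """
    q = normalize(query_feat)
    sims = [float(np.dot(q, normalize(g))) for g in gallery_feats]
    return int(np.argmax(sims)), max(sims)
```

In the patent's scheme the gallery is the first database, and the boarding passenger with the greatest similarity becomes the first target boarding passenger.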
In one possible embodiment, the method further comprises:
extracting, from the image information, a second target image of the alighting passenger passing the fourth tripwire;
generating second identification information for the second target image in a second database, and extracting a second head-shoulder feature of the alighting passenger;
and storing the second target image, the second identification information, the second head-shoulder feature and the alighting station in association.
Specifically, the second target image is an image (or frame) of the moment at which the alighting passenger passes the fourth tripwire. For the second target image, the server likewise uses the re-identification model to extract features, obtaining the second head-shoulder feature of the alighting passenger. As shown in fig. 6, the extracted second head-shoulder feature is stored in the second database in association with the second target image, the second identification information and the station (taken to be the alighting station).
In one possible embodiment, the method further comprises:
when the second identification information exists in the second database, matching the second head-shoulder feature corresponding to the second identification information against the first head-shoulder features in the first database to determine a first target boarding passenger;
when the third database does not contain the first identification information corresponding to the first target boarding passenger, storing the second identification information and that first identification information into the third database as a matching pair;
and when the third database already contains the first identification information corresponding to the first target boarding passenger, storing the second identification information and that first identification information into the third database as a matching pair, replacing the original matching pair of that first identification information in the third database.
Specifically, referring to fig. 6, when the second identification information is generated in the second database, the server matches the second shoulder feature corresponding to the identification information with all the first shoulder features in the first database to select the matched boarding passenger, that is, the first target boarding passenger, for example: and selecting the boarding passenger with the largest similarity as the first target boarding passenger by calculating the cosine similarity of the second head shoulder characteristic and the first head shoulder characteristic. And then inquiring whether first identification information of the first target boarding passenger exists in a third database, if the first identification information does not exist, the fact that the boarding passenger matched with the first target boarding passenger is not found yet is indicated, then the first identification information corresponding to the second identification information and the first target boarding passenger is stored in the third database as a matched pair, a second target image corresponding to the second identification information, a second head shoulder feature and a boarding station point are stored in the third database in a correlated mode, the second identification information, the second target image corresponding to the second identification information, the second head shoulder feature and the boarding station point are deleted from the second database, and therefore the boarding station point and the boarding station point of the same passenger are displayed in the third database, and the storage space of the second database can be saved. 
In addition, if the first identification information of the first target boarding passenger already exists in the third database, an alighting passenger matching the first target boarding passenger has already been found; however, the alighting passenger corresponding to the current second identification information matches the first target boarding passenger more closely, which indicates that the original matching pair to which that first identification information belongs is inaccurate. For example: boarding passenger A was originally paired with alighting passenger B1, but alighting passenger B2 matches A more closely, so the original pair A-B1 is inaccurate and must be updated with the current second identification information B2; A-B2 becomes the new matching pair and replaces the original matching pair A-B1 in the third database. At the same time, the second identification information B1 in the original matching pair, together with its corresponding second target image, second head-shoulder feature and alighting station, is stored back into the second database, and the original matching pair A-B1, with all of its associated matching information, is deleted from the third database.
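The matching-and-replacement logic above can be sketched as follows. This is an illustrative Python sketch in which in-memory dictionaries stand in for the first and third databases, and the function names and data layout are assumptions for illustration, not the patent's actual implementation:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two head-shoulder feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_and_update(second_id, second_feat, first_db, third_db):
    """Match an alighting passenger's head-shoulder feature against every
    boarding feature in first_db and update the match-pair table third_db.
    first_db: {first_id: feature}; third_db: {first_id: (second_id, similarity)}.
    Returns the id of a displaced alighting record, if any."""
    # Pick the boarding passenger with the largest cosine similarity.
    best_id = max(first_db, key=lambda fid: cosine_similarity(second_feat, first_db[fid]))
    best_sim = cosine_similarity(second_feat, first_db[best_id])
    displaced = None
    if best_id in third_db:
        old_second_id, old_sim = third_db[best_id]
        if best_sim > old_sim:
            # The new alighting passenger matches better: replace the pair and
            # hand back the displaced record for return to the second database.
            displaced = old_second_id
            third_db[best_id] = (second_id, best_sim)
    else:
        third_db[best_id] = (second_id, best_sim)
    return displaced
```

In the A/B1/B2 example above, a first call pairs B1 with A, and a later call with a more similar B2 replaces the pair and returns B1 as the displaced record.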
In one possible embodiment, the method further comprises:
Acquiring the generation time of the second identification information in the original matching pair in the second database;
acquiring the first shoulder feature in the first database which is put in storage before the generation time;
Matching the second shoulder feature corresponding to the second identification information in the original matching pair with the first shoulder feature put in storage before the generation time to determine a second target boarding passenger;
And under the condition that the first identification information corresponding to the second target boarding passenger does not exist in a third database, storing the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger into the third database as a matching pair.
Specifically, after the second identification information in the original matching pair, together with its corresponding second target image, second head-shoulder feature and alighting station, is stored back into the second database, a matched boarding passenger, namely the second target boarding passenger, still needs to be found for the alighting passenger corresponding to that second identification information. Since the second identification information in the original matching pair was stored earlier, only the first head-shoulder features that entered the first database before the generation time of that second identification information need to be used when calculating similarity, and the boarding passenger with the largest similarity is taken as the second target boarding passenger; this improves the matching speed when a matching pair is updated. It should be understood that after the second target boarding passenger is determined, whether the first identification information corresponding to the second target boarding passenger exists in the third database is also judged. If it does not exist, the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger are stored as a new matching pair; if it already exists, the second target boarding passenger already has a matching pair, and in that case the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger are still stored into the third database to replace the existing matching pair. In this way, each alighting passenger can be matched with the most similar boarding passenger, ensuring higher accuracy in determining where the same passenger boards and alights.
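The time-filtered re-matching step described above can be sketched as follows; the dictionary layout, the timestamp field, and the function name are illustrative assumptions rather than the patent's actual storage scheme:

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two head-shoulder feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rematch_displaced(second_id, second_feat, gen_time, first_db, third_db):
    """Re-match a displaced alighting record against boarding features stored
    before its generation time.  first_db: {first_id: (feature, store_time)};
    third_db: {first_id: (second_id, similarity)} match pairs."""
    # Restricting candidates to records stored before gen_time shrinks the
    # search set, which speeds up matching when a matching pair is updated.
    candidates = {fid: feat for fid, (feat, t) in first_db.items() if t < gen_time}
    if not candidates:
        return None
    best_id = max(candidates, key=lambda fid: cosine(second_feat, candidates[fid]))
    # Store (or replace) the match pair for the second target boarding passenger.
    third_db[best_id] = (second_id, cosine(second_feat, candidates[best_id]))
    return best_id
```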
It can be seen that, in the embodiment of the application, image information of a preset image acquisition area is acquired during the time period in which the vehicle stops at a station; target detection is performed on the image information to obtain a head-shoulder detection frame of the passenger; a target frame to be tracked is obtained based on the head-shoulder detection frame; optical flow tracking is performed on the passenger based on the target frame to be tracked to obtain the passenger's motion trail; and the number of boarding and alighting passengers at the station is determined according to the motion trail and a plurality of preset tripwires. Because the method detects the head and shoulders of the passenger and resets the target frame to be tracked on the basis of the head-shoulder detection frame, target-tracking loss caused by a shallow image acquisition field of view, an oversized target and the like is avoided. Meanwhile, a plurality of tripwires are used to judge whether a passenger boards or alights, which avoids the passenger-flow misjudgment that occurs when crossing a single tripwire is taken as the criterion, thereby improving the accuracy of passenger flow statistics.
Referring to fig. 7, fig. 7 is a flowchart of another passenger flow statistics method provided by an embodiment of the present application. As shown in fig. 7, the method includes steps S71-S76:
S71, acquiring image information of a preset image acquisition area during a time period in which the vehicle stops at a station;
S72, performing target detection on the image information to obtain a head-shoulder detection frame of the passenger;
S73, obtaining a target frame to be tracked based on the head-shoulder detection frame;
S74, performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a motion trail of the passenger;
S75, determining passengers whose motion trails cover the first tripwire and the second tripwire in sequence as boarding passengers, to obtain the number of boarding passengers at the station;
S76, determining passengers whose motion trails cover the third tripwire and the fourth tripwire in sequence as alighting passengers, to obtain the number of alighting passengers at the station.
The specific implementation of steps S71-S76 is described in the embodiments shown in fig. 2-6, where the same or similar beneficial effects can be achieved; to avoid repetition, details are not repeated here.
In the embodiment of the application, image information of a preset image acquisition area is acquired during the time period in which the vehicle stops at a station; target detection is performed on the image information to obtain a head-shoulder detection frame of the passenger; a target frame to be tracked is obtained based on the head-shoulder detection frame; optical flow tracking is performed on the passenger based on the target frame to be tracked to obtain the passenger's motion trail; passengers whose motion trails cover the first tripwire and the second tripwire in sequence are determined to be boarding passengers, giving the number of boarding passengers at the station; and passengers whose motion trails cover the third tripwire and the fourth tripwire in sequence are determined to be alighting passengers, giving the number of alighting passengers at the station. Because the method detects the head and shoulders of the passenger and resets the target frame to be tracked on the basis of the head-shoulder detection frame, target-tracking loss caused by a shallow image acquisition field of view, an oversized target and the like is avoided, and the accuracy of passenger flow statistics is improved.
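As an illustration of the two-tripwire judgment in steps S75-S76, the sketch below models each tripwire as a horizontal line at a fixed y-coordinate and a motion trail as a sequence of y-coordinates. The line positions and the trail representation are simplifying assumptions for illustration, not the patent's actual geometry:

```python
def crosses_in_order(track_ys, wire1, wire2):
    """True if the trajectory covers wire1 first and wire2 afterwards.
    A wire at y=w is 'covered' when two consecutive track points straddle it."""
    t1 = t2 = None
    for i in range(1, len(track_ys)):
        lo, hi = sorted((track_ys[i - 1], track_ys[i]))
        if t1 is None and lo <= wire1 <= hi:
            t1 = i                      # first wire covered
        elif t1 is not None and t2 is None and lo <= wire2 <= hi:
            t2 = i                      # second wire covered after the first
    return t1 is not None and t2 is not None

def count_passengers(tracks, wire1, wire2):
    # Count trajectories that cover wire1 then wire2 in sequence; passing the
    # boarding or alighting pair of wires counts boarding or alighting passengers.
    return sum(crosses_in_order(t, wire1, wire2) for t in tracks)
```

Because the two wires must be covered in order, a passenger moving in the opposite direction (e.g. covering wire2 before wire1) is not counted, which is the point of using a pair of tripwires per direction.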
Based on the description of the above embodiment of the passenger flow statistics method, please refer to fig. 8, fig. 8 is a schematic structural diagram of a passenger flow statistics device provided by an embodiment of the present application, as shown in fig. 8, the device includes:
an image acquisition module 81, configured to acquire image information of a preset image acquisition area during a time period of a vehicle stop;
the target detection module 82 is configured to perform target detection on the image information to obtain a head and shoulder detection frame of the passenger;
The target tracking module 83 is configured to obtain a target frame to be tracked based on the head-shoulder detection frame;
The target tracking module is further used for carrying out optical flow tracking on the passenger based on the target frame to be tracked to obtain a movement track of the passenger;
The passenger flow statistics module 84 is configured to determine the number of boarding and disembarking persons at the station according to the motion trail and a plurality of preset tripwires.
In one possible embodiment, the plurality of tripwires includes a first tripwire, a second tripwire, a third tripwire, and a fourth tripwire; in determining the number of boarding and disembarking persons at the station according to the motion trail and the preset plurality of tripwires, the passenger flow statistics module 84 is specifically configured to:
The passengers with the motion trail covering the first tripwire and the second tripwire in sequence are determined to be boarding passengers, and the boarding number of the station is obtained;
and determining the passengers with the movement tracks covering the third trip wire and the fourth trip wire in sequence as getting-off passengers to obtain the getting-off number of the station.
In one possible implementation, the passenger flow statistics module 84 is further configured to:
Extracting a first target image of the boarding passenger passing by the second tripwire from the image information;
Generating first identification information of the first target image in a first database, and extracting first shoulder characteristics of the boarding passengers;
and storing the first target image, the first identification information, the first shoulder feature and the boarding point in an associated mode.
In one possible implementation, the passenger flow statistics module 84 is further configured to:
Extracting a second target image of the off passenger passing through the fourth tripwire from the image information;
generating second identification information of the second target image in a second database, and extracting second head shoulder characteristics of the off-board passengers;
And storing the second target image, the second identification information, the second head-shoulder characteristics and the departure station point in an associated mode.
In one possible implementation, the passenger flow statistics module 84 is further configured to:
Under the condition that the second identification information exists in the second database, matching the second head shoulder characteristic corresponding to the second identification information with the first head shoulder characteristic in the first database to determine a first target boarding passenger;
Storing the second identification information and the first identification information corresponding to the first target boarding passenger as a matching pair into a third database under the condition that the first identification information corresponding to the first target boarding passenger does not exist in the third database;
And under the condition that the first identification information corresponding to the first target boarding passenger exists in the third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as matching pairs so as to replace original matching pairs of the first identification information corresponding to the first target boarding passenger in the third database.
In one possible implementation, the passenger flow statistics module 84 is further configured to:
the second target image, the second head-shoulder characteristics and the get-off station corresponding to the second identification information are stored in the third database in an associated mode with the second identification information;
And deleting the second identification information, the corresponding second target image, the second head-shoulder characteristic and the get-off station from the second database.
In one possible implementation manner, in replacing the original matching pair to which the first identification information corresponding to the first target boarding passenger belongs in the third database, the passenger flow statistics module 84 is specifically configured to:
storing the second identification information in the original matching pair, the second target image corresponding to the second identification information, the second head shoulder feature and the departure station point back to the second database;
And deleting the original matching pair from the third database.
In one possible implementation, the passenger flow statistics module 84 is further configured to:
Acquiring the generation time of the second identification information in the original matching pair in the second database;
acquiring the first shoulder feature in the first database which is put in storage before the generation time;
Matching the second shoulder feature corresponding to the second identification information in the original matching pair with the first shoulder feature put in storage before the generation time to determine a second target boarding passenger;
And under the condition that the first identification information corresponding to the second target boarding passenger does not exist in a third database, storing the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger into the third database as a matching pair.
In one possible implementation, in performing the target detection on the image information to obtain a head and shoulder detection frame of the passenger, the target detection module 82 is specifically configured to:
obtaining an image to be processed based on the image information;
downsampling the image to be processed;
selecting target features obtained through the downsampling and upsampling them, to obtain a feature map corresponding to the image to be processed;
And carrying out classification prediction based on the feature map to obtain the head and shoulder detection frame.
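The down-then-up sampling described above can be illustrated with a minimal NumPy sketch of the feature-map shape bookkeeping. Strided slicing and nearest-neighbour repetition stand in for the network's actual convolution and upsampling layers, which are assumptions here, not the detector described in the patent:

```python
import numpy as np

def downsample(x):
    # Stride-2 subsampling: a stand-in for a strided convolution or pooling layer.
    return x[::2, ::2]

def upsample(x):
    # 2x nearest-neighbour upsampling: a stand-in for deconvolution/interpolation.
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

image = np.arange(64, dtype=float).reshape(8, 8)   # hypothetical 8x8 image to be processed
shallow = downsample(image)                        # 4x4 shallow feature map
deep = downsample(shallow)                         # 2x2 deep target feature
feature_map = upsample(deep) + shallow             # deep feature upsampled and fused
```

The fused `feature_map` has the resolution of the shallower stage, which is the map on which classification prediction of head-shoulder detection frames would then run.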
In one possible implementation manner, in obtaining the target frame to be tracked based on the head-shoulder detection frame, the target tracking module 83 is specifically configured to:
A square frame with side length m is determined, taking the center of the head-shoulder detection frame as its center;
and the square frame with side length m is taken as the target frame to be tracked.
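The square target frame can be computed as in this minimal sketch, assuming the head-shoulder detection frame is given as (x1, y1, x2, y2) pixel coordinates and m is the square's side length; the function name and coordinate convention are illustrative assumptions:

```python
def square_track_box(head_shoulder_box, m):
    """Return an m x m square frame centred on the centre of the
    head-shoulder detection frame (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = head_shoulder_box
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0  # centre of the detection frame
    half = m / 2.0
    return (cx - half, cy - half, cx + half, cy + half)
```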
According to one embodiment of the present application, the units of the passenger flow statistics device shown in fig. 8 may be separately or wholly combined into one or several other units, or one (or some) of the units may be further split into multiple functionally smaller units, which can achieve the same operation without affecting the technical effects of the embodiment of the present application. The above units are divided based on logical functions; in practical applications, the function of one unit may be implemented by multiple units, or the functions of multiple units may be implemented by one unit. In other embodiments of the application, the passenger flow statistics device may also comprise other units, and in practical applications these functions may be implemented with the assistance of other units and through the cooperation of multiple units.
According to another embodiment of the present application, the passenger flow statistics apparatus shown in fig. 8 may be constructed by running a computer program (including program code) capable of executing the steps involved in the corresponding methods shown in fig. 2 or fig. 7 on a general-purpose computing device, such as a computer, that includes processing elements and storage elements such as a central processing unit (CPU), random access memory (RAM) and read-only memory (ROM), thereby implementing the passenger flow statistics method of the embodiment of the present application. The computer program may be recorded on, for example, a computer-readable recording medium, and loaded into and executed by the above-described computing device via the computer-readable recording medium.
Based on the description of the method embodiment and the device embodiment, the embodiment of the application also provides electronic equipment. Referring to fig. 9, the electronic device includes at least a processor 91, an input device 92, an output device 93, and a computer storage medium 94. Wherein the processor 91, input device 92, output device 93, and computer storage medium 94 within the electronic device may be connected by a bus or other means.
The computer storage medium 94 may be stored in a memory of the electronic device; the computer storage medium 94 is used to store a computer program comprising program instructions, and the processor 91 is used to execute the program instructions stored by the computer storage medium 94. The processor 91 (Central Processing Unit, CPU) is the computing core and control core of the electronic device; it is adapted to implement one or more instructions, in particular to load and execute one or more instructions to implement a corresponding method flow or corresponding function.
In one embodiment, the processor 91 of the electronic device provided in the embodiment of the present application may be used to perform a series of processes of passenger flow statistics:
Acquiring image information of a preset image acquisition area in a time period of a vehicle stop station;
performing target detection on the image information to obtain a head and shoulder detection frame of the passenger;
obtaining a target frame to be tracked based on the head-shoulder detection frame;
Performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a movement track of the passenger;
And determining the number of boarding and disembarking persons at the station according to the motion trail and a plurality of preset tripwires.
In yet another embodiment, the plurality of tripwires includes a first tripwire, a second tripwire, a third tripwire, and a fourth tripwire; the processor 91 executes the determining the number of boarding and disembarking persons at the station according to the motion trail and the preset plurality of tripwires, including:
The passengers with the motion trail covering the first tripwire and the second tripwire in sequence are determined to be boarding passengers, and the boarding number of the station is obtained;
and determining the passengers with the movement tracks covering the third trip wire and the fourth trip wire in sequence as getting-off passengers to obtain the getting-off number of the station.
In yet another embodiment, the processor 91 is further configured to:
Extracting a first target image of the boarding passenger passing by the second tripwire from the image information;
Generating first identification information of the first target image in a first database, and extracting first shoulder characteristics of the boarding passengers;
and storing the first target image, the first identification information, the first shoulder feature and the boarding point in an associated mode.
In yet another embodiment, the processor 91 is further configured to:
Extracting a second target image of the off passenger passing through the fourth tripwire from the image information;
generating second identification information of the second target image in a second database, and extracting second head shoulder characteristics of the off-board passengers;
And storing the second target image, the second identification information, the second head-shoulder characteristics and the departure station point in an associated mode.
In yet another embodiment, the processor 91 is further configured to:
Under the condition that the second identification information exists in the second database, matching the second head shoulder characteristic corresponding to the second identification information with the first head shoulder characteristic in the first database to determine a first target boarding passenger;
Storing the second identification information and the first identification information corresponding to the first target boarding passenger as a matching pair into a third database under the condition that the first identification information corresponding to the first target boarding passenger does not exist in the third database;
And under the condition that the first identification information corresponding to the first target boarding passenger exists in the third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as matching pairs so as to replace original matching pairs of the first identification information corresponding to the first target boarding passenger in the third database.
In yet another embodiment, the processor 91 is further configured to:
the second target image, the second head-shoulder characteristics and the get-off station corresponding to the second identification information are stored in the third database in an associated mode with the second identification information;
And deleting the second identification information, the corresponding second target image, the second head-shoulder characteristic and the get-off station from the second database.
In yet another embodiment, the processor 91 executes the replacement of the original matching pair to which the first identification information corresponding to the first target boarding passenger belongs in the third database, including:
storing the second identification information in the original matching pair, the second target image corresponding to the second identification information, the second head shoulder feature and the departure station point back to the second database;
And deleting the original matching pair from the third database.
In yet another embodiment, the processor 91 is further configured to:
Acquiring the generation time of the second identification information in the original matching pair in the second database;
acquiring the first shoulder feature in the first database which is put in storage before the generation time;
Matching the second shoulder feature corresponding to the second identification information in the original matching pair with the first shoulder feature put in storage before the generation time to determine a second target boarding passenger;
And under the condition that the first identification information corresponding to the second target boarding passenger does not exist in a third database, storing the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger into the third database as a matching pair.
By way of example, the electronic devices described above may be servers, cloud servers, computer hosts, server clusters, distributed systems, etc., including but not limited to processors 91, input devices 92, output devices 93, and computer storage media 94. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of an electronic device and is not limiting of an electronic device, and may include more or fewer components than shown, or certain components may be combined, or different components.
It should be noted that, since the steps in the above-mentioned passenger flow statistics method are implemented when the processor 91 of the electronic device executes the computer program, the embodiments of the passenger flow statistics method described above are all applicable to the electronic device, and all achieve the same or similar beneficial effects.
The embodiment of the application also provides a computer storage medium (Memory), which is a Memory device in the electronic device and is used for storing programs and data. It will be appreciated that the computer storage medium herein may include both a built-in storage medium in the terminal and an extended storage medium supported by the terminal. The computer storage medium provides a storage space that stores an operating system of the terminal. Also stored in this memory space are one or more instructions, which may be one or more computer programs (including program code), adapted to be loaded and executed by the processor 91. The computer storage medium herein may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), such as at least one magnetic disk memory; alternatively, it may be at least one computer storage medium located remotely from the aforementioned processor 91. In one embodiment, one or more instructions stored in a computer storage medium may be loaded and executed by processor 91 to implement the respective steps described above with respect to the passenger flow statistics method; in particular implementations, one or more instructions in a computer storage medium are loaded by processor 91 and perform the steps of:
Acquiring image information of a preset image acquisition area in a time period of a vehicle stop station;
performing target detection on the image information to obtain a head and shoulder detection frame of the passenger;
obtaining a target frame to be tracked based on the head-shoulder detection frame;
Performing optical flow tracking on the passenger based on the target frame to be tracked to obtain a movement track of the passenger;
And determining the number of boarding and disembarking persons at the station according to the motion trail and a plurality of preset tripwires.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
the passengers with the motion trail covering the first trip wire and the second trip wire in sequence are determined to be boarding passengers, and the boarding number of the station is obtained;
And determining the passengers with the motion trail covering the third trip wire and the fourth trip wire in sequence as the passengers getting off, and obtaining the number of passengers getting off the station.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
Extracting a first target image of the boarding passenger passing by the second tripwire from the image information;
Generating first identification information of the first target image in a first database, and extracting first shoulder characteristics of the boarding passengers;
and storing the first target image, the first identification information, the first shoulder feature and the boarding point in an associated mode.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
Extracting a second target image of the off passenger passing through the fourth tripwire from the image information;
generating second identification information of the second target image in a second database, and extracting second head shoulder characteristics of the off-board passengers;
And storing the second target image, the second identification information, the second head-shoulder characteristics and the departure station point in an associated mode.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
Under the condition that the second identification information exists in the second database, matching the second head shoulder characteristic corresponding to the second identification information with the first head shoulder characteristic in the first database to determine a first target boarding passenger;
Storing the second identification information and the first identification information corresponding to the first target boarding passenger as a matching pair into a third database under the condition that the first identification information corresponding to the first target boarding passenger does not exist in the third database;
And under the condition that the first identification information corresponding to the first target boarding passenger exists in the third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as matching pairs so as to replace original matching pairs of the first identification information corresponding to the first target boarding passenger in the third database.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
the second target image, the second head-shoulder characteristics and the get-off station corresponding to the second identification information are stored in the third database in an associated mode with the second identification information;
And deleting the second identification information, the corresponding second target image, the second head-shoulder characteristic and the get-off station from the second database.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
storing the second identification information in the original matching pair, the second target image corresponding to the second identification information, the second head shoulder feature and the departure station point back to the second database;
And deleting the original matching pair from the third database.
In yet another example, one or more instructions in the computer storage medium, when loaded by the processor 91, further perform the steps of:
Acquiring the generation time of the second identification information in the original matching pair in the second database;
acquiring the first shoulder feature in the first database which is put in storage before the generation time;
Matching the second shoulder feature corresponding to the second identification information in the original matching pair with the first shoulder feature put in storage before the generation time to determine a second target boarding passenger;
And under the condition that the first identification information corresponding to the second target boarding passenger does not exist in a third database, storing the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger into the third database as a matching pair.
In yet another example, the one or more instructions in the computer storage medium, when loaded and executed by the processor 91, further perform the following steps:
obtaining an image to be processed based on the image information;
downsampling the image to be processed;
upsampling selected target features obtained by the downsampling, to obtain a feature map corresponding to the image to be processed;
and performing classification prediction based on the feature map to obtain the head-shoulder detection frame.
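The downsample-then-upsample structure above is the familiar encoder/decoder idea in detection networks. The minimal sketch below substitutes plain average pooling and nearest-neighbor resizing for the learned convolutions a real detector would use; it only illustrates how the feature map regains the input's spatial size before per-location classification.

```python
# Minimal encoder/decoder sketch; real detectors use learned convolutions.
def downsample(image, factor=2):
    """Average-pool a 2-D grid of numbers by `factor` in each dimension."""
    h, w = len(image), len(image[0])
    return [[sum(image[y + dy][x + dx] for dy in range(factor)
                 for dx in range(factor)) / factor ** 2
             for x in range(0, w, factor)]
            for y in range(0, h, factor)]

def upsample(features, factor=2):
    """Nearest-neighbor upsampling back toward the input resolution."""
    return [[features[y // factor][x // factor]
             for x in range(len(features[0]) * factor)]
            for y in range(len(features) * factor)]

image = [[float(x + y) for x in range(4)] for y in range(4)]
feature_map = upsample(downsample(image))  # same spatial size as the input
```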
In yet another example, the one or more instructions in the computer storage medium, when loaded and executed by the processor 91, further perform the following steps:
determining an m×m square frame centered on the center of the head-shoulder detection frame;
and taking the m×m square frame as the target frame to be tracked.
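Deriving the target frame to be tracked can be sketched as below; the box representation `(x1, y1, x2, y2)` and the sample value of m are illustrative assumptions (the claims only bound m by 20).

```python
# Sketch: an m-by-m square sharing its center with the head-shoulder box.
def target_frame(box, m=16):
    """box is (x1, y1, x2, y2); returns the m x m square with the same center."""
    cx = (box[0] + box[2]) / 2
    cy = (box[1] + box[3]) / 2
    half = m / 2
    return (cx - half, cy - half, cx + half, cy + half)

frame = target_frame((10, 20, 50, 80), m=16)  # center (30, 50)
```

A small fixed-size square keeps the optical-flow tracker focused on the stable head-shoulder region rather than the full, variable-sized detection box.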
Illustratively, the computer program of the computer storage medium may include computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
It should be noted that, since the steps in the above-mentioned passenger flow statistics method are implemented when the computer program of the computer storage medium is executed by the processor, all embodiments of the above-mentioned passenger flow statistics method are applicable to the computer storage medium, and the same or similar beneficial effects can be achieved.
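The tripwire rule at the heart of the counting method (a motion trail crossing the first and then the second tripwire counts as boarding; third then fourth as alighting) can be sketched as below. Modeling each tripwire as a horizontal line given by a y-value is an assumption for illustration; the patent does not fix the tripwires' geometry.

```python
# Sketch of the ordered tripwire-crossing test on a tracked trajectory.
def crosses_in_order(track_ys, first_y, second_y):
    """True if the trajectory's y-coordinates cross first_y and, on a later
    segment, cross second_y."""
    passed_first = False
    prev = track_ys[0]
    for y in track_ys[1:]:
        if not passed_first and min(prev, y) <= first_y <= max(prev, y):
            passed_first = True
        elif passed_first and min(prev, y) <= second_y <= max(prev, y):
            return True
        prev = y
    return False

boarding = crosses_in_order([0, 30, 70, 100], first_y=25, second_y=60)
```

Requiring the two crossings in order is what distinguishes a boarding trajectory from an alighting one moving through the same doorway in reverse.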
The foregoing describes embodiments of the application in detail. The principles and implementations of the application are explained herein using specific examples, which are provided solely to aid understanding of the method and its core concepts. Meanwhile, those skilled in the art may make changes to the specific implementations and the scope of application in accordance with the ideas of the application; therefore, the content of this description should not be construed as limiting the application.

Claims (10)

1. A method of passenger flow statistics, the method comprising:
acquiring image information of a preset image acquisition area during a period in which the vehicle stops at a station;
performing target detection on the image information to obtain a head-shoulder detection frame of the passenger;
obtaining a target frame to be tracked based on the head-shoulder detection frame;
performing optical flow tracking on the passenger based on the target frame to be tracked, to obtain a motion trail of the passenger;
determining the number of boarding and disembarking persons at a station according to the motion trail and a plurality of preset tripwires;
The plurality of tripwires comprises a first tripwire, a second tripwire, a third tripwire and a fourth tripwire; the method further comprises the steps of:
extracting, from the image information, a first target image of a boarding passenger passing the second tripwire;
generating first identification information for the first target image in a first database, and extracting a first head-shoulder feature of the boarding passenger;
storing the first target image, the first identification information, the first head-shoulder feature and the boarding station in an associated manner;
extracting, from the image information, a second target image of an alighting passenger passing the fourth tripwire;
generating second identification information for the second target image in a second database, and extracting a second head-shoulder feature of the alighting passenger;
storing the second target image, the second identification information, the second head-shoulder feature and the alighting station in an associated manner;
in the case that the second identification information exists in the second database, matching the second head-shoulder feature corresponding to the second identification information against the first head-shoulder features in the first database to determine a first target boarding passenger;
in the case that the first identification information corresponding to the first target boarding passenger does not exist in a third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as a matching pair;
and, in the case that the first identification information corresponding to the first target boarding passenger exists in the third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as a matching pair, so as to replace the original matching pair to which the first identification information corresponding to the first target boarding passenger belongs in the third database.
2. The method of claim 1, wherein the determining the number of boarding and disembarking persons for the station based on the motion profile and a preset plurality of tripwires comprises:
determining passengers whose motion trails cross the first tripwire and then the second tripwire as boarding passengers, to obtain the number of boarding passengers at the station;
and determining passengers whose motion trails cross the third tripwire and then the fourth tripwire as alighting passengers, to obtain the number of alighting passengers at the station.
3. The method according to claim 1, wherein the method further comprises:
storing the second target image, the second head-shoulder feature and the alighting station corresponding to the second identification information in the third database in association with the second identification information;
and deleting the second identification information and the corresponding second target image, second head-shoulder feature and alighting station from the second database.
4. The method according to claim 1, wherein the replacing the original matching pair to which the first identification information corresponding to the first target boarding passenger belongs in the third database includes:
storing the second identification information in the original matching pair, together with its corresponding second target image, second head-shoulder feature and alighting station, back into the second database;
and deleting the original matching pair from the third database.
5. The method according to claim 4, wherein the method further comprises:
acquiring the generation time at which the second identification information in the original matching pair was created in the second database;
acquiring the first head-shoulder features stored in the first database before the generation time;
matching the second head-shoulder feature corresponding to the second identification information in the original matching pair against the first head-shoulder features stored before the generation time, to determine a second target boarding passenger;
and, in the case that the first identification information corresponding to the second target boarding passenger does not exist in the third database, storing the second identification information in the original matching pair and the first identification information corresponding to the second target boarding passenger into the third database as a matching pair.
6. The method according to claim 1, wherein the performing object detection on the image information to obtain a head and shoulder detection frame of the passenger includes:
obtaining an image to be processed based on the image information;
downsampling the image to be processed;
upsampling selected target features obtained by the downsampling, to obtain a feature map corresponding to the image to be processed;
and performing classification prediction based on the feature map to obtain the head-shoulder detection frame.
7. The method according to claim 1, wherein the obtaining the target frame to be tracked based on the head-shoulder detection frame comprises:
determining an m×m square frame centered on the center of the head-shoulder detection frame, wherein m is a positive integer not greater than 20;
and taking the m×m square frame as the target frame to be tracked.
8. A passenger flow statistics apparatus, the apparatus comprising:
the image acquisition module is used for acquiring image information of a preset image acquisition area during a period in which the vehicle stops at a station;
the target detection module is used for performing target detection on the image information to obtain a head-shoulder detection frame of the passenger;
the target tracking module is used for obtaining a target frame to be tracked based on the head-shoulder detection frame;
the target tracking module is further used for performing optical flow tracking on the passenger based on the target frame to be tracked, to obtain a motion trail of the passenger;
The passenger flow statistics module is used for determining the number of boarding and disembarking persons at the station according to the motion trail and a plurality of preset tripwires;
the plurality of tripwires comprises a first tripwire, a second tripwire, a third tripwire and a fourth tripwire; the passenger flow statistics module is also used for:
extracting, from the image information, a first target image of a boarding passenger passing the second tripwire;
generating first identification information for the first target image in a first database, and extracting a first head-shoulder feature of the boarding passenger;
storing the first target image, the first identification information, the first head-shoulder feature and the boarding station in an associated manner;
extracting, from the image information, a second target image of an alighting passenger passing the fourth tripwire;
generating second identification information for the second target image in a second database, and extracting a second head-shoulder feature of the alighting passenger;
storing the second target image, the second identification information, the second head-shoulder feature and the alighting station in an associated manner;
in the case that the second identification information exists in the second database, matching the second head-shoulder feature corresponding to the second identification information against the first head-shoulder features in the first database to determine a first target boarding passenger;
in the case that the first identification information corresponding to the first target boarding passenger does not exist in a third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as a matching pair;
and, in the case that the first identification information corresponding to the first target boarding passenger exists in the third database, storing the second identification information and the first identification information corresponding to the first target boarding passenger into the third database as a matching pair, so as to replace the original matching pair to which the first identification information corresponding to the first target boarding passenger belongs in the third database.
9. An electronic device comprising an input device and an output device, further comprising:
A processor adapted to implement one or more instructions; and
A computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the method of any one of claims 1-7.
10. A computer storage medium storing one or more instructions adapted to be loaded by a processor and to perform the method of any one of claims 1-7.
CN202011220311.0A 2020-11-04 2020-11-04 Passenger flow statistics method and device, electronic equipment and storage medium Active CN112434566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011220311.0A CN112434566B (en) 2020-11-04 2020-11-04 Passenger flow statistics method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112434566A CN112434566A (en) 2021-03-02
CN112434566B (en) 2024-05-07

Family

ID=74695452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011220311.0A Active CN112434566B (en) 2020-11-04 2020-11-04 Passenger flow statistics method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112434566B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408587B (en) * 2021-05-24 2022-06-03 支付宝(杭州)信息技术有限公司 Bus passenger OD matching method and device and electronic equipment
JP2022185837A (en) * 2021-06-03 2022-12-15 クラスメソッド株式会社 Management server and management method for managing commodity products in unmanned store
CN113971784A (en) * 2021-10-28 2022-01-25 北京市商汤科技开发有限公司 Passenger flow statistical method and device, computer equipment and storage medium
CN114495491A (en) * 2021-12-31 2022-05-13 深圳云天励飞技术股份有限公司 Method and device for determining cross-line travel route, computer equipment and medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049787A (en) * 2011-10-11 2013-04-17 汉王科技股份有限公司 People counting method and system based on head and shoulder features
CN105139425A (en) * 2015-08-28 2015-12-09 浙江宇视科技有限公司 People counting method and device
CN105512720A (en) * 2015-12-15 2016-04-20 广州通达汽车电气股份有限公司 Public transport vehicle passenger flow statistical method and system
CN105844234A (en) * 2016-03-21 2016-08-10 商汤集团有限公司 People counting method and device based on head shoulder detection
WO2017156772A1 (en) * 2016-03-18 2017-09-21 深圳大学 Method of computing passenger crowdedness and system applying same
CN108241844A (en) * 2016-12-27 2018-07-03 北京文安智能技术股份有限公司 A kind of public traffice passenger flow statistical method, device and electronic equipment
CN109697499A (en) * 2017-10-24 2019-04-30 北京京东尚科信息技术有限公司 Pedestrian's flow funnel generation method and device, storage medium, electronic equipment
WO2019242672A1 (en) * 2018-06-22 2019-12-26 杭州海康威视数字技术股份有限公司 Method, device and system for target tracking

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109711299A (en) * 2018-12-17 2019-05-03 北京百度网讯科技有限公司 Vehicle passenger flow statistical method, device, equipment and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant