CN116503789B - Bus passenger flow detection method, system and equipment integrating track and scale - Google Patents

Bus passenger flow detection method, system and equipment integrating track and scale

Info

Publication number
CN116503789B
CN116503789B (application CN202310747361.1A)
Authority
CN
China
Prior art keywords
getting
passenger
bus
video frame
passengers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310747361.1A
Other languages
Chinese (zh)
Other versions
CN116503789A (en)
Inventor
戚湧
周竹萍
李卫
欧阳墨蓝
刘洋
汤睿尧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology
Priority to CN202310747361.1A
Publication of CN116503789A
Application granted
Publication of CN116503789B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Electromagnetism (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a bus passenger flow detection method, system and device integrating track and scale, belonging to the technical field of intelligent transportation. The method comprises the following steps: when a vehicle arrives at a stop, the processing terminal receives a door opening signal, receives the passenger getting-off video frames acquired by a fisheye lens, and detects and extracts the head track data and head pixel scale change data of the passengers getting off at that stop; the processing terminal acquires, in real time, a binary digital signal related to the number of getting-off passengers; after receiving the door closing signal, the processing terminal inputs the detected head track data, the head pixel scale change data and the binary digital signal into a trained machine learning classification model, outputs the predicted number of getting-off passengers at the stop, and sends it to the bus cloud platform. The detected bus passenger flow results have high reliability and accuracy.

Description

Bus passenger flow detection method, system and equipment integrating track and scale
Technical Field
The application belongs to the technical field of road traffic simulation, and particularly relates to a method, system and device for detecting getting-off bus passenger flow by integrating track and scale.
Background
With the development of related technologies, buses are increasingly equipped with various sensors to detect bus passenger flow. However, a single sensor often has inherent drawbacks and struggles to meet detection requirements. A multi-sensor fusion system processes and synthesizes data from several sensors to obtain more accurate and reliable information and results; information from multiple sensors offers better fault tolerance and reliability than a system employing only a single sensor. Therefore, how to exploit the complementary advantages of multiple sensors to increase detection accuracy and reliability has become an important problem in the field of bus passenger flow detection.
Traditional bus passenger flow detection methods use sensors such as gravity sensors, infrared sensors and pressure pedals to detect passengers getting on or off the bus. Infrared sensors easily miss detections when several passengers pass through at the same time; gravity sensors are too imprecise; pressure sensors are fragile and struggle when many passengers board or alight simultaneously; and Bluetooth detectors and Wi-Fi probes cannot detect passengers whose Bluetooth or Wi-Fi is switched off. At present cameras are also used for detection, but picture quality and occlusion limit their accuracy.
Disclosure of Invention
The application aims to provide a bus passenger flow detection method, system and device integrating track and scale whose detected bus passenger flow results have high reliability and accuracy.
Specifically, in one aspect, the application provides a bus passenger flow detection method integrating tracks and scales, comprising the following steps:
when a vehicle arrives at a bus stop, a processing terminal receives a door opening signal and receives the passenger getting-off video frames acquired by a fisheye lens arranged near the getting-off door; it detects and tracks the heads of getting-off passengers through a target detection algorithm and a tracking algorithm, extracts the head track data and head pixel scale change data of the getting-off passengers at the bus stop, and obtains the position coordinates and pixel scale change data of each getting-off passenger's head detection frame over the time period from the start to the end of detection;
the processing terminal acquires, in real time, the binary digital signal related to the number of getting-off passengers output by a microwave radar module arranged near the getting-off door;
after receiving the door closing signal, the processing terminal inputs the detected head track data of the passengers getting off, the detected head pixel scale change data of the passengers getting off and the binary digital signals related to the number of the passengers getting off into a trained machine learning classification model to predict the passenger flow of the passengers getting off and output the number of the passengers getting off at the bus stop;
and the processing terminal sends the number of passengers getting off at the bus stop to the bus cloud platform.
Further, the bus passenger flow detection method integrating track and scale further comprises: tracking each passenger detected in the passenger getting-off video frames acquired by the fisheye lens, and re-numbering the getting-off passenger head detection frames 1, 2, …, n according to the order in which they leave the fisheye lens video frame picture, where n represents the number of all getting-off passenger head detection frames.
Further, the bus passenger flow detection method integrating the track and the scale further comprises the following steps:
outputting, for the getting-off passenger head detection frames inside the fisheye lens video frame picture, the position coordinates and pixel scale of the head detection frame corresponding to each number, and sorting them by number;
the position coordinates of the getting-off passenger head detection frames, referenced to the detection frame center points, are denoted {(x_1, y_1), …, (x_n, y_n)}, and the pixel scales of the getting-off passenger head detection frames are denoted {(w_1, h_1), …, (w_n, h_n)}, where n represents the number of all getting-off passenger head detection frames and w and h represent the width and height of a detection frame, respectively;
for a head detection frame of a passenger getting off the vehicle, which is positioned outside a fisheye lens video frame picture, after the passenger getting off the vehicle leaves the fisheye lens video frame picture, setting the position coordinate of the head detection frame of the passenger getting off as a first fixed position on the fisheye lens video frame picture, and setting the pixel scale as a first fixed scale;
for a head detection frame of a get-off passenger which does not temporarily appear in the fisheye lens video frame picture, setting the position coordinate of the head detection frame of the get-off passenger as a second fixed position on the fisheye lens video frame picture, and setting the pixel scale as a second fixed scale.
Further, the first fixed position is the center point (0.5X_max, 0) of the lower boundary of the fisheye lens video frame picture, and the first fixed scale is (0.5X_max, 0.5Y_max), where X_max represents the fisheye lens video frame picture width and Y_max represents the fisheye lens video frame picture height.
Further, the second fixed position is the center point (0.5X_max, Y_max) of the upper boundary of the fisheye lens video frame picture, and the second fixed scale is (0, 0), where X_max represents the fisheye lens video frame picture width and Y_max represents the fisheye lens video frame picture height.
Further, the target detection algorithm adopts an improved YOLOv7 algorithm, which specifically comprises the following steps:
S101, adding a convolution layer with a stride of 2 before the last convolution layer of the YOLOv7 algorithm, halving the scale of the feature map; adding an up-sampling layer after this convolution layer, changing the scale of the feature map back to the original size;
S102, splicing and fusing the feature map obtained in the previous step with the feature map obtained before the added convolution layer, using a concat operation;
S103, adding a convolution layer and an activation function to the fused feature map, and normalizing.
Further, the trained machine learning classification model is trained by the following process:
acquiring the head movement track and pixel scale change data of getting-off passengers in the passenger getting-off video frames at each bus stop through a fisheye lens arranged near the getting-off door of the bus; acquiring the binary digital signal through a microwave radar sensing module arranged near the getting-off door; having a scheduler ride the bus and record the number of getting-off passengers at each bus stop. The number of getting-off passengers at each bus stop is taken as the target value; the head movement track data and pixel scale change data detected in the passenger getting-off video frames of each bus stop are spliced with the binary digital signal data output by the microwave radar sensing module, in a specified order, as the feature values. All data are normalized, the features are then reduced to a specified dimension using principal component analysis, and the target values and dimension-reduced feature values are input into a machine learning classifier for training, obtaining the trained machine learning classification model.
On the other hand, the application also provides a bus passenger flow detection system integrating track and scale, which implements the above bus passenger flow detection method integrating track and scale and comprises a signal acquisition module, a fisheye lens, a microwave radar sensing module, a processing terminal, a communication module and a positioning module;
the signal acquisition module is arranged at the getting-off door of the bus and is used for detecting a door opening signal or a door closing signal; when the signal acquisition module acquires a door opening signal or a door closing signal, it transmits the signal to the processing terminal;
the fish-eye lens is arranged near a get-off door of the bus and is used for acquiring a get-off video frame of a passenger;
the microwave radar sensing module is arranged near the getting-off door of the bus; when getting-off passengers are near the door, it amplifies the electromagnetic wave signals of the getting-off passengers sensed by the microwave radar sensor, converts them into square wave signals through a comparison circuit, and outputs the binary digital signal related to the number of getting-off passengers;
the processing terminal, when receiving the door opening signal or the door closing signal, predicts the getting-off passenger flow according to the passenger getting-off video frames acquired from the fisheye lens and the binary digital signal related to the number of getting-off passengers output by the microwave radar sensing module, outputs the predicted number of getting-off passengers, and sends the number of getting-off passengers to the bus cloud platform through the communication module;
and the positioning module acquires the position information of the current bus station and sends the position information to the bus cloud platform through the communication module.
In still another aspect, the present application further provides a bus passenger flow detection device integrating a track and a scale, where the device includes a memory and a processor; the memory stores a computer program for realizing a bus passenger flow detection method integrating tracks and scales, and the processor executes the computer program to realize the steps of the method.
In yet another aspect, the present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the above method.
The bus passenger flow detection method, system and equipment integrating the track and the scale have the following beneficial effects:
compared with the traditional method for detecting the passenger flow directly according to the track, the method, the system and the equipment for detecting the passenger flow under the bus, which are integrated with the track and the scale, combine the characteristics of the fish-eye lens, apply the influence of the distance of the fish-eye camera on the image formation of the picture, take the scale data as the characteristic data, and utilize the principle that the correlation exists between the motion track of the head of the passenger under the bus and the scale of the pixels of the head in the fish-eye lens, so as to apply the method to the field of detecting the passenger flow under the bus for detecting the passenger flow, avoid the condition that the target detection track is lost or abnormal, and solve the problem that the traditional method excessively depends on the track detection precision.
The bus passenger flow detection method, system and device integrating track and scale of the application use the microwave radar data as one of the feature data, solving the problem that a camera, as a single sensor, cannot detect passengers when poor lighting blurs the video picture or the picture is blocked.
The bus passenger flow detection method, system and device integrating track and scale of the application take the getting-off passengers' head motion tracks and head pixel scales acquired by the fisheye lens, together with the simultaneous binary digital signal of the microwave radar sensor, as input values, and adopt a machine learning classification method to output the number of getting-off passengers at the current bus stop. This improves the robustness of video-based passenger flow detection, avoids the external environmental interference that traditional single devices suffer when detecting getting-off passenger flow, and improves the accuracy of getting-off passenger flow detection.
In the bus passenger flow detection method, system and device integrating track and scale of the application, the fisheye lens and the microwave sensor are arranged near the getting-off door; the processing terminal on the bus acquires and analyzes their data in real time and sends the generated bus passenger flow data to the bus cloud platform, which facilitates the intellectualization of the bus system.
Drawings
FIG. 1 is a schematic diagram of the system components of an embodiment of the present application.
Fig. 2 is a flow chart of an embodiment of the present application.
Fig. 3 is a schematic diagram of a fisheye lens detection parameter according to an embodiment of the application.
FIG. 4 is a schematic diagram of a target detection algorithm according to an embodiment of the present application.
Detailed Description
The application is described in further detail below with reference to the examples and with reference to the accompanying drawings.
An embodiment of the application is a bus passenger flow detection system integrating track and scale, which realizes bus passenger flow detection and, as shown in FIG. 1, comprises a signal acquisition module, a fisheye lens, a microwave radar sensing module, a processing terminal, a positioning module and a communication module. Wherein:
the signal acquisition module is arranged at the lower door of the bus and is used for detecting a door opening signal or a door closing signal. When the signal acquisition module acquires a door opening signal or a door closing signal, the door opening signal or the door closing signal is transmitted to the processing terminal.
The fisheye lens is arranged near the getting-off door of the bus and is used for acquiring passenger getting-off video frames. Because the fisheye lens is mounted near the getting-off door, as a passenger gets closer to the door while alighting, the image of the passenger's head in the fisheye view grows larger and larger, until the passenger leaves the bus.
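This monotonic size-distance relation can be illustrated with a toy pinhole-style model. This is an assumption for intuition only: a real fisheye lens follows a different projection (e.g. the equidistant model r = f·θ), but the apparent head size still grows as the head approaches the lens, which is the property the method exploits.

```python
# Toy illustration (assumed numbers): apparent head width in pixels grows as the
# passenger nears the lens, so pixel scale carries distance information.
focal_px = 600.0      # assumed focal length in pixels
head_width_m = 0.18   # assumed physical head width in metres
for d in (2.0, 1.0, 0.5):                 # distance to the lens in metres
    w = focal_px * head_width_m / d       # pinhole approximation: w = f * W / d
    print(f"distance {d} m -> head width {w:.0f} px")
```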
The microwave radar sensing module is arranged near the getting-off door of the bus. When a getting-off passenger is near the door, the module amplifies the tiny electromagnetic wave signal of the getting-off passenger sensed by the microwave radar sensor, converts it into a square wave signal through a comparison circuit, and outputs a binary digital signal related to the number of getting-off passengers.
When receiving the door opening signal or the door closing signal, the processing terminal predicts the getting-off passenger flow according to the passenger getting-off video frames obtained from the fisheye lens and the binary digital signal related to the number of getting-off passengers output by the microwave radar sensing module, outputs the predicted number of getting-off passengers, and sends it to the bus cloud platform through the communication module.
The positioning module acquires the position information of the current bus stop and transmits it to the bus cloud platform through the communication module.
The bus getting-off passenger flow detection method integrating track and scale of this embodiment, as shown in FIG. 2, performs one round of getting-off passenger flow detection at every bus stop and comprises the following steps:
s1, when a vehicle arrives at a bus stop, a get-off door is opened, a processing terminal receives a door opening signal, then receives a passenger get-off video frame acquired by a fisheye lens arranged near the get-off door, and the processing terminal detects and tracks the head of the passenger getting off through a target detection and tracking algorithm (such as an improved YOLOv7 target detection and Deep Sort tracking algorithm), extracts head track data and head pixel scale change data of the passenger getting off at the bus stop, and obtains position coordinates and pixel scale change data of a head detection frame of each passenger getting off in a time period from the beginning to the end of detection. The getting-off passenger head track data refers to the change data of the position coordinates of the getting-off passenger head detection frame in each frame of the passenger getting-off video. The getting-off passenger head pixel scale change data refers to the change data of the pixel scale of the getting-off passenger head detection frame in each frame of the getting-off video of the passenger. The specific method comprises the following steps:
preferably, in another embodiment, in order to facilitate statistics of detection data, each passenger detected in each video frame is tracked, and the head detection frames of the passengers getting off are numbered again according to the sequence in which the head detection frames of the passengers getting off leave the bottom of the video frame picture of the fish eye lens, and the numbers 1, 2, …, n and n in turn represent the numbers of the head detection frames of all passengers getting off, such as ID1, ID2, … and ID7 shown in fig. 3.
After detection starts, for the getting-off passenger head detection frames inside the fisheye lens video frame picture (such as ID2, ID3, ID4, ID5 and ID6), the position coordinates and pixel scale of the head detection frame corresponding to each number are output and sorted by number. Since the head detection frames are re-numbered according to the order in which they leave the bottom of the picture, the number order corresponds to the getting-off order. The position coordinates and pixel scales of the head detection frames are expressed with the lower-left corner of the picture as the coordinate origin and one pixel as the unit length. The position coordinates of the getting-off passenger head detection frames, referenced to the detection frame center points, are denoted {(x_1, y_1), …, (x_n, y_n)}, and their pixel scales are denoted {(w_1, h_1), …, (w_n, h_n)}, where n represents the number of all getting-off passenger head detection frames and w and h represent the width and height of a detection frame, respectively.
For a getting-off passenger head detection frame that lies outside the fisheye lens video frame picture, i.e. once that passenger (the detection frame with the smallest number in the current fisheye picture, the passenger who got off first, e.g. ID1 in FIG. 3) has left the picture, the position coordinate of that head detection frame is set to the first fixed position on the picture and its pixel scale to the first fixed scale.
Preferably, in another embodiment, after the passenger leaves the fisheye lens video frame picture, the position coordinate of that passenger's head detection frame is always set to the center point of the lower boundary of the picture. For example, with X_max denoting the fisheye lens video frame picture width and Y_max denoting the picture height, the position coordinate of ID1 is set to (0.5X_max, 0) and its pixel scale to (0.5X_max, 0.5Y_max).
For a getting-off passenger head detection frame that has not yet appeared in the fisheye lens video frame picture (e.g. ID7 in FIG. 3), the position coordinate of that head detection frame is set to a second fixed position on the picture and its pixel scale to a second fixed scale.
Preferably, in another embodiment, the position of a getting-off passenger head detection frame that has not yet appeared in the fisheye lens video frame picture is set to the center point of the upper boundary of the picture. For example, the position coordinate of ID7 is set to (0.5X_max, Y_max) and its pixel scale to (0, 0).
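A small sketch of this fixed-position convention follows; the frame size, dictionary layout and field names are illustrative assumptions.

```python
X_MAX, Y_MAX = 1280, 720  # assumed fisheye video frame picture width and height in pixels

def box_at(track, frame_idx):
    """Return ((x, y), (w, h)) for one numbered head track at one video frame index."""
    if frame_idx > track["last_frame"]:     # passenger already left the picture (e.g. ID1)
        return (0.5 * X_MAX, 0.0), (0.5 * X_MAX, 0.5 * Y_MAX)   # first fixed position/scale
    if frame_idx < track["first_frame"]:    # head has not yet appeared (e.g. ID7)
        return (0.5 * X_MAX, Y_MAX), (0.0, 0.0)                 # second fixed position/scale
    return track["boxes"][frame_idx - track["first_frame"]]     # observed centre and scale
```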
S2, the microwave radar sensing module amplifies the sensed tiny electromagnetic wave signals of the getting-off passengers, converts them into square wave signals, and outputs a binary digital signal, correlated with the number of getting-off passengers, to the processing terminal in real time. The processing terminal thus acquires, in real time, the binary digital signal related to the number of getting-off passengers output by the microwave radar module arranged near the getting-off door.
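A toy sketch of the comparator stage is shown below; the threshold value and digital sampling are assumptions, since the actual comparison circuit is analog hardware.

```python
import numpy as np

def to_binary_signal(amplified_samples: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Square up amplified radar echo samples: 1 while above the threshold, else 0."""
    return (amplified_samples > threshold).astype(np.uint8)

print(to_binary_signal(np.array([0.1, 0.7, 0.9, 0.2])))  # -> [0 1 1 0]
```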
S3, when the vehicle door closes, the signal acquisition module sends a door closing signal to the processing terminal. After receiving the door closing signal, the processing terminal inputs the detected getting-off passenger head track data, head pixel scale change data and the binary digital signal related to the number of getting-off passengers into a trained machine learning classification model, such as XGBoost, random forest, logistic regression or SVM, predicts the getting-off passenger flow, and outputs the number of getting-off passengers at the bus stop.
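A hedged sketch of this door-close step, assuming the trained pipeline `model` from the training sketch later in this description and arrays holding this stop's padded tracks, scales and radar bits (all names are illustrative):

```python
import numpy as np

# Features are spliced in the same fixed order used at training time:
# head tracks, then head pixel scales, then the radar bit stream.
features = np.concatenate([np.ravel(tracks), np.ravel(scales), np.ravel(radar_bits)])
count = int(model.predict(features.reshape(1, -1))[0])  # predicted getting-off passengers
```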
S4, the processing terminal sends the number of getting-off passengers at the bus stop to the bus cloud platform.
Preferably, in another embodiment, the bus getting-off passenger flow detection method integrating track and scale improves the YOLOv7 algorithm in view of the imaging scale characteristics of the fisheye lens: a multi-scale feature fusion module is added in the feature processing layer, and the features extracted by the YOLOv7 algorithm are fused and then detected in layers, so that passenger heads of different scales can be detected and the detection precision of head targets under the fisheye lens is improved. The target detection algorithm in this embodiment adopts the improved YOLOv7 algorithm shown in FIG. 4, with the following steps (a code sketch follows the steps):
S101, adding a convolution layer with a stride of 2 before the last convolution layer of the YOLOv7 algorithm, halving the scale of the feature map; an up-sampling layer is added after this convolution layer to change the scale of the feature map back to the original size.
S102, splicing and fusing the feature map obtained in the previous step with the feature map obtained before the added convolution layer, using a concat operation.
S103, adding a convolution layer and an activation function to the fused feature map, and normalizing.
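A minimal PyTorch sketch of this fusion block is given below. The channel counts, 3x3 kernel, nearest-neighbor upsampling and SiLU activation are illustrative assumptions; the text only specifies the stride-2 convolution, the upsampling layer, the concatenation, and a final convolution with activation and normalization.

```python
import torch
import torch.nn as nn

class MultiScaleFusion(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # S101: a stride-2 convolution halves the feature-map scale ...
        self.down = nn.Conv2d(channels, channels, kernel_size=3, stride=2, padding=1)
        # ... and an upsampling layer restores the original size (assumes even H and W).
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        # S103: a convolution and activation on the fused map, with normalization.
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
            nn.SiLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.up(self.down(x))        # S101: down then back up
        y = torch.cat([x, y], dim=1)     # S102: concat with the pre-convolution feature map
        return self.fuse(y)              # S103: fused, activated, normalized features
```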
Preferably, in another embodiment, the trained machine learning classification model is trained by the following process.
Acquiring the head movement track and pixel scale change data of getting-off passengers in the passenger getting-off video frames at each bus stop through the fisheye lens arranged near the getting-off door of the bus; acquiring the binary digital signal through the microwave radar sensing module arranged near the getting-off door; having a scheduler ride the bus and record the number of getting-off passengers at each bus stop. The number of getting-off passengers at each bus stop is taken as the target value; the head movement track data and pixel scale change data detected in the passenger getting-off video frames of each bus stop are spliced with the binary digital signal data output by the microwave radar sensing module, in a specified order, as the feature values. All data are normalized, the features are then reduced to a specified dimension (such as 500 dimensions) using principal component analysis, and the target values and dimension-reduced feature values are input into a machine learning classifier, such as XGBoost, random forest, logistic regression or SVM, for training, obtaining the trained machine learning classification model.
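An end-to-end sketch of this training procedure under stated assumptions: synthetic stand-in data replace the recorded stops, scikit-learn supplies the normalization and principal component analysis, and a random forest stands in for any of the listed classifiers. The PCA dimension is capped by the toy sample size here; on real data the text suggests around 500 dimensions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

def build_features(tracks, scales, radar_bits):
    """Splice per-stop features in the specified order: tracks, scales, radar signal."""
    return np.concatenate([np.ravel(tracks), np.ravel(scales), np.ravel(radar_bits)])

rng = np.random.default_rng(0)
# Synthetic stand-in for recorded stops: 40 stops, 10 numbered tracks x 60 frames, 200 radar bits.
stops = [(rng.random((10, 60, 2)), rng.random((10, 60, 2)), rng.integers(0, 2, 200))
         for _ in range(40)]
X = np.stack([build_features(t, s, r) for t, s, r in stops])
y = rng.integers(0, 10, size=40)  # scheduler-recorded getting-off counts (target values)

model = make_pipeline(
    MinMaxScaler(),                             # normalize all the data
    PCA(n_components=min(500, len(X) - 1)),     # reduce features to the specified dimension
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X, y)
print(model.predict(X[:1]))                     # predicted getting-off count for one stop
```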
Compared with the prior art, the application has the remarkable advantages that:
(1) Compared with the traditional method of detecting passenger flow directly from tracks, the method combines the characteristics of the fisheye lens and takes the scale data as feature data, avoiding the traditional method's excessive dependence on track detection precision.
(2) The microwave radar data are used as one of the feature data, solving the problem that a camera as a single sensor cannot detect passengers when light is poor or the picture is blocked.
(3) The getting-off passengers' head motion tracks and head pixel scales acquired by the fisheye lens, together with the simultaneous binary digital signal of the microwave radar sensor, are used as input values, and a machine learning classification method outputs the number of getting-off passengers at the current bus stop, which improves the robustness of video-based passenger flow detection, avoids the external environmental interference suffered by traditional single devices, and improves the accuracy of getting-off passenger flow detection.
(4) The fisheye lens and the microwave sensor are arranged near the getting-off door; the processing terminal on the bus acquires and analyzes their data in real time and sends the generated bus passenger flow data to the bus cloud platform, which facilitates the intellectualization of the bus system.
In some embodiments, certain aspects of the techniques described above may be implemented by one or more processors of a processing system executing software. The software includes one or more sets of executable instructions stored or otherwise tangibly embodied on a non-transitory computer-readable storage medium. The software may include instructions and certain data that, when executed by one or more processors, operate the one or more processors to perform one or more aspects of the techniques described above. The non-transitory computer-readable storage medium may include, for example, a magnetic or optical disk storage device, a solid state storage device such as flash memory, cache, random access memory (RAM), or other non-volatile memory devices. Executable instructions stored on a non-transitory computer-readable storage medium may be in source code, assembly language code, object code, or other instruction format that is interpreted or otherwise executed by one or more processors.
A computer-readable storage medium may include any storage medium or combination of storage media that can be accessed by a computer system during use to provide instructions and/or data to the computer system. Such storage media may include, but are not limited to, optical media (e.g., compact disc (CD), digital versatile disc (DVD), Blu-ray disc), magnetic media (e.g., floppy disk, magnetic tape, or magnetic hard drive), volatile memory (e.g., random access memory (RAM) or cache), non-volatile memory (e.g., read-only memory (ROM) or flash memory), or microelectromechanical systems (MEMS) based storage media. The computer-readable storage medium may be embedded in a computing system (e.g., system RAM or ROM), fixedly attached to the computing system (e.g., a magnetic hard drive), removably attached to the computing system (e.g., an optical disc or Universal Serial Bus (USB) based flash memory), or coupled to the computer system via a wired or wireless network (e.g., network-accessible storage (NAS)).
While the application has been disclosed in terms of preferred embodiments, the embodiments are not intended to limit the application. Any equivalent changes or modifications made without departing from the spirit and scope of the present application are intended to fall within the scope of the present application. The scope of the application should therefore be determined by the following claims.

Claims (8)

1. A bus passenger flow detection method integrating track and scale, characterized by comprising the following steps:
when a vehicle arrives at a bus stop, a processing terminal receives a door opening signal and receives the passenger getting-off video frames acquired by a fisheye lens arranged near the getting-off door; the processing terminal detects the heads of getting-off passengers through a target detection algorithm and a tracking algorithm, tracks each passenger detected in the passenger getting-off video frames acquired by the fisheye lens, and re-numbers the getting-off passenger head detection frames 1, 2, …, n according to the order in which they leave the fisheye lens video frame picture, wherein n represents the number of all getting-off passenger head detection frames; for the getting-off passenger head detection frames inside the fisheye lens video frame picture, the position coordinates and pixel scale of the head detection frame corresponding to each number are output and sorted by number; the position coordinates of the getting-off passenger head detection frames, referenced to the detection frame center points, are denoted {(x_1, y_1), …, (x_n, y_n)}, and the pixel scales of the getting-off passenger head detection frames are denoted {(w_1, h_1), …, (w_n, h_n)}, wherein n represents the number of all getting-off passenger head detection frames and w and h represent the width and height of a detection frame, respectively; for a getting-off passenger head detection frame outside the fisheye lens video frame picture, after that passenger leaves the picture, the position coordinate of the head detection frame is set to a first fixed position on the fisheye lens video frame picture and its pixel scale to a first fixed scale; for a getting-off passenger head detection frame that has not yet appeared in the fisheye lens video frame picture, the position coordinate of the head detection frame is set to a second fixed position on the fisheye lens video frame picture and its pixel scale to a second fixed scale; the head track data and head pixel scale change data of the getting-off passengers at the bus stop are extracted, obtaining the position coordinates and pixel scale change data of each getting-off passenger's head detection frame over the time period from the start to the end of detection;
the processing terminal acquires, in real time, the binary digital signal related to the number of getting-off passengers output by a microwave radar module arranged near the getting-off door;
after receiving a door closing signal, the processing terminal inputs the detected getting-off passenger head track data, head pixel scale change data and the binary digital signal related to the number of getting-off passengers into a trained machine learning classification model, predicts the getting-off passenger flow, and outputs the number of getting-off passengers at the bus stop;
and the processing terminal sends the number of passengers getting off at the bus stop to the bus cloud platform.
2. The bus passenger flow detection method integrating track and scale according to claim 1, wherein the first fixed position is the center point (0.5X_max, 0) of the lower boundary of the fisheye lens video frame picture, and the first fixed scale is (0.5X_max, 0.5Y_max), wherein X_max represents the fisheye lens video frame picture width and Y_max represents the fisheye lens video frame picture height.
3. The bus passenger flow detection method integrating track and scale according to claim 1, wherein the second fixed position is the center point (0.5X_max, Y_max) of the upper boundary of the fisheye lens video frame picture, and the second fixed scale is (0, 0), wherein X_max represents the fisheye lens video frame picture width and Y_max represents the fisheye lens video frame picture height.
4. The bus passenger flow detection method integrating track and scale according to claim 1, wherein the target detection algorithm adopts an improved YOLOv7 algorithm, specifically comprising the following steps:
S101, adding a convolution layer with a stride of 2 before the last convolution layer of the YOLOv7 algorithm, halving the scale of the feature map; adding an up-sampling layer after this convolution layer, changing the scale of the feature map back to the original size;
S102, splicing and fusing the feature map obtained in the previous step with the feature map obtained before the added convolution layer, using a concat operation;
S103, adding a convolution layer and an activation function to the fused feature map, and normalizing.
5. The bus passenger flow detection method integrating track and scale according to claim 1, wherein the trained machine learning classification model is trained by the following process:
acquiring the head movement track and pixel scale change data of getting-off passengers in the passenger getting-off video frames at each bus stop through a fisheye lens arranged near the getting-off door of the bus; acquiring the binary digital signal through a microwave radar sensing module arranged near the getting-off door; having a scheduler ride the bus and record the number of getting-off passengers at each bus stop; taking the number of getting-off passengers at each bus stop as the target value, and splicing the head movement track data and pixel scale change data detected in the passenger getting-off video frames of each bus stop with the binary digital signal data output by the microwave radar sensing module, in a specified order, as the feature values; normalizing all the data, then reducing the features to a specified dimension using principal component analysis, and inputting the target values and the dimension-reduced feature values into a machine learning classifier for training, obtaining the trained machine learning classification model.
6. A bus passenger flow detection system integrating track and scale, characterized in that it implements the bus passenger flow detection method integrating track and scale according to any one of claims 1-5 and comprises a signal acquisition module, a fisheye lens, a microwave radar sensing module, a processing terminal, a communication module and a positioning module;
the signal acquisition module is arranged at the getting-off door of the bus and is used for detecting a door opening signal or a door closing signal; when the signal acquisition module acquires a door opening signal or a door closing signal, it transmits the signal to the processing terminal;
the fish-eye lens is arranged near a get-off door of the bus and is used for acquiring a get-off video frame of a passenger;
the microwave radar sensing module is arranged near the getting-off door of the bus; when getting-off passengers are near the door, it amplifies the electromagnetic wave signals of the getting-off passengers sensed by the microwave radar sensor, converts them into square wave signals through a comparison circuit, and outputs the binary digital signal related to the number of getting-off passengers;
the processing terminal, when receiving the door opening signal or the door closing signal, predicts the getting-off passenger flow according to the passenger getting-off video frames acquired from the fisheye lens and the binary digital signal related to the number of getting-off passengers output by the microwave radar sensing module, outputs the predicted number of getting-off passengers, and sends the number of getting-off passengers to the bus cloud platform through the communication module;
and the positioning module acquires the position information of the current bus station and sends the position information to the bus cloud platform through the communication module.
7. A bus passenger flow detection device integrating track and scale, characterized by comprising a memory and a processor; the memory stores a computer program for implementing the bus passenger flow detection method integrating track and scale, and the processor executes the computer program to implement the steps of the method according to any one of claims 1-5.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any of claims 1-5.
CN202310747361.1A 2023-06-25 2023-06-25 Bus passenger flow detection method, system and equipment integrating track and scale Active CN116503789B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310747361.1A CN116503789B (en) 2023-06-25 2023-06-25 Bus passenger flow detection method, system and equipment integrating track and scale

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310747361.1A CN116503789B (en) 2023-06-25 2023-06-25 Bus passenger flow detection method, system and equipment integrating track and scale

Publications (2)

Publication Number Publication Date
CN116503789A CN116503789A (en) 2023-07-28
CN116503789B 2023-09-05

Family

ID=87323404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310747361.1A Active CN116503789B (en) 2023-06-25 2023-06-25 Bus passenger flow detection method, system and equipment integrating track and scale

Country Status (1)

Country Link
CN (1) CN116503789B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105512720A (en) * 2015-12-15 2016-04-20 广州通达汽车电气股份有限公司 Public transport vehicle passenger flow statistical method and system
CN110633671A (en) * 2019-09-16 2019-12-31 天津通卡智能网络科技股份有限公司 Bus passenger flow real-time statistical method based on depth image
CN111239727A (en) * 2020-02-26 2020-06-05 深圳雷研技术有限公司 Passenger counting method and communication equipment
CN112560641A (en) * 2020-12-11 2021-03-26 北京交通大学 Video-based one-way passenger flow information detection method in two-way passenger flow channel
CN114694054A (en) * 2020-12-30 2022-07-01 深圳云天励飞技术股份有限公司 Bus stop passenger flow statistical method and device, electronic equipment and storage medium
CN114998819A (en) * 2022-04-24 2022-09-02 上海悠络客电子科技股份有限公司 Passenger flow statistical method, device, equipment and medium for multi-dimensional detection and tracking

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110950206B (en) * 2018-09-26 2022-08-02 奥的斯电梯公司 Passenger movement detection system, passenger movement detection method, passenger call control method, readable storage medium, and elevator system
KR102180615B1 (en) * 2019-08-06 2020-11-19 엘지전자 주식회사 Method for opening and closing door of vehicle for safe getting off of passenger

Also Published As

Publication number Publication date
CN116503789A (en) 2023-07-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant