CN112102290A - Passenger flow statistical method, system and computer readable storage medium - Google Patents


Info

Publication number
CN112102290A
CN112102290A (application CN202010970214.7A)
Authority
CN
China
Prior art keywords
passenger
getting
preset
obtaining
pixel points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010970214.7A
Other languages
Chinese (zh)
Inventor
罗宁
李达
吴建雄
张建强
黄育辉
莫志杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHENZHEN JIMI IOT Co.,Ltd.
Original Assignee
Guangzhou Jimi Wulian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Jimi Wulian Technology Co ltd filed Critical Guangzhou Jimi Wulian Technology Co ltd
Priority claimed from CN202010970214.7A
Publication of CN112102290A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30242 Counting objects in image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30268 Vehicle interior

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a passenger flow statistical method, a passenger flow statistical system and a computer readable storage medium. The passenger flow statistical method comprises the following steps: acquiring a video image frame and preprocessing it to obtain an image frame to be detected; calculating the image frame to be detected with a preset optical flow algorithm to obtain an optical flow prediction result image; and obtaining a passenger flow statistical result from the optical flow prediction result image. The invention can improve the accuracy of passenger flow statistics and reduce its cost.

Description

Passenger flow statistical method, system and computer readable storage medium
Technical Field
The invention relates to the technical field of image processing, in particular to a passenger flow statistical method, a passenger flow statistical system and a computer readable storage medium.
Background
With the rapid development of the economy and of science and technology, more and more places need to count passenger flow, such as public transport, shopping malls, scenic spots and other places with high passenger volume. In public transport management, real-time passenger flow data serve as the basis for reasonably scheduling vehicles and efficiently choosing departure intervals, thereby improving the utilization of public resources.
At present, passenger flow statistical methods include manual counting, IC (Integrated Circuit) card statistics, sensor-based detection, inter-frame difference, background difference, machine learning detection, and other approaches. Manual counting is costly. IC card statistics cannot count passengers who do not use an IC card, so their accuracy is low. The sensors required for sensor-based detection are expensive and inconvenient to maintain. Inter-frame difference and background difference depend excessively on the environment, which also lowers accuracy. Machine learning detection has high computing power requirements, resulting in high equipment costs. How to improve the accuracy of passenger flow statistics while reducing cost has therefore become an urgent problem.
Disclosure of Invention
The invention mainly aims to provide a passenger flow statistical method, a passenger flow statistical system and a computer readable storage medium that improve the accuracy of passenger flow statistics and reduce its cost.
In order to achieve the above object, the present invention provides a passenger flow statistical method, comprising the steps of:
acquiring a video image frame, and preprocessing the video image frame to obtain an image frame to be detected;
calculating the image frame to be detected according to a preset optical flow algorithm to obtain an optical flow prediction result image;
and obtaining a passenger flow statistical result according to the optical flow prediction result image.
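The three claimed steps can be sketched as a minimal end-to-end pipeline. The helper names and sizes below are hypothetical stand-ins, not from the patent, and the optical flow and counting stages are left as stubs that later sections fill in:

```python
import numpy as np

def preprocess(frame, size=(80, 60)):
    """Downscale and convert a color frame to single-channel grayscale (step 1)."""
    h, w = size[1], size[0]
    # crude nearest-neighbour resize via index sampling
    ys = np.linspace(0, frame.shape[0] - 1, h).astype(int)
    xs = np.linspace(0, frame.shape[1] - 1, w).astype(int)
    small = frame[ys][:, xs]
    return small.mean(axis=2).astype(np.uint8)  # average RGB channels -> gray

def compute_flow(prev_gray, cur_gray):
    """Stub for a dense optical flow algorithm (step 2); returns zero flow here."""
    return np.zeros(prev_gray.shape + (2,), dtype=np.float32)

def count_from_flow(flow):
    """Stub for the counting logic (step 3)."""
    return {"boarded": 0, "alighted": 0}

# usage: two synthetic 120x160 RGB frames
f0 = np.zeros((120, 160, 3), dtype=np.uint8)
f1 = np.zeros((120, 160, 3), dtype=np.uint8)
g0, g1 = preprocess(f0), preprocess(f1)
result = count_from_flow(compute_flow(g0, g1))
print(g0.shape, result)  # (60, 80) {'boarded': 0, 'alighted': 0}
```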
Optionally, the step of obtaining a passenger flow statistical result according to the optical flow prediction result image includes:
dividing the optical flow prediction result image into a boarding optical flow map and an alighting optical flow map according to the optical flow vectors of the optical flow prediction result image, a preset passenger movement area, a preset door-entry detection line and a preset door-exit detection line;
setting the optical flow prediction points of the current foreground pixels of the boarding optical flow map as boarding foreground pixels, and setting the optical flow prediction points of the current foreground pixels of the alighting optical flow map as alighting foreground pixels;
and obtaining a passenger flow statistical result according to the boarding foreground pixels and the alighting foreground pixels.
Optionally, the step of setting the optical flow prediction points of the current foreground pixels of the boarding optical flow map as boarding foreground pixels, and setting the optical flow prediction points of the current foreground pixels of the alighting optical flow map as alighting foreground pixels includes:
generating a movement optical flow map of the same size as the boarding optical flow map or the alighting optical flow map;
judging whether the optical flow prediction point of each current foreground pixel of the boarding optical flow map lies in the preset passenger movement area;
if the optical flow prediction points of the current foreground pixels of the boarding optical flow map lie in the preset passenger movement area, obtaining the boarding foreground pixels according to the movement optical flow map, the optical flow prediction points of the current foreground pixels of the boarding optical flow map and the boarding optical flow map;
clearing the movement optical flow map, and judging whether the optical flow prediction point of each current foreground pixel of the alighting optical flow map lies in the preset passenger movement area;
and if the optical flow prediction points of the current foreground pixels of the alighting optical flow map lie in the preset passenger movement area, obtaining the alighting foreground pixels according to the movement optical flow map, the optical flow prediction points of the current foreground pixels of the alighting optical flow map and the alighting optical flow map.
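As a rough illustration of the splitting and prediction-point steps above, the sketch below assumes a camera mounted above the door, so that boarding passengers move downward in the image (positive vertical flow) and alighting passengers upward. The masks, thresholds and helper names are illustrative assumptions, and the door-entry/exit detection line checks are omitted for brevity:

```python
import numpy as np

def split_flow(flow, move_mask):
    """Split a dense flow field (HxWx2 of (dx, dy)) into boarding/alighting masks
    by vertical direction, restricted to the passenger movement area."""
    dy = flow[..., 1]
    moving = (np.abs(flow).sum(axis=2) > 0.5) & move_mask  # foreground = noticeable motion
    board = moving & (dy > 0)    # assumed: boarding passengers move down in the image
    alight = moving & (dy < 0)   # assumed: alighting passengers move up
    return board, alight

def predicted_points(flow, fg_mask, move_mask):
    """Mark each foreground pixel's optical flow prediction point on an (initially
    empty) movement map, keeping only points that stay in the movement area."""
    h, w = fg_mask.shape
    movement_map = np.zeros((h, w), dtype=bool)  # cleared before each use
    ys, xs = np.nonzero(fg_mask)
    px = np.clip((xs + flow[ys, xs, 0]).round().astype(int), 0, w - 1)
    py = np.clip((ys + flow[ys, xs, 1]).round().astype(int), 0, h - 1)
    keep = move_mask[py, px]                     # prediction point inside movement area
    movement_map[py[keep], px[keep]] = True
    return movement_map

# usage: one pixel at (x=3, y=1) moving down by 2 rows
flow = np.zeros((6, 6, 2), dtype=np.float32)
flow[1, 3] = (0.0, 2.0)
area = np.ones((6, 6), dtype=bool)
board, alight = split_flow(flow, area)
pred = predicted_points(flow, board, area)
print(board[1, 3], pred[3, 3])  # True True
```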
Optionally, the step of obtaining the passenger flow statistical result according to the boarding foreground pixels and the alighting foreground pixels includes:
obtaining a boarding line-crossing foreground count and a boarding line-crossing ratio according to the boarding foreground pixels, a preset boarding line-crossing counting area and a preset passenger standing area, and obtaining an alighting line-crossing foreground count and an alighting line-crossing ratio according to the alighting foreground pixels, a preset alighting line-crossing counting area and the preset passenger standing area;
judging whether the boarding line-crossing foreground count is larger than a first preset threshold;
if the boarding line-crossing foreground count is larger than the first preset threshold, judging whether the boarding line-crossing ratio is larger than a second preset threshold;
if the boarding line-crossing ratio is larger than the second preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
judging whether the alighting line-crossing foreground count is larger than the first preset threshold;
if the alighting line-crossing foreground count is larger than the first preset threshold, judging whether the alighting line-crossing ratio is larger than the second preset threshold;
if the alighting line-crossing ratio is larger than the second preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count;
and obtaining a passenger flow statistical result according to the incremented passenger boarding count and the incremented passenger alighting count.
Optionally, the step of, if the boarding line-crossing ratio is larger than the second preset threshold, obtaining a passenger boarding count and adding one to the passenger boarding count includes:
if the boarding line-crossing ratio is larger than the second preset threshold, obtaining the area pixels of the preset passenger standing area;
obtaining a boarding area occupancy ratio according to the area pixels, the boarding foreground pixels and the preset passenger standing area;
judging whether the boarding area occupancy ratio is larger than a third preset threshold;
if the boarding area occupancy ratio is larger than the third preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
the step of, if the alighting line-crossing ratio is larger than the second preset threshold, obtaining a passenger alighting count and adding one to the passenger alighting count includes:
if the alighting line-crossing ratio is larger than the second preset threshold, obtaining the area pixels of the preset passenger standing area;
obtaining an alighting area occupancy ratio according to the area pixels, the alighting foreground pixels and the preset passenger standing area;
judging whether the alighting area occupancy ratio is larger than the third preset threshold;
and if the alighting area occupancy ratio is larger than the third preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count.
Optionally, the step of, if the boarding area occupancy ratio is larger than the third preset threshold, obtaining a passenger boarding count and adding one to the passenger boarding count includes:
if the boarding area occupancy ratio is larger than the third preset threshold, obtaining the total number of boarding blank rows according to the boarding optical flow map and the boarding foreground pixels;
judging whether the total number of boarding blank rows is smaller than a fourth preset threshold;
if the total number of boarding blank rows is smaller than the fourth preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
the step of, if the alighting area occupancy ratio is larger than the third preset threshold, obtaining a passenger alighting count and adding one to the passenger alighting count includes:
if the alighting area occupancy ratio is larger than the third preset threshold, obtaining the total number of alighting blank rows according to the alighting optical flow map and the alighting foreground pixels;
judging whether the total number of alighting blank rows is smaller than the fourth preset threshold;
and if the total number of alighting blank rows is smaller than the fourth preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count.
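The four-gate cascade described in the steps above (line-crossing foreground count, line-crossing ratio, area occupancy ratio, blank-row total) can be condensed into a single sketch. All threshold values here are illustrative assumptions, not values specified by the patent:

```python
def should_count(fg_count, line_ratio, area_ratio, blank_rows,
                 t1=30, t2=0.4, t3=0.25, t4=5):
    """Return True when all four gates pass, i.e. one passenger event is counted.

    fg_count:   foreground pixels touching the counting area (must exceed t1)
    line_ratio: fraction of the detection line covered by foreground (must exceed t2)
    area_ratio: foreground share of the passenger standing area (must exceed t3)
    blank_rows: empty rows in the flow map; a real passenger leaves few (must be below t4)
    """
    return fg_count > t1 and line_ratio > t2 and area_ratio > t3 and blank_rows < t4

# usage: one boarding event passes all gates, a noisy blob fails the first gate
boarding_count = 0
if should_count(fg_count=55, line_ratio=0.6, area_ratio=0.3, blank_rows=2):
    boarding_count += 1   # all gates pass: count one boarding passenger
print(boarding_count)  # 1
```

Chaining the gates in this order means the cheap pixel-count check rejects most noise before the ratio computations run.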
Optionally, the step of acquiring a video image frame and preprocessing it to obtain an image frame to be detected includes:
acquiring a video image frame;
and reducing the video image frame according to a preset size, taking the reduced image frame as the image frame to be detected.
Optionally, the step of reducing the video image frame according to a preset size and taking the reduced image frame as the image frame to be detected includes:
reducing the video image frame according to a preset size;
and converting the reduced image frame into a single-channel grayscale image, taking the single-channel grayscale image as the image frame to be detected.
In addition, to achieve the above object, the present invention further provides a passenger flow statistics system, including: a memory, a processor and a passenger flow statistics program stored on said memory and executable on said processor, said passenger flow statistics program when executed by said processor implementing the steps of the passenger flow statistics method as described above.
Furthermore, to achieve the above object, the present invention further provides a computer readable storage medium, having a passenger flow statistics program stored thereon, which when executed by a processor implements the steps of the passenger flow statistics method as described above.
The invention provides a passenger flow statistical method, a passenger flow statistical system and a computer readable storage medium. A video image frame is acquired and preprocessed to obtain an image frame to be detected; the image frame to be detected is calculated with a preset optical flow algorithm to obtain an optical flow prediction result image; and a passenger flow statistical result is obtained from the optical flow prediction result image. The video image frame is acquired by the monitoring camera a public transport vehicle already carries, so no additional equipment cost is incurred. The frame is then preprocessed into an image frame to be detected that carries less information, which reduces the amount of calculation and allows the method to run on low-cost equipment. Finally, because the statistics are based on an optical flow method, no machine learning is needed, i.e. no sample data are needed to train a classifier, which further reduces the amount of calculation; and since the optical flow method does not depend excessively on the environment, the accuracy of the statistics can be improved. The passenger flow statistical method can therefore improve the accuracy of passenger flow statistics while reducing cost.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of a passenger flow statistics method according to the present invention;
FIG. 3 is a flow chart illustrating a second embodiment of a passenger flow statistics method according to the present invention;
FIG. 4 is a flowchart illustrating a detailed process of step S32 in the method for counting passenger flow according to the present invention;
FIG. 5 is a flow chart illustrating a third embodiment of a passenger flow statistics method according to the present invention;
FIG. 6 is a flowchart illustrating a passenger flow statistics method according to a fourth embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal in the embodiment of the present invention is a passenger flow statistics device, and the passenger flow statistics device may be a terminal device having a processing function, such as a microcomputer, a Personal Computer (PC), a notebook computer, and a server.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU (Central Processing Unit), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and may optionally also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory); it may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a passenger flow statistics program.
In the terminal shown in fig. 1, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005 and perform the following operations:
acquiring a video image frame, and preprocessing the video image frame to obtain an image frame to be detected;
calculating the image frame to be detected according to a preset optical flow algorithm to obtain an optical flow prediction result image;
and obtaining a passenger flow statistical result according to the optical flow prediction result image.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
dividing the optical flow prediction result image into a boarding optical flow map and an alighting optical flow map according to the optical flow vectors of the optical flow prediction result image, a preset passenger movement area, a preset door-entry detection line and a preset door-exit detection line;
setting the optical flow prediction points of the current foreground pixels of the boarding optical flow map as boarding foreground pixels, and setting the optical flow prediction points of the current foreground pixels of the alighting optical flow map as alighting foreground pixels;
and obtaining a passenger flow statistical result according to the boarding foreground pixels and the alighting foreground pixels.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
generating a movement optical flow map of the same size as the boarding optical flow map or the alighting optical flow map;
judging whether the optical flow prediction point of each current foreground pixel of the boarding optical flow map lies in the preset passenger movement area;
if the optical flow prediction points of the current foreground pixels of the boarding optical flow map lie in the preset passenger movement area, obtaining the boarding foreground pixels according to the movement optical flow map, the optical flow prediction points of the current foreground pixels of the boarding optical flow map and the boarding optical flow map;
clearing the movement optical flow map, and judging whether the optical flow prediction point of each current foreground pixel of the alighting optical flow map lies in the preset passenger movement area;
and if the optical flow prediction points of the current foreground pixels of the alighting optical flow map lie in the preset passenger movement area, obtaining the alighting foreground pixels according to the movement optical flow map, the optical flow prediction points of the current foreground pixels of the alighting optical flow map and the alighting optical flow map.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
obtaining a boarding line-crossing foreground count and a boarding line-crossing ratio according to the boarding foreground pixels, a preset boarding line-crossing counting area and a preset passenger standing area, and obtaining an alighting line-crossing foreground count and an alighting line-crossing ratio according to the alighting foreground pixels, a preset alighting line-crossing counting area and the preset passenger standing area;
judging whether the boarding line-crossing foreground count is larger than a first preset threshold;
if the boarding line-crossing foreground count is larger than the first preset threshold, judging whether the boarding line-crossing ratio is larger than a second preset threshold;
if the boarding line-crossing ratio is larger than the second preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
judging whether the alighting line-crossing foreground count is larger than the first preset threshold;
if the alighting line-crossing foreground count is larger than the first preset threshold, judging whether the alighting line-crossing ratio is larger than the second preset threshold;
if the alighting line-crossing ratio is larger than the second preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count;
and obtaining a passenger flow statistical result according to the incremented passenger boarding count and the incremented passenger alighting count.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
if the boarding line-crossing ratio is larger than the second preset threshold, obtaining the area pixels of the preset passenger standing area;
obtaining a boarding area occupancy ratio according to the area pixels, the boarding foreground pixels and the preset passenger standing area;
judging whether the boarding area occupancy ratio is larger than a third preset threshold;
if the boarding area occupancy ratio is larger than the third preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
the step of, if the alighting line-crossing ratio is larger than the second preset threshold, obtaining a passenger alighting count and adding one to the passenger alighting count includes:
if the alighting line-crossing ratio is larger than the second preset threshold, obtaining the area pixels of the preset passenger standing area;
obtaining an alighting area occupancy ratio according to the area pixels, the alighting foreground pixels and the preset passenger standing area;
judging whether the alighting area occupancy ratio is larger than the third preset threshold;
and if the alighting area occupancy ratio is larger than the third preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
if the boarding area occupancy ratio is larger than the third preset threshold, obtaining the total number of boarding blank rows according to the boarding optical flow map and the boarding foreground pixels;
judging whether the total number of boarding blank rows is smaller than a fourth preset threshold;
if the total number of boarding blank rows is smaller than the fourth preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
the step of, if the alighting area occupancy ratio is larger than the third preset threshold, obtaining a passenger alighting count and adding one to the passenger alighting count includes:
if the alighting area occupancy ratio is larger than the third preset threshold, obtaining the total number of alighting blank rows according to the alighting optical flow map and the alighting foreground pixels;
judging whether the total number of alighting blank rows is smaller than the fourth preset threshold;
and if the total number of alighting blank rows is smaller than the fourth preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
acquiring a video image frame;
and reducing the video image frame according to a preset size, taking the reduced image frame as the image frame to be detected.
Further, the processor 1001 may be configured to invoke a passenger flow statistics program stored in the memory 1005, and further perform the following operations:
reducing the video image frame according to a preset size;
and converting the reduced image frame into a single-channel grayscale image, taking the single-channel grayscale image as the image frame to be detected.
Based on the hardware structure, the invention provides various embodiments of the passenger flow statistical method.
The invention provides a passenger flow statistical method.
Referring to fig. 2, fig. 2 is a flowchart illustrating a first embodiment of a passenger flow statistics method according to the present invention.
In this embodiment, the passenger flow statistical method includes:
step S10, acquiring video image frames, and preprocessing the video image frames to obtain image frames to be detected;
in this embodiment, the passenger flow statistics method is implemented by a passenger flow statistics device, and the passenger flow statistics method may be a terminal device with a processing function, such as a microcomputer, a PC, a notebook computer, and a server. The application place of the passenger flow statistics is explained by taking public transport as an example.
In this embodiment, a video image frame is acquired and preprocessed to obtain an image frame to be detected. Specifically, a video of the bus door area is captured by a camera, video image frames are then extracted from the video using image processing techniques, and finally each video image frame is preprocessed to obtain an image frame to be detected. The camera is installed above the bus door and aimed at it so as to capture the door area; of course, the camera may also be installed in other positions.
The preprocessing method includes reducing the video image frames to a uniform size, converting the color video image frames to a single-channel grayscale image, and the like.
Specifically, step S10 includes:
step a11, acquiring video image frames;
step a12, reducing the video image frame according to a preset size, and taking the reduced image frame as an image frame to be detected.
In this embodiment, a video image frame is acquired, the video image frame is reduced according to a preset size, and the reduced image frame is used as the image frame to be detected. The preset size is set according to actual needs; it should be as small as possible while still allowing the video image frame to be recognized and processed accurately. In addition, all video image frames are reduced to the same preset size, so that the pixels of different frames are referenced against a consistent standard.
It will be appreciated that a smaller image frame to be detected reduces the amount of computation, so the method can be used on low-cost devices. Of course, the color image frame can also be converted into a single-channel grayscale image to further reduce the amount of calculation.
Specifically, step a12 includes:
step a121, reducing the video image frame according to a preset size;
step a122, converting the reduced image frame into a single-channel grayscale image, and taking the single-channel grayscale image as the image frame to be detected.
In this embodiment, a video image frame is reduced according to a preset size, the image frame obtained through reduction is converted into a single-channel grayscale image, and the single-channel grayscale image is used as an image frame to be detected.
It should be noted that a video image frame is usually a color image, for example a three-channel (RGB) or four-channel (RGBA) color image. Converting it into a single-channel grayscale image means each pixel carries only one value, so the grayscale image occupies less memory and contains less information, further reducing the amount of calculation.
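The two preprocessing steps (downscale, then convert three-channel color to single-channel gray) can be sketched in numpy. The target size and the luminance weights below are conventional choices, not values specified by the patent:

```python
import numpy as np

def to_detect_frame(frame, target=(160, 120)):
    """Downscale an HxWx3 RGB frame and convert it to a single-channel gray image."""
    w, h = target
    ys = np.linspace(0, frame.shape[0] - 1, h).astype(int)   # nearest-neighbour rows
    xs = np.linspace(0, frame.shape[1] - 1, w).astype(int)   # nearest-neighbour cols
    small = frame[ys][:, xs].astype(np.float32)
    # ITU-R BT.601 luminance weights: one value per pixel instead of three
    gray = 0.299 * small[..., 0] + 0.587 * small[..., 1] + 0.114 * small[..., 2]
    return gray.astype(np.uint8)

# usage: a synthetic 480x640 frame shrinks to the target size, one channel
frame = np.full((480, 640, 3), 200, dtype=np.uint8)
out = to_detect_frame(frame)
print(out.shape, out.dtype)  # (120, 160) uint8
```

The single-channel result is one third the size of the RGB input even before downscaling, which is where most of the memory saving comes from.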
Step S20, calculating the image frame to be detected according to a preset optical flow algorithm to obtain an optical flow prediction result image;
in this embodiment, an optical flow prediction result image is obtained by calculating the image frame to be detected according to a preset optical flow algorithm. Specifically, the Horn-Schunck algorithm is adopted to calculate the dense optical flow of the image frames to be detected, optical flow prediction points are then obtained from the dense optical flow, and finally the optical flow prediction result image is obtained from the optical flow prediction points.
It should be noted that the Horn-Schunck algorithm calculates an optical flow value for every pixel of the image frame to be detected, so the accuracy of the optical flow prediction result image calculated with it is high. Of course, other optical flow methods, such as the Lucas-Kanade algorithm or its image-pyramid variant, can also be used to obtain the optical flow prediction result image.
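A minimal NumPy sketch of the Horn-Schunck iteration follows (the smoothness weight alpha, the iteration count and the simple 4-neighbour average are illustrative simplifications of the full weighted-average formulation):

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=20):
    """Minimal dense Horn-Schunck optical flow between two grayscale frames.

    Returns per-pixel flow components (u, v). alpha and n_iter are
    illustrative choices, not values fixed by the embodiment."""
    im1 = im1.astype(np.float64)
    im2 = im2.astype(np.float64)
    Ix = np.gradient(im1, axis=1)   # spatial derivative in x
    Iy = np.gradient(im1, axis=0)   # spatial derivative in y
    It = im2 - im1                  # temporal derivative
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)

    def avg(f):                     # 4-neighbour local average
        p = np.pad(f, 1, mode='edge')
        return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0

    for _ in range(n_iter):
        ubar, vbar = avg(u), avg(v)
        num = Ix * ubar + Iy * vbar + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = ubar - Ix * num / den
        v = vbar - Iy * num / den
    return u, v
```

On a horizontal intensity ramp shifted right by one pixel, the iteration converges to a uniform flow of one pixel per frame, which matches the dense per-pixel character of the algorithm described above.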
In addition, it should be noted that, in continuous video images, the changed portion is regarded as the image motion portion, and a pixel point that undergoes a motion change and exhibits a motion trend is regarded as a moving pixel point. Specifically, the position to which a pixel point will move in the next video frame can be obtained from the optical flow vector of the optical flow graph, and that position is regarded as the prediction point. It can be understood that every pixel point in the image frame has a corresponding prediction point; the deviation between the position of the prediction point and the original pixel position is an offset, and the offset reflects the motion trend of the pixel point.
And step S30, obtaining a passenger flow statistical result according to the optical flow prediction result image.
After the optical flow prediction result image is obtained, a passenger flow statistical result is obtained according to the optical flow prediction result image. The optical flow prediction result image comprises an upper vehicle optical flow graph and a lower vehicle optical flow graph; the passenger flow statistics result includes information such as the number of passengers getting on the bus, the number of passengers getting off the bus, the current passenger capacity, and the like, and may also include character characteristic information such as age group, body type, height, and the like.
Specifically, the optical flow vectors of the moving pixel points are obtained from the moving pixel points in the optical flow prediction result image; the optical flow prediction result image is divided into a boarding optical flow graph and an alighting optical flow graph according to the optical flow vectors, a preset door-entry detection line and a preset door-exit detection line; boarding optical flow prediction points are obtained from the boarding optical flow graph, and alighting optical flow prediction points from the alighting optical flow graph; the boarding and alighting optical flow prediction points are set as foreground pixel points; and a line collision ratio is obtained from the foreground pixel points and compared against preset thresholds to obtain the passenger flow statistical result. The specific implementation process is described in the second embodiment below, and is not described herein again.
The embodiment of the invention provides a passenger flow statistical method, which comprises: obtaining video image frames and preprocessing them to obtain image frames to be detected; calculating the image frames to be detected according to a preset optical flow algorithm to obtain an optical flow prediction result image; and obtaining a passenger flow statistical result according to the optical flow prediction result image. The embodiment of the invention acquires the video image frames using only the existing monitoring camera of the public transport vehicle, without additional equipment cost. The video image frames are then preprocessed into image frames to be detected that carry less information, reducing the amount of calculation so that the method can run on low-cost equipment. Finally, the embodiment of the invention performs passenger flow statistics based on the optical flow method, which requires no machine learning (that is, no classifier trained on sample data), further reducing the amount of calculation; and since the optical flow method does not depend excessively on the environment, the accuracy of the passenger flow statistics can be improved. Therefore, the passenger flow statistical method provided by the embodiment of the invention can improve passenger flow statistical accuracy while reducing cost.
Further, based on the first embodiment described above, a second embodiment of the passenger flow statistical method of the present invention is proposed.
Referring to fig. 3, fig. 3 is a flow chart of a passenger flow statistics method according to a second embodiment of the present invention.
In this embodiment, the step S30 includes:
step S31, dividing the light stream prediction result image into an upper vehicle light stream graph and a lower vehicle light stream graph according to the light stream vector of the light stream prediction result image, a preset passenger moving area, a preset door entering detection line and a preset door exiting detection line;
in this embodiment, the optical flow prediction result image is divided into an upper vehicle optical flow graph and a lower vehicle optical flow graph according to the optical flow vector of the optical flow prediction result image, the preset passenger movement area, the preset door entering detection line and the preset door exiting detection line. Specifically, according to a moving pixel point in an optical flow prediction result image, an optical flow vector of the moving pixel point is obtained; and dividing the light stream prediction result image into an upper vehicle light stream graph and a lower vehicle light stream graph according to the light stream vector, the preset passenger moving area, the preset door entrance detection line and the preset door exit detection line.
The preset passenger moving area is the union of a preset passenger standing area, a preset boarding touch counting area and a preset alighting touch counting area. The preset passenger standing area lies between the preset door-exit detection line and the preset door-entry detection line, and spans horizontally from one quarter of the optical flow graph width to three quarters of it.
It should be noted that the preset entry detection lines and the preset exit detection lines are set according to actual needs, for example, the preset entry detection lines are all pixels in the 15 th row in the optical flow prediction result image, and the preset exit detection lines are all pixels in the 113 th row in the optical flow prediction result image, which is not limited herein. In addition, the preset door entering detection line and the preset door exiting detection line are set as foreground pixel points.
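Using the example rows above (15 for the entry line, 113 for the exit line) together with an assumed 160x120 flow-image size, the geometry might be laid out as follows (all concrete values are illustrative, not fixed by the embodiment):

```python
# Illustrative layout: the detection-line rows come from the example in the
# text; the 160x120 image size is an assumption of this sketch.
WIDTH, HEIGHT = 160, 120
ENTRY_ROW = 15    # preset door-entry detection line (all pixels of row 15)
EXIT_ROW = 113    # preset door-exit detection line (all pixels of row 113)

def in_standing_area(x, y):
    """Preset passenger standing area: strictly between the two detection
    lines, spanning from one quarter to three quarters of the image width."""
    return ENTRY_ROW < y < EXIT_ROW and WIDTH // 4 <= x < 3 * WIDTH // 4
```

A point on either detection line itself is not part of the standing area; the lines are instead seeded as foreground pixel points, as stated above.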
In addition, it should be noted that, in the continuous video images, the changed portion is regarded as the image motion portion, and the pixel point where the motion change occurs and the motion change trend exists is regarded as the motion pixel point. Therefore, whether the person gets on or off the vehicle is known from the moving pixel points, that is, from the movement trajectory of the person, and the optical flow prediction result image is divided into an on-vehicle optical flow map and an off-vehicle optical flow map.
Step S32, setting the optical flow prediction points of the current foreground pixel points of the boarding optical flow graph as boarding foreground pixel points, and setting the optical flow prediction points of the current foreground pixel points of the alighting optical flow graph as alighting foreground pixel points;
in this embodiment, the optical flow prediction points of the current foreground pixels of the boarding optical flow graph are set as boarding foreground pixels, and the optical flow prediction points of the current foreground pixels of the disembarking optical flow graph are set as disembarking foreground pixels.
It should be noted that each pixel point of the boarding optical flow graph is traversed, and each pixel point has a corresponding abscissa x and ordinate y. The value at each coordinate point of the optical flow graph represents the motion offset of that pixel point; a value of zero means the pixel point has no offset, that is, no motion change occurs. When the value at coordinate (x, y) of the boarding optical flow graph is not 0 and the predicted offset at coordinate (x, y) of the optical flow prediction result image is (fx, fy), the coordinate point (x + fx, y + fy) is the optical flow prediction point of the original coordinate (x, y). If the coordinate (x + fx, y + fy) falls within the preset passenger standing area, the pixel point at the prediction point (x + fx, y + fy) of the boarding optical flow graph is set as a foreground pixel point. The alighting optical flow graph is processed in the same way. The preset passenger standing area lies between the preset door-exit detection line and the preset door-entry detection line, and spans from one quarter of the optical flow graph width to three quarters of it.
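The prediction-point rule just described can be sketched directly (storing the per-pixel offsets in an array with layout `flow[y, x] = (fx, fy)` is an assumption of this sketch):

```python
import numpy as np

def flow_prediction_point(flow, x, y):
    """Return the optical flow prediction point of original coordinate (x, y),
    or None when the stored offset is zero (no motion change at that pixel)."""
    fx, fy = flow[y, x]
    if fx == 0 and fy == 0:
        return None
    return x + int(fx), y + int(fy)
```

A non-None result would then be tested against the preset passenger standing area before being marked as a foreground pixel point.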
Further, referring to fig. 4, fig. 4 is a detailed flowchart of step S32 in the passenger flow statistics method of the present invention, and step S32 includes:
step S321, generating a mobile light flow graph with the same size as the getting-on light flow graph or the getting-off light flow graph;
step S322, judging whether the optical flow prediction point of the current foreground pixel point of the on-board optical flow graph is in the preset passenger moving area;
step S323, if the light stream prediction points of the current foreground pixel points of the on-board light flow graph are in the preset passenger moving area, obtaining the on-board foreground pixel points according to the moving light flow graph, the light stream prediction points of the current foreground pixel points of the on-board light flow graph and the on-board light flow graph;
step S324, emptying the mobile light flow graph, and judging whether the light flow prediction point of the current foreground pixel point of the off-board light flow graph is in the preset passenger moving area;
step S325, if the light stream prediction points of the current foreground pixels of the departure light flow graph are in the preset passenger movement area, obtaining the departure foreground pixels according to the mobile light flow graph, the light stream prediction points of the current foreground pixels of the departure light flow graph, and the departure light flow graph.
In this embodiment, first, a moving optical flow graph with the same size as the boarding or alighting optical flow graph is generated. Then, it is judged whether the optical flow prediction point of each current foreground pixel point of the boarding optical flow graph is in the preset passenger moving area; if so, the boarding foreground pixel points are obtained according to the moving optical flow graph, the optical flow prediction points of the current foreground pixel points of the boarding optical flow graph and the boarding optical flow graph itself. Finally, the moving optical flow graph is cleared, and it is judged whether the optical flow prediction point of each current foreground pixel point of the alighting optical flow graph is in the preset passenger moving area; if so, the alighting foreground pixel points are obtained according to the moving optical flow graph, the optical flow prediction points of the current foreground pixel points of the alighting optical flow graph and the alighting optical flow graph itself.
Specifically, an empty image with the same size as the image being processed (the boarding or alighting optical flow graph) is generated and called the moving optical flow graph. For each foreground pixel point of the boarding optical flow graph, it is judged whether the corresponding optical flow prediction point lies in the moving area; if it does, the pixel at that prediction-point coordinate of the moving optical flow graph is set as a foreground pixel point. The foreground pixels of the boarding optical flow graph are then cleared, and the row of the boarding optical flow graph on the door-entry detection line is set as foreground pixel points. Finally, the foreground pixel points of the moving optical flow graph are merged into the boarding optical flow graph, completing the update of its foreground pixels.
Correspondingly, the alighting foreground pixels are updated as follows: the moving optical flow graph is cleared; for each foreground pixel point of the alighting optical flow graph, it is judged whether the corresponding optical flow prediction point lies in the moving area, and if so, the pixel at that prediction-point coordinate of the moving optical flow graph is set as a foreground pixel point. The foreground pixels of the alighting optical flow graph are then cleared, and the row on the door-exit detection line is set as foreground pixel points. Finally, the foreground pixel points of the moving optical flow graph are merged into the alighting optical flow graph, completing the update of its foreground pixels.
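The boarding-side update can be sketched as follows (the boolean-mask layout, the `in_moving_area` helper and the flow-array layout are assumptions of this sketch; the alighting update is symmetric, with the exit detection line substituted for the entry line):

```python
import numpy as np

def update_boarding_foreground(foreground, flow, in_moving_area, entry_row):
    """One foreground update of the boarding optical flow graph: project each
    foreground pixel through its flow vector into a scratch 'moving' map,
    re-seed the entry detection line, then merge the moving map back in."""
    moving = np.zeros_like(foreground)        # empty moving optical flow graph
    h, w = foreground.shape
    ys, xs = np.nonzero(foreground)
    for x, y in zip(xs, ys):
        px = x + int(flow[y, x, 0])           # prediction point (x+fx, y+fy)
        py = y + int(flow[y, x, 1])
        if 0 <= px < w and 0 <= py < h and in_moving_area(px, py):
            moving[py, px] = True             # keep predictions inside the area
    new_fg = np.zeros_like(foreground)
    new_fg[entry_row, :] = True               # detection line is always foreground
    return new_fg | moving                    # merge moving map into the graph
```

Clearing the old foreground before merging means only pixels that keep moving inside the passenger moving area survive from frame to frame, which is what distinguishes a passing passenger from static background.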
And step S33, obtaining a passenger flow statistical result according to the getting-on foreground pixel points and the getting-off foreground pixel points.
In this embodiment, the passenger flow statistical result is obtained according to the getting-on foreground pixel points and the getting-off foreground pixel points. Specifically, an upper vehicle line collision ratio is obtained according to the upper vehicle foreground pixel points, judgment is carried out according to the upper vehicle line collision ratio, meanwhile, a lower vehicle line collision ratio is obtained according to the lower vehicle foreground pixel points, judgment is carried out according to the lower vehicle line collision ratio, and passenger flow statistics results are obtained.
The upper vehicle collision line ratio is the ratio of the number of pixels passing through the preset door entry detection line in the upper vehicle foreground pixel points to the number of pixels located in the preset passenger standing area in the upper vehicle foreground pixel points; the getting-off collision line ratio is the ratio of the number of pixels passing through the preset exit detection line in the getting-off foreground pixel points to the number of pixels located in the preset passenger standing area in the getting-off foreground pixel points.
Specifically, step S33 includes:
step a331, obtaining a getting-on touch foreground number and a getting-on collision line ratio according to the getting-on foreground pixel points, the preset getting-on touch counting area and the preset passenger standing area, and obtaining a getting-off touch foreground number and a getting-off collision line ratio according to the getting-off foreground pixel points, the preset getting-off touch counting area and the preset passenger standing area;
in this embodiment, the boarding collision line ratio is obtained according to the boarding foreground pixel points, the preset boarding touch counting area and the preset passenger standing area, and the alighting collision line ratio is obtained according to the alighting foreground pixel points, the preset alighting touch counting area and the preset passenger standing area. Specifically, the number of boarding foreground pixel points in the preset boarding touch counting area is counted and taken as the boarding touch number; the number of boarding foreground pixel points in the preset passenger standing area is counted and taken as the boarding movement number; the boarding touch number is then divided by the boarding movement number, and the quotient is taken as the boarding collision line ratio. The alighting collision line ratio is obtained in the same way. In addition, the boarding touch foreground number is obtained according to the boarding foreground pixel points and the preset boarding touch counting area, and correspondingly the alighting touch foreground number according to the alighting foreground pixel points and the preset alighting touch counting area.
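The touch number and collision line ratio computation can be sketched as follows (representing the touch-count and standing areas as boolean masks is an assumption of this sketch):

```python
import numpy as np

def touch_count_and_line_ratio(foreground, touch_mask, stand_mask):
    """Return the touch foreground number and the collision line ratio:
    foreground pixels inside the touch counting area divided by foreground
    pixels inside the passenger standing area."""
    touch = int(np.count_nonzero(foreground & touch_mask))
    stand = int(np.count_nonzero(foreground & stand_mask))
    return touch, (touch / stand if stand else 0.0)
```

The same function serves both the boarding and the alighting side, fed with the respective foreground mask and touch counting area.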
Step a332, judging whether the number of the upper vehicle touch foreground is larger than a first preset threshold value;
in this embodiment, it is determined whether the number of the upper vehicle touch foreground is greater than a first preset threshold. The first preset threshold is set according to actual conditions, and is not specifically limited herein.
Step a333, if the number of upper vehicle touch foreground is greater than the first preset threshold, determining whether the upper vehicle collision line ratio is greater than a second preset threshold;
step a334, if the boarding collision ratio is greater than the second preset threshold, acquiring a passenger boarding count, and adding one to the passenger boarding count;
in this embodiment, when the number of the upper touch foreground is greater than the first preset threshold, it is determined whether the upper line-of-collision ratio is greater than the second preset threshold, and if the upper line-of-collision ratio is greater than the second preset threshold, the passenger getting-on count is obtained, and the passenger getting-on count is increased by one. The second preset threshold may be set according to actual conditions, for example, 0.1, 0.15, 0.5, and the like, and is not limited herein. The number of passengers getting on the vehicle is counted by a counter.
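A minimal sketch of this two-threshold decision follows (the default threshold values are illustrative assumptions; the same function serves the alighting decision with the alighting touch number and ratio):

```python
def update_count(count, touch_number, line_ratio,
                 first_threshold=50, second_threshold=0.15):
    """Increment the passenger count only when both the touch foreground
    number and the collision line ratio clear their preset thresholds.
    The default threshold values are illustrative assumptions."""
    if touch_number > first_threshold and line_ratio > second_threshold:
        return count + 1
    return count
```

Requiring both conditions keeps isolated flickers of foreground (a hand, a bag strap) from incrementing the counter on their own.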
Step a335, judging whether the get-off touch foreground number is larger than the first preset threshold value;
step a336, if the getting-off touch foreground number is greater than the first preset threshold, judging whether the getting-off collision line ratio is greater than the second preset threshold;
step a337, if the getting-off collision line ratio is greater than the second preset threshold, obtaining a passenger getting-off count, and adding one to the passenger getting-off count;
in this embodiment, when the number of the alighting touch foreground is greater than the first preset threshold, it is determined whether the alighting collision line ratio is greater than the second preset threshold, and if the alighting collision line ratio is greater than the second preset threshold, the number of alighting passengers is obtained, and the number of the alighting passengers is increased by one. The second preset threshold may be set according to actual conditions, for example, 0.1, 0.15, 0.5, and the like, and is not limited herein. The number of passengers getting off is counted by a counter.
Step a338, obtaining a passenger flow statistical result according to the updated passenger boarding count and the updated passenger alighting count.
Finally, the passenger flow statistical result is obtained according to the updated passenger boarding count and the updated passenger alighting count. The passenger flow statistical result includes information such as the number of passengers getting on the bus, the number of passengers getting off the bus and the current passenger capacity, and may also include personal characteristic information such as age group, body type and height.
In this embodiment, the optical flow prediction result image is divided into the boarding optical flow graph and the alighting optical flow graph, so that the present embodiment realizes the statistics of the number of boarding passengers and the statistics of the number of alighting passengers. Meanwhile, the light stream prediction points of the light stream graph are set as foreground pixel points and are distinguished from background pixel points, so that the calculation amount is reduced.
Further, based on the second embodiment described above, a third embodiment of the passenger flow statistical method of the present invention is proposed.
Referring to fig. 5, fig. 5 is a flow chart illustrating a passenger flow statistics method according to a third embodiment of the present invention.
In this embodiment, the step a334 includes:
step S3341, if the boarding collision line ratio is greater than the second preset threshold, obtaining the area pixel points of the preset passenger standing area;
in this embodiment, if the boarding collision line ratio is greater than the second preset threshold, the area pixel points of the preset passenger standing area are obtained. The area pixel points are the pixel points within the preset passenger standing area.
Step S3342, obtaining the boarding area ratio according to the area pixel points, the boarding foreground pixel points and the preset passenger standing area;
in this embodiment, the boarding area occupancy is obtained according to the area pixel points, the boarding foreground pixel points and the preset passenger standing area. Specifically, the number of regional pixel points is counted, the counted result is used as a denominator, then the number of boarding foreground pixel points in a preset passenger standing region is counted, the counted result is used as a numerator, finally, division operation is carried out on the numerator and the denominator, and the calculated result is used as a boarding region occupation ratio.
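The area occupancy described above can be sketched as follows (boolean masks are an assumed representation of the standing area and the foreground):

```python
import numpy as np

def area_occupancy(foreground, stand_mask):
    """Foreground pixels inside the preset passenger standing area (numerator)
    divided by all pixel points of that area (denominator)."""
    area = int(np.count_nonzero(stand_mask))
    inside = int(np.count_nonzero(foreground & stand_mask))
    return inside / area if area else 0.0
```

The same computation is reused on the alighting side with the alighting foreground pixel points.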
Step S3343, judging whether the getting-on area ratio is greater than a third preset threshold value;
step S3344, if the boarding area occupation ratio is greater than the third preset threshold, acquiring a passenger boarding count, and adding one to the passenger boarding count;
in this embodiment, it is determined whether the boarding area occupancy is greater than a third preset threshold, and if the boarding area occupancy is greater than the third preset threshold, the boarding count of the passenger is obtained, and the boarding count of the passenger is incremented by one. Wherein, the counting of getting on the bus of the passenger is counted by a counter.
It should be noted that, when a passenger gets on the vehicle, there is a possibility that some parts of the body (feet, hands, etc.) or articles carried by the passenger (bag, luggage, etc.) are not yet in the preset passenger standing area, but the passenger has got on the vehicle, and therefore, the second preset threshold is set to remove the above-mentioned interference. The third preset threshold may be set according to actual conditions, and is not specifically limited herein.
In this embodiment, the step a337 includes:
step a3371, if the lower vehicle collision line ratio is greater than the second preset threshold, obtaining area pixel points of the preset passenger standing area;
in this embodiment, if the getting-off collision line ratio is greater than the second preset threshold, the area pixel points of the preset passenger standing area are obtained. The area pixel points are the pixel points within the preset passenger standing area.
Step a3372, obtaining the ratio of the getting-off area according to the area pixel points, the getting-off foreground pixel points and the preset passenger standing area;
in this embodiment, the getting-off area occupation ratio is obtained according to the area pixel points, the getting-off foreground pixel points and the preset passenger standing area. Specifically, the number of pixels in the area is counted, the counted result is used as a denominator, then the number of getting-off foreground pixels in the preset passenger standing area is counted, the counted result is used as a numerator, finally, division operation is carried out on the numerator and the denominator, and the calculated result is used as the getting-off area ratio.
Step a3373, judging whether the get-off area ratio is greater than a third preset threshold;
step a3374, if the getting-off area occupancy is greater than the third preset threshold, obtaining a passenger getting-off count, and adding one to the passenger getting-off count.
In this embodiment, it is determined whether the getting-off area occupancy is greater than a third preset threshold, and if the getting-off area occupancy is greater than the third preset threshold, the get-off count of the passenger is obtained, and the get-off count of the passenger is increased by one. Wherein, the counting of getting off of the passengers is counted by a counter.
It should be noted that, when a passenger gets off the vehicle, some parts of the body (feet, hands, etc.) or articles carried by the passenger (bag, luggage, etc.) may not yet be in the preset passenger standing area even though the passenger has already got off, and the second preset threshold is therefore set to remove this interference. The third preset threshold may be set according to actual conditions, and is not specifically limited herein.
In this embodiment, the getting-on area ratio is calculated and it is determined whether it is greater than the third preset threshold, which reduces interference from parts of the body (feet, hands, etc.) or articles carried by passengers (bags, luggage, etc.); the getting-off area ratio is processed in the same way. Therefore, this embodiment can further improve the accuracy of the passenger flow statistics.
Further, based on the third embodiment described above, a fourth embodiment of the passenger flow statistical method of the present invention is proposed.
Referring to fig. 6, fig. 6 is a flow chart of a passenger flow statistics method according to a fourth embodiment of the present invention.
In this embodiment, the step S3344 includes:
step S33441, if the boarding area occupancy is greater than the third preset threshold, obtaining the total number of boarding blank rows according to the boarding optical flow graph and the boarding foreground pixel points;
in this embodiment, if the occupancy of the boarding area is greater than the third preset threshold, the total number of boarding blank rows is obtained according to the boarding light flow map and the boarding foreground pixel points. Specifically, the upper vehicle optical flow graph is traversed according to rows, the number of foreground pixel points in each row is calculated, then the row number of which the number of the foreground pixel points is smaller than a preset blank threshold value is used as a blank row, and finally the total number of the upper vehicle blank rows is counted.
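The blank-row count can be sketched as follows (the default blank threshold value is an illustrative assumption):

```python
import numpy as np

def total_blank_rows(foreground, blank_threshold=2):
    """Count the rows of an optical flow graph whose number of foreground
    pixels falls below the preset blank threshold (illustrative default)."""
    per_row = foreground.sum(axis=1)
    return int(np.count_nonzero(per_row < blank_threshold))
```

A low blank-row total indicates the foreground spans most rows of the graph, i.e. a body-sized object rather than an isolated limb or bag.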
Step S33442, judging whether the total number of boarding blank rows is smaller than a fourth preset threshold value;
step S33443, if the total number of the blank getting-on rows is smaller than the fourth preset threshold, obtaining a passenger getting-on count, and adding one to the passenger getting-on count;
in this embodiment, it is determined whether the total number of the blank lines in the boarding process is smaller than a fourth preset threshold, and if the total number of the blank lines in the boarding process is smaller than the fourth preset threshold, the boarding count of the passenger is obtained, and the boarding count of the passenger is increased by one. Wherein, the counting of getting on the bus of the passenger is counted by a counter.
It should be noted that, since some parts of the passenger's body (feet, hands, etc.) or articles carried by the passenger (bag, luggage, etc.) are within the video image frame, but the passenger does not get on the vehicle, the fourth preset threshold is set to remove the above-mentioned interference. The fourth preset threshold may be set according to actual conditions, and is not specifically limited herein.
In this embodiment, the step a3374 includes:
step a33741, if the get-off area occupancy is greater than the third preset threshold, obtaining the total number of blank rows of the get-off according to the get-off light flow graph and the get-off foreground pixel points;
in this embodiment, if the getting-off area occupancy is greater than the third preset threshold, the total number of blank lines of the getting-off vehicle is obtained according to the getting-off light flow graph and the getting-off foreground pixel points. Specifically, the lower vehicle optical flow diagram is traversed according to the rows, the number of foreground pixel points in each row is calculated, then the row number of which the number of the foreground pixel points is smaller than a preset blank threshold value is used as a blank row, and finally the total number of the lower vehicle blank rows is counted.
Step a33742, determining whether the total number of the blank rows of the lower vehicle is less than a fourth preset threshold value;
step a33743, if the total number of blank getting-off rows is smaller than the fourth preset threshold, obtaining a passenger getting-off count, and adding one to the passenger getting-off count.
In this embodiment, it is determined whether the total number of getting-off blank rows is smaller than a fourth preset threshold, and if the total number of getting-off blank rows is smaller than the fourth preset threshold, the get-off count of the passenger is obtained, and the get-off count of the passenger is incremented by one. Wherein, the counting of getting off of the passengers is counted by a counter.
It should be noted that, since some parts of the passenger's body (feet, hands, etc.) or articles carried by the passenger (bag, luggage, etc.) are in the video image frame, but the passenger does not get off the vehicle, the fourth preset threshold is set to remove the above-mentioned interference. The fourth preset threshold may be set according to actual conditions, and is not specifically limited herein.
In this embodiment, the total number of boarding blank rows is calculated and it is determined whether it is smaller than the fourth preset threshold, which reduces interference from parts of the body (feet, hands, etc.) or articles carried by passengers (bags, luggage, etc.); the alighting blank rows are processed in the same way. Therefore, this embodiment can further improve the accuracy of the passenger flow statistics.
The invention also provides a passenger flow statistical system, which comprises: memory, a processor and a traffic statistic program stored on the memory and executable on the processor, the traffic statistic program when executed by the processor implementing the steps of the traffic statistic method according to any of the above embodiments.
The specific embodiment of the passenger flow statistics system of the present invention is basically the same as the embodiments of the passenger flow statistics method, and is not described herein again.
The invention further provides a computer readable storage medium having stored thereon a passenger flow statistics program, which when executed by a processor implements the steps of the passenger flow statistics method according to any of the embodiments above.
The specific embodiment of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the passenger flow statistics method described above, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a(n) …" does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) that includes instructions for enabling a terminal device (e.g., a mobile phone, computer, server, or network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
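The preprocessing step of the method (scale the video image frame down to a preset size, then convert it to a single-channel grayscale image) can be sketched as follows. This is an illustrative reconstruction: frames are modeled as nested lists of (R, G, B) tuples, and the nearest-neighbor scaling and BT.601 luma weights are assumptions, not taken from the patent.

```python
# Sketch of the preprocessing pipeline: downscale, then grayscale.
def scale_down(frame, preset_w, preset_h):
    """Nearest-neighbor reduction of an RGB frame to a preset size."""
    src_h, src_w = len(frame), len(frame[0])
    return [[frame[y * src_h // preset_h][x * src_w // preset_w]
             for x in range(preset_w)]
            for y in range(preset_h)]

def to_grayscale(frame):
    """Convert RGB pixels to a single channel (ITU-R BT.601 luma weights)."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b + 0.5)
             for (r, g, b) in row]
            for row in frame]

def preprocess(frame, preset_w, preset_h):
    """Image frame to be detected: reduced and converted to grayscale."""
    return to_grayscale(scale_down(frame, preset_w, preset_h))

# Example: a 4x4 all-white frame reduced to a 2x2 grayscale frame.
white = [[(255, 255, 255)] * 4 for _ in range(4)]
small = preprocess(white, 2, 2)
```

Working on a reduced single-channel frame keeps the per-frame cost of the optical flow computation low, which is why the method claims this step before the optical flow algorithm runs.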

Claims (10)

1. A passenger flow statistical method, characterized by comprising the steps of:
acquiring a video image frame, and preprocessing the video image frame to obtain an image frame to be detected;
calculating the image frame to be detected according to a preset optical flow algorithm to obtain an optical flow prediction result image;
and obtaining a passenger flow statistical result according to the optical flow prediction result image.
2. The passenger flow statistical method of claim 1, wherein the step of obtaining a passenger flow statistical result according to the optical flow prediction result image comprises:
dividing the optical flow prediction result image into a boarding optical flow map and an alighting optical flow map according to the optical flow vectors of the optical flow prediction result image, a preset passenger moving area, a preset door-entry detection line, and a preset door-exit detection line;
setting the optical flow prediction points of the current foreground pixel points of the boarding optical flow map as boarding foreground pixel points, and setting the optical flow prediction points of the current foreground pixel points of the alighting optical flow map as alighting foreground pixel points;
and obtaining the passenger flow statistical result according to the boarding foreground pixel points and the alighting foreground pixel points.
3. The passenger flow statistical method of claim 2, wherein the step of setting the optical flow prediction points of the current foreground pixel points of the boarding optical flow map as boarding foreground pixel points and setting the optical flow prediction points of the current foreground pixel points of the alighting optical flow map as alighting foreground pixel points comprises:
generating a motion optical flow map of the same size as the boarding optical flow map or the alighting optical flow map;
judging whether the optical flow prediction points of the current foreground pixel points of the boarding optical flow map are in the preset passenger moving area;
if the optical flow prediction points of the current foreground pixel points of the boarding optical flow map are in the preset passenger moving area, obtaining the boarding foreground pixel points according to the motion optical flow map, the optical flow prediction points of the current foreground pixel points of the boarding optical flow map, and the boarding optical flow map;
clearing the motion optical flow map, and judging whether the optical flow prediction points of the current foreground pixel points of the alighting optical flow map are in the preset passenger moving area;
and if the optical flow prediction points of the current foreground pixel points of the alighting optical flow map are in the preset passenger moving area, obtaining the alighting foreground pixel points according to the motion optical flow map, the optical flow prediction points of the current foreground pixel points of the alighting optical flow map, and the alighting optical flow map.
4. The passenger flow statistical method of claim 2, wherein the step of obtaining the passenger flow statistical result according to the boarding foreground pixel points and the alighting foreground pixel points comprises:
obtaining a boarding touch-foreground count and a boarding line-crossing ratio according to the boarding foreground pixel points, a preset boarding touch counting area, and a preset passenger standing area, and obtaining an alighting touch-foreground count and an alighting line-crossing ratio according to the alighting foreground pixel points, a preset alighting touch counting area, and the preset passenger standing area;
judging whether the boarding touch-foreground count is greater than a first preset threshold;
if the boarding touch-foreground count is greater than the first preset threshold, judging whether the boarding line-crossing ratio is greater than a second preset threshold;
if the boarding line-crossing ratio is greater than the second preset threshold, obtaining a passenger boarding count, and adding one to the passenger boarding count;
judging whether the alighting touch-foreground count is greater than the first preset threshold;
if the alighting touch-foreground count is greater than the first preset threshold, judging whether the alighting line-crossing ratio is greater than the second preset threshold;
if the alighting line-crossing ratio is greater than the second preset threshold, obtaining a passenger alighting count, and adding one to the passenger alighting count;
and obtaining the passenger flow statistical result according to the incremented passenger boarding count and the incremented passenger alighting count.
5. The passenger flow statistical method of claim 4, wherein the step of obtaining a passenger boarding count and adding one to the passenger boarding count if the boarding line-crossing ratio is greater than the second preset threshold comprises:
if the boarding line-crossing ratio is greater than the second preset threshold, obtaining the area pixel points of the preset passenger standing area;
obtaining a boarding area occupancy ratio according to the area pixel points, the boarding foreground pixel points, and the preset passenger standing area;
judging whether the boarding area occupancy ratio is greater than a third preset threshold;
if the boarding area occupancy ratio is greater than the third preset threshold, obtaining the passenger boarding count, and adding one to the passenger boarding count;
and wherein the step of obtaining a passenger alighting count and adding one to the passenger alighting count if the alighting line-crossing ratio is greater than the second preset threshold comprises:
if the alighting line-crossing ratio is greater than the second preset threshold, obtaining the area pixel points of the preset passenger standing area;
obtaining an alighting area occupancy ratio according to the area pixel points, the alighting foreground pixel points, and the preset passenger standing area;
judging whether the alighting area occupancy ratio is greater than the third preset threshold;
and if the alighting area occupancy ratio is greater than the third preset threshold, obtaining the passenger alighting count, and adding one to the passenger alighting count.
6. The passenger flow statistical method of claim 5, wherein the step of obtaining the passenger boarding count and adding one to the passenger boarding count if the boarding area occupancy ratio is greater than the third preset threshold comprises:
if the boarding area occupancy ratio is greater than the third preset threshold, obtaining a total number of boarding blank rows according to the boarding optical flow map and the boarding foreground pixel points;
judging whether the total number of boarding blank rows is smaller than a fourth preset threshold;
if the total number of boarding blank rows is smaller than the fourth preset threshold, obtaining the passenger boarding count, and adding one to the passenger boarding count;
and wherein the step of obtaining the passenger alighting count and adding one to the passenger alighting count if the alighting area occupancy ratio is greater than the third preset threshold comprises:
if the alighting area occupancy ratio is greater than the third preset threshold, obtaining a total number of alighting blank rows according to the alighting optical flow map and the alighting foreground pixel points;
judging whether the total number of alighting blank rows is smaller than the fourth preset threshold;
and if the total number of alighting blank rows is smaller than the fourth preset threshold, obtaining the passenger alighting count, and adding one to the passenger alighting count.
7. The passenger flow statistical method according to any one of claims 1 to 6, wherein the step of acquiring a video image frame and preprocessing the video image frame to obtain an image frame to be detected comprises:
acquiring a video image frame;
and scaling down the video image frame to a preset size, and taking the scaled-down image frame as the image frame to be detected.
8. The passenger flow statistical method of claim 7, wherein the step of scaling down the video image frame to a preset size and taking the scaled-down image frame as the image frame to be detected comprises:
scaling down the video image frame to the preset size;
and converting the scaled-down image frame into a single-channel grayscale image, and taking the single-channel grayscale image as the image frame to be detected.
9. A passenger flow statistics system, characterized in that the passenger flow statistics system comprises: memory, a processor and a passenger flow statistics program stored on said memory and executable on said processor, said passenger flow statistics program when executed by said processor implementing the steps of the passenger flow statistics method according to any of claims 1 to 8.
10. A computer-readable storage medium, characterized in that a passenger flow statistics program is stored on the computer-readable storage medium, and the passenger flow statistics program, when executed by a processor, implements the steps of the passenger flow statistical method according to any one of claims 1 to 8.
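The threshold cascade of claims 4 to 6 can be sketched as a single decision function applied once per boarding (or alighting) event. This is an illustrative sketch only; the parameter names (t1–t4) and the sample values are assumptions, not taken from the patent.

```python
# Illustrative sketch of the claims 4-6 decision cascade; a passenger
# counter is incremented only when all four checks succeed.
def passes_count_checks(touch_foreground, line_crossing_ratio,
                        area_occupancy, blank_rows,
                        t1, t2, t3, t4):
    """Return True when the boarding/alighting count should be incremented."""
    if touch_foreground <= t1:        # claim 4: touch-foreground count
        return False
    if line_crossing_ratio <= t2:     # claim 4: line-crossing ratio
        return False
    if area_occupancy <= t3:          # claim 5: standing-area occupancy ratio
        return False
    if blank_rows >= t4:              # claim 6: blank-row interference filter
        return False
    return True

# A full-body crossing passes every check; a hand reaching into frame
# (few foreground pixels, many blank rows) fails early and is not counted.
full_body = passes_count_checks(120, 0.6, 0.45, 2, t1=50, t2=0.5, t3=0.3, t4=5)
hand_only = passes_count_checks(20, 0.1, 0.05, 18, t1=50, t2=0.5, t3=0.3, t4=5)
```

Ordering the checks from cheapest to most involved lets most interference events be rejected before the blank-row total ever needs to be computed.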
CN202010970214.7A 2020-09-15 2020-09-15 Passenger flow statistical method, system and computer readable storage medium Pending CN112102290A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010970214.7A CN112102290A (en) 2020-09-15 2020-09-15 Passenger flow statistical method, system and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN112102290A true CN112102290A (en) 2020-12-18

Family

ID=73759144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010970214.7A Pending CN112102290A (en) 2020-09-15 2020-09-15 Passenger flow statistical method, system and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112102290A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114724123A (en) * 2022-03-30 2022-07-08 东南大学 Bus passenger flow statistical method based on vehicle-mounted monitoring video
CN114724123B (en) * 2022-03-30 2024-04-23 东南大学 Bus passenger flow statistics method based on vehicle-mounted monitoring video

Similar Documents

Publication Publication Date Title
Chakraborty et al. Traffic congestion detection from camera images using deep convolution neural networks
US9317752B2 (en) Method for detecting large size and passenger vehicles from fixed cameras
US11475770B2 (en) Electronic device, warning message providing method therefor, and non-transitory computer-readable recording medium
US20190042857A1 (en) Information processing system and information processing method
US10997422B2 (en) Information processing apparatus, information processing method, and program
US20060039584A1 (en) Motion detection method and device, program and vehicle surveillance system
US20240054791A1 (en) Information processing apparatus, information processing method, and program
CN109767298B (en) Method and system for passenger driver safety matching
CN110245554B (en) Pedestrian movement trend early warning method, system platform and storage medium
JP4883415B2 (en) Monitoring device and program
CN112102290A (en) Passenger flow statistical method, system and computer readable storage medium
CN112433528A (en) Method and device for robot to take advantage of elevator, robot and storage medium
CN111079621A (en) Method and device for detecting object, electronic equipment and storage medium
CN111310650A (en) Vehicle riding object classification method and device, computer equipment and storage medium
CN114424241A (en) Image processing apparatus and image processing method
JP6620476B2 (en) Vehicle type identification device, vehicle type identification method, and vehicle type identification program
CN111278708B (en) Method and device for assisting driving
WO2023071397A1 (en) Detection method and system for dangerous driving behavior
CN116486332A (en) Passenger flow monitoring method, device, equipment and storage medium
CN113139488B (en) Method and device for training segmented neural network
CN114475502A (en) Method and device for protecting getting-off safety of passengers
CN115019242A (en) Abnormal event detection method and device for traffic scene and processing equipment
CN115045585A (en) Control device, system, vehicle, and control method
CN112405532B (en) Movement control method and device and robot
JP7491241B2 (en) CONTROL DEVICE, SYSTEM, VEHICLE, AND CONTROL METHOD

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210507

Address after: 518000 4 / F, gaoxinxin factory building, district 67, Xingdong community, Xin'an street, Bao'an District, Shenzhen City, Guangdong Province

Applicant after: SHENZHEN JIMI IOT Co.,Ltd.

Address before: 511400 room 508, building 3, 28 Qinglan street, Xiaoguwei street, Panyu District, Guangzhou City, Guangdong Province

Applicant before: Guangzhou Jimi Wulian Technology Co.,Ltd.
