CN114511821B - Statistical method and system for number of persons getting on and off, computer equipment and storage medium - Google Patents
Statistical method and system for number of persons getting on and off, computer equipment and storage medium
- Publication number
- CN114511821B CN202210401166.9A CN202210401166A
- Authority
- CN
- China
- Prior art keywords
- current
- image
- passenger
- rgb
- infrared
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Computing Systems (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
Abstract
The embodiment of the invention discloses a statistical method, a statistical system, computer equipment and a storage medium for the number of passengers getting on and off. The statistical method for the number of people getting on or off the bus comprises the following steps: acquiring a current RGB image and a current infrared image; judging whether the intensity of shooting light for shooting the current RGB image is greater than a preset intensity threshold value or not; if so, comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number; if not, acquiring an adjacent infrared image of at least one adjacent frame of the current infrared image, and acquiring the position of the current passenger and the number of the current passengers according to the current infrared image and the adjacent infrared image; and acquiring the movement trend and the passenger number change value of each passenger, and acquiring the number of passengers getting on or off the bus according to the movement trend and the passenger number change value of each passenger. The invention can effectively improve the accuracy and reliability of counting the number of people getting on or off the bus.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a statistical method, a statistical system, computer equipment and a storage medium for the number of passengers getting on and off.
Background
In urban management, the number of passengers getting on and off in different time periods and the busyness of each subway station need to be counted for macroscopic regulation and planning management. At present, a picture can be taken by a camera device mounted at the roof above the vehicle door, and the number of people is counted by identifying the picture through visual AI.
Because of the limitations of the installation environment of a bus or subway, the picture can only be captured by shooting downwards from above the passengers' heads, so the picture contains images of the tops of the passengers' heads. The top of a head appears as little more than a circle, so the feature information is limited, which causes judgment errors when the algorithm determines the passenger positions and the number of passengers. For example, false alarms are easily generated when a passenger holds a ball or carries an object such as a backpack. Under poor illumination at night or in rainy weather, the picture quality is poor, which also results in inaccurate counting.
Disclosure of Invention
Based on the above, it is necessary to provide a statistical method, a system, a computer device and a medium for the number of passengers getting on and off the bus.
A statistical method for the number of passengers getting on and off is applied to a shooting system, wherein the shooting system comprises an RGB camera device and an infrared camera device;
the statistical method for the number of people getting on or off the bus comprises the following steps:
when the vehicle is located at a stop position, shooting according to a preset shooting time node when a passenger passes through a vehicle door is detected, and acquiring a current RGB image and a current infrared image;
judging whether the intensity of shooting light for shooting the current RGB image is greater than a preset intensity threshold value or not;
if the shooting light intensity is larger than the preset intensity threshold value, comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number;
if the shooting light intensity is smaller than or equal to the preset intensity threshold, acquiring an adjacent infrared image of at least one frame adjacent to the current infrared image, and acquiring the position of the current passenger and the number of the current passengers according to the current infrared image and the adjacent infrared image;
and acquiring the movement trend and the passenger quantity change value of each passenger according to the current RGB images and the current infrared images acquired when the vehicle is at a stop position, and acquiring the number of passengers getting on or off the bus according to the movement trend and the passenger quantity change value of each passenger.
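For illustration only, the decision flow described above might be sketched as the following Python skeleton; the detector callables, the tracking helper and the example intensity threshold are hypothetical placeholders rather than part of the disclosed method.

```python
# Minimal sketch of the decision flow, assuming the detector and tracking
# callables are supplied elsewhere; all names and thresholds are illustrative.
def count_at_stop(frames, detect_by_fusion, detect_by_frame_difference,
                  track_passengers, intensity_threshold=50.0):
    """frames: iterable of (rgb_image, ir_image, light_intensity) tuples
    captured while the vehicle waits at one stop position."""
    per_frame = []
    for rgb, ir, intensity in frames:
        if intensity > intensity_threshold:
            # sufficient light: compare the current RGB image with the IR image
            positions, count = detect_by_fusion(rgb, ir)
        else:
            # insufficient light: rely on the IR image and its adjacent frames
            positions, count = detect_by_frame_difference(ir)
        per_frame.append((positions, count))
    # derive each passenger's movement trend and the passenger-count change,
    # then return the numbers of passengers getting on and off
    return track_passengers(per_frame)
```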
A statistical system for the number of passengers getting on and off comprises the following modules:
the acquisition module is used for shooting according to a preset shooting time node when a passenger is detected passing through the vehicle door while the vehicle is located at a stop position, and acquiring a current RGB image and a current infrared image;
the judging module is used for judging whether the shooting light intensity for shooting the current RGB image is greater than a preset intensity threshold value or not;
the comparison module is used for comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number if the shooting light intensity is larger than the preset intensity threshold value;
the detection module is used for acquiring an adjacent infrared image at least one frame adjacent to the current infrared image if the shooting light intensity is smaller than or equal to the preset intensity threshold value, and acquiring the current passenger position and the current passenger number according to the current infrared image and the adjacent infrared image;
and the number module is used for acquiring the movement trend and the passenger number change value of each passenger according to the current RGB images and the current infrared images acquired when the vehicle is at the stop position, and acquiring the number of people getting on or off the bus according to the movement trend and the passenger number change value of each passenger.
A storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the method as described above.
A computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the method as described above.
The implementation of the invention has the following beneficial effects:
when the shooting light intensity for shooting the current RGB image is greater than the preset intensity threshold value, the quality of the current RGB image is high; the current RGB image and the current infrared image are compared, and the current passenger position and the current passenger number can be accurately obtained, so that an accurate number of passengers getting on and off is obtained. When the shooting light intensity is less than or equal to the preset intensity threshold value, the current passenger position and the current passenger number are obtained from the current infrared image and its adjacent infrared image, so that the statistics remain accurate and reliable even under poor lighting.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Wherein:
FIG. 1 is a flow chart illustrating an embodiment of a statistical method for a number of persons getting on and off according to the present invention;
FIG. 2 is a schematic structural diagram of an embodiment of a statistical system for the number of passengers getting on and off according to the present invention;
fig. 3 is a schematic structural diagram of an embodiment of a control terminal provided in the present invention;
fig. 4 is a schematic structural diagram of an embodiment of a storage medium provided in the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of an embodiment of a statistical method for getting on/off people according to the present invention. The statistical method for the number of passengers getting on and off provided by the invention comprises the following steps:
s101: when the vehicle is located at a stop position, shooting is carried out according to a preset shooting time node when the vehicle door is detected to have passengers passing through, and a current RGB image and a current infrared image are obtained.
In a specific implementation scenario, a shooting system is arranged above a door frame of a public transport means such as a bus and a subway to acquire the condition of getting on or off a passenger. When the passenger is shot from top to bottom, only the image of the top of the head of the passenger, namely an approximately circular image of the top of the head, can be obtained, and the number and the position of the current passenger can be obtained according to the number and the position of the images of the top of the head in the collected images. In this implementation scenario, the camera system includes an RGB camera device and an infrared camera device. The RGB camera device can collect colorful RGB images, and the infrared camera device can collect infrared images. When the vehicle is at a stop position and passengers pass through the vehicle door, the shooting system is driven to work, the current RGB image is obtained through the RGB camera device, and the current infrared image is obtained through the infrared camera device.
In other implementation scenes, the shooting system can also be driven to work when the vehicle arrives at a station or when the vehicle door is opened. This effectively reduces the working time of the shooting system and saves resources.
S102: and judging whether the intensity of the shooting light for shooting the current RGB image is greater than a preset intensity threshold value, if so, executing step S103. If not, go to step S104.
In a specific implementation scene, the current shooting light intensity can be obtained when the RGB camera device shoots, and the RGB camera device can intelligently adjust its shooting parameters according to the shooting light intensity so that the quality of the shot current RGB image is high. However, when the shooting light intensity is too low, it falls below the adjustable range of the RGB camera device and the quality of the obtained current RGB image is low; the picture may be unclear, dark or of low resolution, and a current RGB image of low quality will affect the accuracy and reliability of the obtained passenger positions and passenger number.
In the implementation scene, a preset intensity threshold is set, and the preset intensity threshold is set according to the shooting performance parameters of the RGB camera device, the requirement on the accuracy of the number of people, and the lighting condition of the driving route of the public transport means.
S103: and comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number.
In a specific implementation scenario, when the shooting light intensity when shooting the current RGB image is greater than the preset intensity threshold, the quality of the shot current RGB image is high, and the use requirement can be met. And comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number.
In one implementation, the current RGB image is image-processed to identify the RGB passenger image areas in the current RGB image, i.e. the overhead images of passengers. For example, the RGB passenger image areas may be identified based on characteristics of the passenger overhead image (e.g., a circular shape, a preset size, etc.), or they may be identified through a trained neural network. The current infrared image is likewise image-processed to identify the infrared passenger image areas. In the current infrared image, a passenger's overhead image appears as an approximately circular hot region, and screening according to the preset size yields the infrared passenger image areas; alternatively, the infrared passenger image areas in the current infrared image can be identified through a trained neural network.
The RGB image position of each passenger is obtained from the RGB passenger image areas, and the infrared image position of each passenger is obtained from the infrared passenger image areas. If an RGB image position corresponds to an infrared image position, it is judged that the RGB image position corresponds to the top of a passenger's head; if an RGB image position has no corresponding infrared image position, it is judged that there is no passenger at that position. The current passenger positions and the current passenger number are then obtained from the RGB image positions that have corresponding infrared image positions and from their count.
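As a rough illustration of this matching step, each RGB-detected head position could be paired with the nearest infrared-detected position and kept only if the pair falls within a pixel tolerance; the function name and the tolerance value below are assumptions made for illustration, not part of the patent.

```python
import numpy as np

def match_positions(rgb_positions, ir_positions, max_dist=15.0):
    """Keep RGB head positions that have a corresponding IR head position.
    rgb_positions, ir_positions: sequences of (x, y) pixel centres.
    max_dist: assumed pixel tolerance between the two modalities."""
    confirmed = []
    used = set()
    for rx, ry in rgb_positions:
        best_j, best_d = None, max_dist
        for j, (ix, iy) in enumerate(ir_positions):
            d = np.hypot(rx - ix, ry - iy)
            if j not in used and d <= best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            confirmed.append((rx, ry))
    return confirmed, len(confirmed)  # current passenger positions and count
```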
In an implementation scene, because the installation positions of the RGB camera device and the infrared camera device cannot coincide, there is a viewing-angle difference between the current RGB image and the current infrared image they capture. The installation positions (including the horizontal position and the vertical height) of the RGB camera device and the infrared camera device are therefore obtained, image position correction is performed on the current RGB image position and the current infrared image position according to the installation positions, and whether there is a passenger at the current RGB image position is judged based on the corrected current RGB image position and current infrared image position, which can further improve the accuracy and reliability of the judgment.
In another implementation scene, binocular ranging is performed based on the installation positions of the RGB camera device and the infrared camera device. Because the height of the roof is known and fixed, binocular ranging on the current passenger position obtained in the preceding steps allows the passenger's height to be calculated; if the height falls within a preset height range, it can be confirmed that the current RGB image position corresponds to a passenger, and the current passenger position is retained. Furthermore, the head diameter can be calculated from the diameter of the circle at the current RGB image position and the installation positions of the two camera devices; the ratio of head diameter to height is obtained, and whether this ratio satisfies a preset human head-to-body ratio is judged. If it does, it can be confirmed that the current passenger position truly corresponds to a passenger. For example, a passenger may carry a warm round object in the hand; its computed height or head-to-body ratio will then differ from the values of a real person, so misjudgment can be effectively avoided by this method.
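A minimal sketch of this plausibility check, assuming the standard stereo relation depth = focal length × baseline / disparity; the function name, the height range and the head-to-body ratio range are placeholder values chosen for illustration.

```python
def passenger_height_ok(disparity_px, focal_px, baseline_m, ceiling_height_m,
                        head_diameter_px, height_range=(1.0, 2.1),
                        head_ratio_range=(0.10, 0.18)):
    """Rough plausibility check; all threshold values are assumed examples."""
    if disparity_px <= 0:
        return False
    depth_m = focal_px * baseline_m / disparity_px    # camera-to-head distance
    height_m = ceiling_height_m - depth_m             # passenger height above floor
    head_diameter_m = head_diameter_px * depth_m / focal_px  # pinhole back-projection
    ratio = head_diameter_m / height_m if height_m > 0 else 0.0
    return (height_range[0] <= height_m <= height_range[1]
            and head_ratio_range[0] <= ratio <= head_ratio_range[1])
```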
In one implementation scenario, image position correction is performed on the current RGB image and the current infrared image according to the installation positions of the RGB camera device and the infrared camera device. The RGB passenger image area in the position-corrected current RGB image is obtained together with its position parameters, and the RGB passenger image area is mapped onto the position-corrected current infrared image according to the position parameters, giving a current mapping area corresponding to the image position of the RGB passenger image area in the current infrared image. If the current mapping area includes a circular hot spot, it indicates that the current mapping area corresponds to one passenger. The current passenger positions and the current passenger number are obtained from the positions and the number of the current mapping areas that include a circular hot spot.
In another implementation scenario, a training image set is prepared in advance, the training image set includes a plurality of training image groups and passenger positions and/or passenger numbers corresponding to each training image group, each training image group includes training RGB images and training infrared images corresponding to the training RGB images, and the training image set is used for training a neural network to obtain a pre-trained neural network. And inputting the current RGB image and the current infrared image into a pre-trained neural network to obtain the current passenger position and the current passenger number.
In another implementation scenario, the training image set includes a plurality of training image groups, each training image group includes a training RGB image and a training infrared image corresponding to the training RGB image, a passenger image position is marked on each training RGB image and the training infrared image corresponding to the training RGB image, and the neural network is trained through the training image set to obtain a pre-trained neural network. And inputting the current RGB image and the current infrared image into a pre-trained neural network, acquiring the positions of passengers in the current RGB image, counting according to the positions of the passengers, and acquiring the number of the passengers.
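As an illustration only, and not the architecture disclosed here, a small PyTorch model could take a paired training RGB image and training infrared image as input by channel concatenation and regress the passenger count; the class name, layer sizes and fusion strategy are assumptions.

```python
import torch
import torch.nn as nn

class FusionCounter(nn.Module):
    """Toy dual-modality network: an RGB frame (3 channels) and an IR frame
    (1 channel) are stacked into a 4-channel input and regressed to a
    passenger count. All layer sizes are illustrative."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # predicted passenger count

    def forward(self, rgb, ir):
        x = torch.cat([rgb, ir], dim=1)  # (N, 4, H, W)
        return self.head(self.features(x).flatten(1))
```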
Obtaining the passenger positions and the passenger number through the neural network can effectively improve detection accuracy and efficiency. In a further implementation, image detection is performed on the current RGB image to obtain, for each pixel, the RGB probability value that the pixel belongs to a current passenger position, and image detection is performed on the current infrared image to obtain the corresponding infrared probability value of each pixel. The image detection may be carried out by a neural network.
Obtaining the passenger probability value of each pixel point belonging to the current passenger position according to the following formula:
passenger probability value = RGB probability value × RGB weight value + infrared probability value × (1 − RGB weight value).
And taking the pixel points with the passenger probability value larger than the preset probability threshold value as target pixel points, acquiring a target image formed by the target pixel points, performing circular detection on the target image, acquiring overhead images corresponding to the tops of the passengers, and acquiring the positions and the number of the current passengers according to the positions and the number of the overhead images.
Further, the RGB weight value is obtained according to the shooting light intensity: when the shooting light intensity falls within the safe shooting range, the higher the shooting light intensity, the larger the RGB weight value, and the infrared weight value is obtained by subtracting the RGB weight value from 1, where the RGB weight value is less than or equal to 0.5.
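A minimal per-pixel fusion sketch following the weighted-sum formula above; the mapping from shooting light intensity to the RGB weight (a linear ramp capped at 0.5 inside an assumed safe shooting range) and the range endpoints are illustrative assumptions.

```python
import numpy as np

def fuse_probabilities(rgb_prob, ir_prob, light_intensity,
                       safe_range=(200.0, 2000.0)):
    """rgb_prob, ir_prob: same-shaped arrays of per-pixel probabilities.
    Returns the fused passenger probability per pixel."""
    lo, hi = safe_range
    # assumed linear ramp, capped so the RGB weight never exceeds 0.5
    rgb_weight = np.clip((light_intensity - lo) / (hi - lo), 0.0, 1.0) * 0.5
    return rgb_prob * rgb_weight + ir_prob * (1.0 - rgb_weight)
```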
S104: and acquiring at least one adjacent infrared image of at least one frame adjacent to the current infrared image, and acquiring the current passenger position and the current passenger number according to the current infrared image and the adjacent infrared image.
In a specific implementation scenario, if the shooting light intensity is less than or equal to the preset intensity threshold, the quality of the current RGB image acquired at this time is not high, and therefore, the position of the passenger and the number of passengers can be mainly acquired by using the infrared camera device. And acquiring an adjacent infrared image adjacent to the current infrared image by at least one frame, wherein the adjacent infrared image can be at least one frame of image after the current infrared image or at least one frame of image before the current infrared image. And comparing the current infrared image with the adjacent infrared images to obtain a difference image, and if the difference image is caused by the movement of the passenger, obtaining the current passenger position and the current passenger number according to the difference image.
In one implementation scenario, the adjacent infrared image of the next frame after the current infrared image is acquired. In other implementation scenarios, the adjacent infrared image may also be acquired after an interval of a preset time length; this preset time length needs to be short, for example less than 0.1 s, to avoid the passenger moving too far. A passenger is moving while getting on or off the vehicle, so the passenger's position differs between the two images, which causes a difference between them; a difference image area is obtained from the two images, and the passenger positions and number can be obtained from the part of the difference image area caused by passenger movement. The current infrared image and the adjacent infrared image are binarized to obtain the current binarized image and the adjacent binarized image, and the two binarized images are compared through a frame difference method to obtain the difference image area.
Each pixel in the current binarized image is compared with the corresponding pixel in the adjacent binarized image; if the pixel values of a pixel differ between the two images, the pixel is marked as a difference pixel, and the area formed by difference pixels is taken as the difference image area. Further, the neighbouring pixels around each difference pixel are examined (for example, within a circle of several pixels' radius centred on the current difference pixel); if more than a preset number (for example, 5) of these neighbouring pixels are also difference pixels, the difference pixel is retained, otherwise it is ignored.
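A brief OpenCV sketch of this binarization, frame-difference and neighbour-filtering step; the binarization threshold of 128, the 3×3 neighbourhood and the neighbour count of 5 are assumed example values, and the inputs are assumed to be single-channel 8-bit infrared frames.

```python
import cv2
import numpy as np

def difference_region(curr_ir, adj_ir, thresh=128, min_neighbors=5):
    """Binarize two IR frames, take their frame difference, and keep only
    difference pixels that have enough differing neighbours."""
    _, curr_bin = cv2.threshold(curr_ir, thresh, 255, cv2.THRESH_BINARY)
    _, adj_bin = cv2.threshold(adj_ir, thresh, 255, cv2.THRESH_BINARY)
    diff = cv2.absdiff(curr_bin, adj_bin)  # frame difference
    # count differing pixels in the 3x3 window around each difference pixel
    neighbors = cv2.filter2D((diff > 0).astype(np.uint8), -1, np.ones((3, 3)))
    diff[neighbors < min_neighbors] = 0    # drop isolated difference pixels
    return diff
```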
The image to be detected is acquired at the position corresponding to the difference image area in the adjacent binarized image or the adjacent infrared image. Because of the passenger's movement within the short interval, the difference image area may not include a complete passenger overhead region, so the range can be expanded appropriately around the difference image area to obtain the image to be detected. Hough circle detection is performed on the image to be detected to obtain the circular images it includes; the circular images are screened, and those whose diameter lies within a preset range are taken as passenger head areas, from which the current passenger number and passenger positions are obtained. The preset range can be derived from big-data statistics or calculated in proportion from the typical diameter range of the human head.
In other implementation scenarios, the pre-screening condition during Hough circle detection is that the diameter lies within a target detection diameter range, so that the selected circles correspond to passenger head regions. The focal distance range of the infrared camera device is obtained, the target detection diameter range is obtained according to the focal distance range and the range of human heights, the circles obtained by Hough circle detection whose diameters fall within the target detection diameter range are taken as passenger head images, and the passenger positions and the passenger number are obtained from the positions and the number of these head images.
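A sketch of Hough circle detection restricted to an assumed head-diameter range; the OpenCV parameters shown are placeholder values, and the input is assumed to be an 8-bit single-channel image.

```python
import cv2
import numpy as np

def detect_heads(image_to_check, min_diam_px=30, max_diam_px=90):
    """Detect circles whose diameter falls in an assumed head-diameter range
    (in pixels) and return them as candidate passenger head positions."""
    blurred = cv2.GaussianBlur(image_to_check, (5, 5), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=min_diam_px, param1=100, param2=30,
                               minRadius=min_diam_px // 2,
                               maxRadius=max_diam_px // 2)
    if circles is None:
        return [], 0
    heads = [(int(x), int(y)) for x, y, r in np.around(circles[0])]
    return heads, len(heads)  # current passenger positions and count
```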
In another implementation scenario, Hough circle detection is performed on the adjacent binarized image, circular areas that may correspond to passenger head regions are selected, and each circular area is compared with the difference image area; if a circular area contains difference image area exceeding a preset area, the circular area is taken as a passenger head region.
In another implementation scenario, the adjacent infrared images are images at least one frame before the current infrared image. The adjacent infrared images are processed through Kalman filtering to obtain the position of each passenger in the several adjacent infrared images and the trend of position change, and prediction based on this trend gives the predicted position of each passenger. The passenger detection position in the current infrared image is obtained by a Hough circle detection method similar to the preceding steps, and an intersection-over-union operation is performed on the passenger predicted position and the passenger detection position. When the intersection-over-union result is larger than a preset threshold and the size of the frame of the passenger predicted position and/or the passenger detection position is smaller than a preset size, the passenger detection position and the passenger predicted position are considered to correspond to the same passenger; that is, the current passenger position can be obtained from the passenger predicted position and/or the passenger detection position, for example by taking either of them as the current passenger position, and the current passenger number in the current infrared image is then counted from the current passenger positions.
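A brief sketch of the intersection-over-union matching between predicted and detected positions; the IoU threshold and the box-size limit are assumed values, and the Kalman prediction step itself is omitted here.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter) if inter else 0.0

def confirm_passengers(predicted_boxes, detected_boxes,
                       iou_thresh=0.5, max_size=80):
    """Treat a prediction and a detection as the same passenger when their IoU
    exceeds the threshold and the detection box is small enough; the threshold
    and size limit are assumed example values."""
    confirmed = []
    for det in detected_boxes:
        w, h = det[2] - det[0], det[3] - det[1]
        if max(w, h) > max_size:
            continue
        if any(iou(det, pred) > iou_thresh for pred in predicted_boxes):
            confirmed.append(det)
    return confirmed, len(confirmed)
```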
S105: the method comprises the steps of obtaining the moving trend and the passenger quantity change value of each passenger according to a plurality of current RGB images and current infrared images obtained when a vehicle is located at a stop position, and obtaining the number of passengers getting on or off the bus according to the moving trend and the passenger quantity change value of each passenger.
In a specific implementation scenario, while passengers pass through the vehicle door, multiple shots are taken, or shots are taken periodically at preset time intervals, and the current RGB images and current infrared images at multiple moments are acquired, so that the passenger positions and passenger numbers at multiple moments are obtained. The passenger positions and passenger numbers at every two adjacent moments are compared to obtain the movement trend of each passenger (moving out through the vehicle door or moving in through the vehicle door), so that whether each passenger is getting on or off is determined; by counting these boarding and alighting events, the number of passengers getting on and off during the current stop can be obtained.
Specifically, within the duration of the stop, whenever a passenger passes through the vehicle door, the shooting system is driven to shoot. The current RGB image and current infrared image of every frame may be acquired, or the current RGB images and current infrared images at several time points may be acquired at preset time intervals. The current passenger positions and the current passenger number corresponding to each group of current RGB image and current infrared image are obtained, and the current passenger positions are marked. For two adjacent time points, a current passenger position at the earlier moment and a current passenger position at the later moment whose distance is less than a preset threshold are regarded as the positions of the same passenger at the two moments. This operation is performed on the current passenger positions at all moments and, combined with the current passenger number at each moment, yields the moving path and moving direction of each passenger, so that the number of passengers getting on and off at each door during the current stop can be obtained.
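For illustration, a simplified tracking sketch that links positions between consecutive captures and decides boarding versus alighting by whether a passenger crosses an assumed door line; the matching distance, the door-line coordinate and the greedy nearest-neighbour linking are simplifications of the movement-trend logic described above, not the disclosed implementation.

```python
import numpy as np

def count_boarding(tracks_per_time, door_y, match_dist=40.0):
    """tracks_per_time: list of position lists [(x, y), ...], one per capture.
    Positions are linked greedily between consecutive captures; crossing the
    assumed door line door_y decides boarding vs alighting."""
    boarded = alighted = 0
    for prev, curr in zip(tracks_per_time, tracks_per_time[1:]):
        if not curr:
            continue
        for px, py in prev:
            # nearest current position is taken as the same passenger
            cx, cy = min(curr, key=lambda p: np.hypot(p[0] - px, p[1] - py))
            if np.hypot(cx - px, cy - py) > match_dist:
                continue
            if py < door_y <= cy:    # crossed the door line (assumed inward)
                boarded += 1
            elif cy < door_y <= py:  # crossed the door line (assumed outward)
                alighted += 1
    return boarded, alighted
```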
As can be seen from the above description, in this embodiment, when the shooting light intensity for shooting the current RGB image is greater than the preset intensity threshold, the current RGB image is of high quality; the current RGB image is compared with the current infrared image, and the current passenger position and the current passenger number can be accurately obtained, so that an accurate number of passengers getting on and off is obtained. When the shooting light intensity is less than or equal to the preset intensity threshold, the current RGB image is of lower quality and has little reference value; an adjacent infrared image of the current infrared image is obtained, the difference image area between the current infrared image and the adjacent infrared image is obtained, and the current passenger position and the current passenger number are obtained from it, so that the counting of the number of passengers getting on and off remains accurate and reliable.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an embodiment of a system for counting passengers getting on and off according to the present invention. The statistical system 10 for the number of passengers getting on and off is applied to a shooting system comprising an RGB camera device and an infrared camera device, and includes an acquisition module 11, a judgment module 12, a comparison module 13, a detection module 14 and a number-of-people module 15.
The acquisition module 11 is used for shooting according to a preset shooting time node when a passenger passes through a vehicle door when the vehicle is located at a stop position, and acquiring a current RGB image and a current infrared image; the judging module 12 is configured to judge whether the intensity of the shooting light for shooting the current RGB image is greater than a preset intensity threshold; the comparison module 13 is configured to compare the current RGB image with the current infrared image to obtain a current passenger position and a current passenger number if the shooting light intensity is greater than a preset intensity threshold; the detection module 14 is configured to, if the intensity of the shooting light is less than or equal to a preset intensity threshold, obtain an adjacent infrared image of at least one frame adjacent to the current infrared image, and obtain a current passenger position and a current passenger number according to the current infrared image and the adjacent infrared image; the number of people module 15 is used for obtaining the movement trend and the passenger number change value of each passenger according to a plurality of current RGB images and current infrared images obtained when the vehicle is at the stop position, and obtaining the number of people getting on or off the bus according to the movement trend and the passenger number change value of each passenger.
The comparison module 13 is configured to obtain installation parameters of the RGB camera device and the infrared camera device, where the installation parameters include a horizontal distance between the RGB camera device and the infrared camera device and a vertical height between the RGB camera device and the infrared camera device; and carrying out image correction on the current RGB image and the current infrared image according to the installation parameters, so that each pixel point in the current RGB image and each pixel point in the current infrared image correspond to each other one by one.
The comparison module 13 is used for carrying out binocular ranging on the current passenger position based on the installation parameters, obtaining the height of the passenger at the current passenger position, and if the height of the passenger meets the preset height range, keeping the current passenger position.
The adjacent infrared image is a frame of image after the current infrared image; the detection module 14 is configured to perform binarization processing on the current infrared image and the adjacent infrared image to obtain a current binarized image and an adjacent binarized image; compare the current binarized image with the adjacent binarized image through a frame difference method to obtain a difference image area, and obtain an image to be detected at the corresponding position of the difference image area in the adjacent infrared image; and perform Hough circle detection on the image to be detected and acquire the current passenger positions and the current passenger number according to the detected circular regions.
The adjacent infrared image is an image at least one frame before the current infrared image; the detection module 14 is further configured to process the adjacent binarized image through Kalman filtering to obtain the passenger predicted position, obtain the passenger detection position in the current binarized image, perform an intersection-over-union operation on the passenger predicted position and the passenger detection position, and obtain the current passenger position and the current passenger number according to the passenger predicted position and the passenger detection position when the result of the intersection-over-union operation meets a preset requirement.
The comparison module 13 is configured to obtain an RGB passenger image area in the current RGB image, obtain a current mapping area corresponding to an image position of the RGB passenger image area in the current infrared image, and indicate that the current mapping area corresponds to one passenger if the current mapping area includes a circular hotspot; and acquiring the current passenger position and the current passenger number according to the position and the number of the current mapping area comprising the circular hot spot.
The comparison module 13 is configured to obtain a training image set, where the training image set includes a plurality of training image groups and passenger positions and/or passenger numbers corresponding to each training image group, and each training image group includes a training RGB image and a training infrared image corresponding to the training RGB image; training a neural network through a training image set to obtain a pre-trained neural network; and inputting the current RGB image and the current infrared image into a pre-trained neural network to obtain the current passenger position and the current passenger number.
As can be seen from the above description, in this embodiment, when the shooting light intensity for shooting the current RGB image is greater than the preset intensity threshold, the current RGB image is of high quality; the current RGB image is compared with the current infrared image, and the current passenger position and the current passenger number can be accurately obtained, so that an accurate number of passengers getting on and off is obtained. When the shooting light intensity is less than or equal to the preset intensity threshold, the current RGB image is of lower quality and has little reference value; an adjacent infrared image of the current infrared image is obtained, the difference image area between the current infrared image and the adjacent infrared image is obtained, and the current passenger position and the current passenger number are obtained from it, so that the counting of the number of passengers getting on and off remains accurate and reliable.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a control terminal according to an embodiment of the present invention. The control terminal 20 includes a processor 21 and a memory 22. The processor 21 is coupled to a memory 22. The memory 22 has stored therein a computer program which is executed by the processor 21 in operation to carry out the method as shown in fig. 1. The detailed methods can be referred to above and are not described herein.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a storage medium according to an embodiment of the present invention. The storage medium 30 stores at least one computer program 31, and the computer program 31 is used for being executed by a processor to implement the method shown in fig. 1, and the detailed method can be referred to above and is not described herein again. In one embodiment, the storage medium 30 may be a memory chip in a terminal, a hard disk, or a removable hard disk or a flash disk, an optical disk, or other readable and writable storage tool, and may also be a server or the like.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above may be implemented by instructing relevant hardware through a computer program, which may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
All possible combinations of the technical features in the above embodiments may not be described for the sake of brevity, but should be considered as being within the scope of the present disclosure as long as there is no contradiction between the combinations of the technical features.
The above examples only express several embodiments of the present application, and the description thereof is relatively specific and detailed, but is not to be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of the present patent application shall be subject to the appended claims.
Claims (8)
1. A statistical method for the number of passengers getting on and off is characterized by being applied to a shooting system, wherein the shooting system comprises an RGB camera device and an infrared camera device; the statistical method for the number of people getting on or off the bus comprises the following steps: when the vehicle is located at a stop position, shooting according to a preset shooting time node when a passenger is detected passing through the vehicle door, and acquiring a current RGB image and a current infrared image; judging whether the intensity of shooting light for shooting the current RGB image is greater than a preset intensity threshold value or not; if the shooting light intensity is larger than the preset intensity threshold value, acquiring installation parameters of the RGB camera device and the infrared camera device, wherein the installation parameters comprise the horizontal distance between the RGB camera device and the infrared camera device and the vertical height between the RGB camera device and the infrared camera device, comparing the current RGB image with the current infrared image, acquiring the position of a current passenger and the number of the current passengers, performing binocular distance measurement on the position of the current passenger based on the installation parameters, acquiring the height of the passenger at the position of the current passenger, and if the height of the passenger meets the preset height range, keeping the position of the current passenger; if the shooting light intensity is smaller than or equal to the preset intensity threshold value, acquiring an adjacent infrared image at least one frame adjacent to the current infrared image, and performing binarization processing on the current infrared image and the adjacent infrared image to acquire a current binarization image and an adjacent binarization image; comparing the current binarization image with the adjacent binarization image through a frame difference method to obtain a difference image area, and obtaining an image to be detected of the difference image area at a corresponding position in the adjacent infrared image; performing Hough circle detection on the image to be detected, and acquiring the position of the current passenger and the number of the current passengers according to the detected circle area; and acquiring the movement trend and the passenger quantity change value of each passenger according to the current RGB images and the current infrared images acquired when the vehicle is at the stop position, and acquiring the number of people getting on or off the bus according to the movement trend and the passenger quantity change value of each passenger.
2. The statistical method of the number of passengers getting on and off as claimed in claim 1, wherein the step of comparing the current RGB image with the current infrared image is preceded by: and carrying out image correction on the current RGB image and the current infrared image according to the installation parameters, so that each pixel point in the current RGB image and the current infrared image corresponds to each other one by one.
3. The statistical method of the number of passengers getting on and off according to claim 1, wherein the adjacent infrared image is an image of at least one frame before the current infrared image; the step of obtaining the current passenger position and the current passenger number according to the current infrared image and the adjacent infrared image comprises the following steps: and processing the adjacent binary images through Kalman filtering to obtain passenger predicted positions, obtaining passenger detection positions in the current binary images, carrying out intersection-comparison operation on the passenger predicted positions and the passenger detection positions, and obtaining the current passenger positions and the current passenger quantity according to the passenger predicted positions and the passenger detection positions when the result of the intersection-comparison operation meets the preset requirement.
4. The statistical method for the number of passengers getting on and off according to claim 1, wherein the step of comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number comprises: acquiring an RGB passenger image area in the current RGB image, acquiring a current mapping area corresponding to the image position of the RGB passenger image area in the current infrared image, and if the current mapping area comprises a circular hot spot, indicating that the current mapping area corresponds to a passenger; and acquiring the current passenger position and the current passenger number according to the position and the number of the current mapping area comprising the circular hot spot.
5. The statistical method of the number of passengers getting on and off according to claim 1, wherein the step of comparing the current RGB image with the current infrared image to obtain the current passenger position and the current passenger number comprises: acquiring a training image set, wherein the training image set comprises a plurality of training image groups and passenger positions and/or passenger numbers corresponding to the training image groups, and each training image group comprises a training RGB image and a training infrared image corresponding to the training RGB image; training a neural network through the training image set to obtain a pre-trained neural network; and inputting the current RGB image and the current infrared image into the pre-trained neural network to obtain the current passenger position and the current passenger number.
6. A statistical system for the number of passengers getting on and off is characterized by being applied to a shooting system, wherein the shooting system comprises an RGB camera device and an infrared camera device; the system comprises the following modules: the acquisition module is used for shooting according to a preset shooting time node when a passenger is detected passing through the vehicle door while the vehicle is located at a stop position, and acquiring a current RGB image and a current infrared image; the judging module is used for judging whether the intensity of the shooting light for shooting the current RGB image is greater than a preset intensity threshold value or not; the comparison module is used for acquiring the installation parameters of the RGB camera device and the infrared camera device if the shooting light intensity is greater than the preset intensity threshold value, comparing the current RGB image with the current infrared image to acquire the current passenger position and the current passenger number, wherein the installation parameters comprise the horizontal distance between the RGB camera device and the infrared camera device and the vertical height between the RGB camera device and the infrared camera device; performing binocular ranging on the current passenger position based on the installation parameters to obtain the height of the passenger at the current passenger position, and if the height of the passenger meets a preset height range, keeping the current passenger position; the detection module is used for acquiring an adjacent infrared image at least one frame adjacent to the current infrared image if the shooting light intensity is smaller than or equal to the preset intensity threshold value, and performing binarization processing on the current infrared image and the adjacent infrared image to acquire a current binarization image and an adjacent binarization image; comparing the current binarization image with the adjacent binarization image through a frame difference method to obtain a difference image area, and obtaining an image to be detected of the difference image area at a corresponding position in the adjacent infrared image; performing Hough circle detection on the image to be detected, and acquiring the position of the current passenger and the number of the current passengers according to the detected circle area; and the number module is used for acquiring the movement trend and the passenger number change value of each passenger according to the current RGB images and the current infrared images acquired when the vehicle is positioned at a stop station, and acquiring the number of passengers getting on or off the bus according to the movement trend and the passenger number change value of each passenger.
7. A storage medium storing a computer program which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 5.
8. A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210401166.9A CN114511821B (en) | 2022-04-18 | 2022-04-18 | Statistical method and system for number of persons getting on and off, computer equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210401166.9A CN114511821B (en) | 2022-04-18 | 2022-04-18 | Statistical method and system for number of persons getting on and off, computer equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114511821A CN114511821A (en) | 2022-05-17 |
CN114511821B true CN114511821B (en) | 2022-07-26 |
Family
ID=81554618
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210401166.9A Active CN114511821B (en) | 2022-04-18 | 2022-04-18 | Statistical method and system for number of persons getting on and off, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114511821B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11836825B1 (en) * | 2022-05-23 | 2023-12-05 | Dell Products L.P. | System and method for detecting postures of a user of an information handling system (IHS) during extreme lighting conditions |
CN117593341B (en) * | 2024-01-19 | 2024-05-07 | 深圳市超诺科技有限公司 | System and method for processing target object monitoring data based on hunting camera |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018148896A1 (en) * | 2017-02-16 | 2018-08-23 | 深圳市锐明技术股份有限公司 | Method and device for counting number of passengers in vehicle |
CN108875562A (en) * | 2018-04-28 | 2018-11-23 | 华南师范大学 | A kind of public transport people flow rate statistical method and system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101763669A (en) * | 2009-10-26 | 2010-06-30 | 杭州六易科技有限公司 | Public bus people flow rate counting device |
CN106709444A (en) * | 2016-12-19 | 2017-05-24 | 集美大学 | Binocular infrared photography-based bus passenger flow counting device and method |
CN110147713A (en) * | 2019-03-28 | 2019-08-20 | 石化盈科信息技术有限责任公司 | A kind of method for detecting fatigue driving and system |
CN110675447A (en) * | 2019-08-21 | 2020-01-10 | 电子科技大学 | People counting method based on combination of visible light camera and thermal imager |
JP7303384B2 (en) * | 2020-05-28 | 2023-07-04 | 株式会社日立国際電気 | Passenger number counting system and passenger number counting device |
CN112183287A (en) * | 2020-09-22 | 2021-01-05 | 四川阿泰因机器人智能装备有限公司 | People counting method of mobile robot under complex background |
-
2022
- 2022-04-18 CN CN202210401166.9A patent/CN114511821B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018148896A1 (en) * | 2017-02-16 | 2018-08-23 | 深圳市锐明技术股份有限公司 | Method and device for counting number of passengers in vehicle |
CN108875562A (en) * | 2018-04-28 | 2018-11-23 | 华南师范大学 | A kind of public transport people flow rate statistical method and system |
Non-Patent Citations (1)
Title |
---|
Passenger detection and counting for public transport system; Saddam Hussain Khan et al.; NED University Journal of Research; 2020-03-31; pp. 1-14 *
Also Published As
Publication number | Publication date |
---|---|
CN114511821A (en) | 2022-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114511821B (en) | Statistical method and system for number of persons getting on and off, computer equipment and storage medium | |
CN111680746B (en) | Vehicle damage detection model training, vehicle damage detection method, device, equipment and medium | |
CN108037770A (en) | Unmanned plane power transmission line polling system and method based on artificial intelligence | |
JP6786279B2 (en) | Image processing device | |
CN105913685A (en) | Video surveillance-based carport recognition and intelligent guide method | |
CN108021848A (en) | Passenger flow volume statistical method and device | |
Bell et al. | A novel system for nighttime vehicle detection based on foveal classifiers with real-time performance | |
EP2057581A1 (en) | Detection and categorization of light spots using a camera in a vehicle environment | |
CN111523397B (en) | Intelligent lamp post visual identification device, method and system and electronic equipment thereof | |
CN109671090B (en) | Far infrared ray-based image processing method, device, equipment and storage medium | |
CN114943923B (en) | Method and system for recognizing explosion flare smoke of cannonball based on video of deep learning | |
CN112613336A (en) | Method and device for generating an object classification of an object | |
CN115546738A (en) | Rail foreign matter detection method | |
CN115180522A (en) | Safety monitoring method and system for hoisting device construction site | |
CN112585655A (en) | Unmanned electronic traffic police duty system based on 5G | |
CN106778675B (en) | A kind of recognition methods of target in video image object and device | |
CN111325708B (en) | Transmission line detection method and server | |
CN112329631A (en) | Method for carrying out traffic flow statistics on expressway by using unmanned aerial vehicle | |
KR102219906B1 (en) | Method and apparatus for automatically generating learning data for machine learning | |
CN115035543B (en) | Big data-based movement track prediction system | |
CN115205827A (en) | Image recognition method and device for road detection, electronic equipment and medium | |
CN115272284A (en) | Power transmission line defect identification method based on image quality evaluation | |
CN114926724A (en) | Data processing method, device, equipment and storage medium | |
CN109766865B (en) | Watershed, multi-region local area fusion and feature tracking passenger flow statistical method | |
KR101954404B1 (en) | CCTV System and Method for improving Recognition of Car Number by Using dynamic Thresholds |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20230620
Address after: 13C-18, Caihong Building, Caihong Xindu, No. 3002, Caitian South Road, Gangsha Community, Futian Street, Futian District, Shenzhen, Guangdong 518033
Patentee after: Core Computing Integrated (Shenzhen) Technology Co.,Ltd.
Address before: 518000 1001, building G3, TCL International e city, Shuguang community, Xili street, Nanshan District, Shenzhen City, Guangdong Province
Patentee before: Shenzhen Aishen Yingtong Information Technology Co.,Ltd.
TR01 | Transfer of patent right |