WO2022215394A1 - Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle - Google Patents
- Publication number
- WO2022215394A1 (PCT/JP2022/008921)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- image
- area
- detection
- congestion degree
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
Definitions
- the present invention relates to a vehicle congestion level determination method and a vehicle congestion level determination system.
- Patent Literature 1 discloses a monitoring device for the number of passengers in a vehicle, which includes a first imaging device and a processing unit and is attached inside the vehicle to monitor the number of passengers in the vehicle.
- the dynamic detection module of Patent Literature 1 determines that there is a person in a recognition block when the image variation value is greater than a threshold.
- An object of the present invention is to provide a vehicle congestion level determination method and a vehicle congestion level determination system that can appropriately determine the congestion level of vehicles that transport passengers.
- a vehicle congestion degree determination method includes the steps of: acquiring an image of the interior of a vehicle that transports passengers; detecting people from the image and generating a rectangular detection area surrounding each of the detected people; and determining the congestion degree of the vehicle based on a target area of the image and the detection areas, wherein the target area is a partial area of the image in which a passenger standing in the aisle of the vehicle is imaged, and in the determining step the congestion degree of the vehicle is determined based on the number of the detection areas overlapping the target area.
- a vehicle congestion degree determination method comprises the steps of: detecting a person from an image; generating a rectangular detection area surrounding each detected person; and determining the degree of congestion of the vehicle based on a target area of the image and the detection areas.
- the region of interest is the portion of the image in which a passenger standing in the aisle of the vehicle is imaged.
- the degree of vehicle congestion is determined based on the number of detection areas overlapping the target area. According to the vehicle congestion degree determination method according to the present invention, it is possible to appropriately determine the congestion degree of vehicles that transport passengers.
- FIG. 1 is a block diagram of a vehicle congestion degree determination system according to an embodiment.
- FIG. 2 is a diagram showing an image captured by the first camera.
- FIG. 3 is a diagram showing an image captured by the second camera.
- FIG. 4 is a diagram for explaining determination of the degree of overlap.
- FIG. 5 is a flow chart showing the operation of the embodiment.
- the vehicle congestion degree determination system 1 includes an on-vehicle device 2, a camera 3, and an office PC 7.
- the vehicle-mounted device 2 and the camera 3 are mounted on a vehicle 100 that transports passengers.
- Vehicle 100 of the present embodiment is a shared bus, for example, a route bus that travels on a predetermined route.
- the vehicle 100 allows passengers to get on and off at each stop on the route.
- the vehicle 100 is equipped with a door sensor 4 in addition to the vehicle-mounted device 2 and the camera 3 .
- the vehicle-mounted device 2 is, for example, a drive recorder or digital tachograph that records the operation status of the vehicle 100.
- the vehicle-mounted device 2 has a CPU 21, a memory 22, a GPS receiver 23, and a communication unit 24.
- the CPU 21 is a computing device that performs various computations.
- the CPU 21 executes the operations of this embodiment according to a program stored in the memory 22, for example.
- the memory 22 is a storage unit that includes volatile memory and nonvolatile memory.
- the GPS receiver 23 calculates the current position of the vehicle 100 based on signals transmitted from satellites.
- the communication unit 24 is a communication module that communicates with the radio base station 5. The communication unit 24 performs radio communication with the radio base station 5 via the antenna of the vehicle 100 in accordance with instructions from the CPU 21.
- the camera 3 is an imaging device that captures the interior of the vehicle 100 and outputs images G1 and G2.
- the position of the camera 3 and the angle of view of the camera 3 are set so that the passenger in the vehicle can be imaged.
- the vehicle 100 of this embodiment has a first camera 31 and a second camera 32 as the cameras 3 .
- the first camera 31 captures an image of the vicinity of the exit in the vehicle.
- the second camera 32 images the vicinity of the boarding gate inside the vehicle.
- the vehicle-mounted device 2 acquires the image captured by the camera 3 and stores it in the memory 22 .
- the door sensor 4 is a sensor that detects the state of the door that opens and closes the entrance/exit of the vehicle 100 .
- the door sensor 4 detects, for example, whether the door is fully closed.
- the door sensor 4 is arranged, for example, at each of the entrance door and the exit door. A detection result of the door sensor 4 is sent to the CPU 21 .
- the vehicle-mounted device 2 transmits data for determining the degree of congestion to the outside via the communication unit 24 .
- the data to be transmitted includes, for example, the image captured by the camera 3, the image capturing time of the image, the position of the vehicle 100 at the image capturing time, the identification code of the vehicle 100, and the like.
- the image data transmitted by the communication unit 24 is, for example, still image data.
- the transmitted data is stored, for example, in the server 6 connected to the Internet network NW.
- the office PC 7 is composed of, for example, a general-purpose computer installed in an office of a company or the like.
- the office PC 7 has a function of managing the operation status of the vehicles 100 to be managed and a function of determining the degree of congestion of the vehicles 100 .
- the office PC 7 has a CPU 71, a memory 72, a communication unit 73, and an external input interface 74.
- Office PC 7 communicates with server 6 and wireless base station 5 via communication unit 73 and Internet network NW.
- the office PC 7 manages the operation status and determines the degree of congestion, for example, according to a program read from the memory 72 to the CPU 71 .
- an example of the image G1 captured by the first camera 31 is shown in FIG. 2.
- the inside 101 of the vehicle 100 is imaged in the image G1. More specifically, the image G1 includes a driver's seat 102, an aisle 103, and an exit 105.
- the exit 105 is arranged in the front part of the vehicle 100 and is opened and closed by the exit door 104.
- an example of the image G2 captured by the second camera 32 is shown in FIG. 3.
- the inside 101 of the vehicle 100 is imaged in the image G2. More specifically, the image G2 includes the aisle 103, the boarding gate 107, the rear seat 108, and the front seat 109.
- the boarding gate 107 is arranged in the middle part of the vehicle 100 and is opened and closed by the boarding door 106 .
- the rear seat 108 is a seat arranged behind the entrance 107 of the vehicle.
- the rear seat 108 faces forward of the vehicle.
- the front seat 109 is a seat arranged in front of the vehicle from the boarding gate 107 .
- the front seat 109 faces in the vehicle width direction.
- the vehicle-mounted device 2 transmits the captured images G1 and G2 after the passenger has completed boarding and alighting. For example, the vehicle-mounted device 2 transmits images G1 and G2 captured when both the exit door 104 and the boarding door 106 are fully closed to the Internet network NW. It is possible to accurately estimate the degree of congestion in the vehicle interior 101 from the images G1 and G2 captured when the movement of passengers is small.
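The door-gated capture described above can be sketched as follows; the function name and the boolean interface are illustrative assumptions, not taken from the patent text. The point is simply that images are transmitted only when both doors are fully closed, i.e. when passenger movement is minimal.

```python
# Hedged sketch: transmit G1/G2 only after boarding and alighting are
# complete, signalled here by both door sensors reporting "fully closed".
# (The names below are illustrative, not the patent's own identifiers.)

def should_capture(exit_door_closed: bool, boarding_door_closed: bool) -> bool:
    """Capture/transmit images only when both doors are fully closed."""
    return exit_door_closed and boarding_door_closed
```

In practice the on-vehicle device would poll the door sensors and trigger the camera only when this predicate holds.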
- the office PC 7 acquires data for determining the degree of congestion of the vehicles 100 from the server 6. Office PC 7 acquires data via communication unit 73 and stores the acquired data in memory 72 .
- the CPU 71 has a detection unit 71a, an extraction unit 71b, and a determination unit 71c.
- the detection unit 71a, the extraction unit 71b, and the determination unit 71c may be part of a program executed by the CPU 71, may be part of a circuit included in the CPU 71, or may be a chip or the like mounted on the CPU 71. That is, the CPU 71 may be configured so as to be capable of executing the function of the detection unit 71a, the function of the extraction unit 71b, and the function of the determination unit 71c.
- the vehicle congestion degree determination system 1 determines the congestion degree of the vehicle 100 based on two indices.
- the first indicator is passenger density.
- the CPU 71 detects people from the images G1 and G2 and calculates the density of passengers in the aisle 103.
- the second indicator is the total number of detected passengers.
- the CPU 71 calculates the total number of passengers detected from the image G2. That is, the density is calculated from both images G1 and G2, and the total number of detections is calculated from image G2.
- the vehicle congestion degree determination system 1 according to the present embodiment can accurately determine the congestion degree of the vehicle 100 based on the passenger density and the total number of detected passengers.
- the detection unit 71a detects people from the images G1 and G2, and generates a detection area DA for each detected person. For example, the detection unit 71a generates a detection area DA for each detected person, as shown in FIG.
- the detection area DA is a so-called bounding box. In FIG. 2, three persons P0, P1, P2 are detected.
- the detection unit 71a generates a detection area DA0 surrounding the person P0, a detection area DA1 surrounding the person P1, and a detection area DA2 surrounding the person P2.
- the detection unit 71a detects all persons including standing persons and sitting persons, and generates a detection area DA for all detected persons.
- the detection unit 71a generates detection areas DA3, DA4, and DA5 for persons P3, P4, and P5 detected from the image G2, as shown in FIG.
- the detection area DA is an area in which a person is recognized in the image, and may not be an area surrounding the entire person.
- the extraction unit 71b extracts the detection area DA for which the density is to be calculated and the detection area DA for which the total number of detections is to be calculated from all the generated detection areas DA.
- Drivers of the vehicle 100 are excluded from the calculation target of the density and also excluded from the calculation target of the total detection number.
- the person P0 detected in FIG. 2 is the driver of the vehicle 100.
- the extraction unit 71b recognizes the driver as described below and excludes the driver from the calculation targets.
- the image G1 is divided into areas A11 to A19; each area has 240 pixels in the horizontal direction and 160 pixels in the vertical direction. That is, each area consists of 38,400 pixels.
- areas A11, A14, and A17 are set as areas where the driver is imaged.
- when the center of a detection area DA is located in any of the areas A11, A14, and A17, the extraction unit 71b determines that the detection area DA corresponds to the driver.
- the center of the detection area DA0 is located in the area A14. That is, it can be determined that the person P0 corresponding to the detection area DA0 is the driver. Therefore, the extraction unit 71b excludes the detection area DA0 from the calculation targets of the density and the total number of detections.
- the centers of the detection areas DA1 and DA2 are not located in any of the areas A11, A14 and A17.
- the extraction unit 71b employs the detection areas DA1 and DA2 as targets for density calculation. Since the detection areas DA1 and DA2 are areas of the image G1, they are not subject to calculation of the total number of detections.
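The area grid and the driver-exclusion rule above can be sketched as follows. This assumes a 720×480 image divided row-major into a 3×3 grid of 240×160 areas (so A11, A14, and A17 form the left column) and axis-aligned boxes given as (x1, y1, x2, y2) in pixels; the grid layout and all names are assumptions for illustration, not the patent's implementation.

```python
# Each area is 240 x 160 px (38,400 pixels); a 720x480 image yields a
# 3x3 grid. Assumed row-major numbering: A11 A12 A13 / A14 A15 A16 /
# A17 A18 A19, so the driver areas A11, A14, A17 are column 0.

AREA_W, AREA_H = 240, 160

def area_of(x: float, y: float) -> tuple:
    """Return (column, row) of the grid area containing point (x, y)."""
    return int(x // AREA_W), int(y // AREA_H)

def bbox_center(box) -> tuple:
    """Center of an axis-aligned bounding box (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return (x1 + x2) / 2.0, (y1 + y2) / 2.0

DRIVER_COLUMN = 0  # assumption: A11, A14, A17 form the leftmost column

def is_driver(box) -> bool:
    """A detection area counts as the driver if its center falls in a driver area."""
    col, _ = area_of(*bbox_center(box))
    return col == DRIVER_COLUMN
```

A detection area for which `is_driver` returns True would be dropped from both the density and the total-detection calculations, matching the extraction unit's behavior described above.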
- like the image G1, the image G2 is divided into areas A21 to A29 of the same size: 240 pixels in the horizontal direction and 160 pixels in the vertical direction, that is, 38,400 pixels per area.
- the image G2 does not include the area where the driver is imaged. Therefore, the extraction unit 71b does not determine whether or not the detection area DA corresponds to the driver with respect to the image G2.
- areas A23, A26, and A29 are areas where seated passengers are imaged. In the following description, an area in which a seated passenger is imaged is referred to as a "first area".
- when the center of a detection area DA is located in any of the areas A23, A26, and A29, the extraction unit 71b determines that the detection area DA corresponds to a seated passenger.
- the center of detection area DA5 is located in area A29. That is, it can be determined that the detected person P5 is a passenger seated.
- the extraction unit 71b excludes the detection area DA5 from the density calculation target. On the other hand, the extraction unit 71b adopts the detection area DA5 as a calculation target for the total number of detections.
- the centers of the detection areas DA3 and DA4 are not located in any of the areas A23, A26 and A29. Therefore, the extraction unit 71b adopts the detection areas DA3 and DA4 as targets for calculating the density and the total number of detections.
- the determination unit 71c calculates the density and the total number of detections, and determines the degree of congestion of the vehicles 100 based on the calculation results.
- the density of this embodiment is calculated as follows. In the image G1 shown in FIG. 2, the area for which the density is calculated is area A15. Area A15 is an area where a passenger standing at the end of aisle 103 on the exit 105 side is imaged.
- the determination unit 71c sets the area A15 as the target region Tg.
- the determination unit 71c determines whether the number of pixels overlapping the target area Tg is equal to or greater than a threshold Th1 for each detection area DA. For example, the determination unit 71c detects an area Ov1 where the target area Tg and the detection area DA1 overlap, as shown in FIG.
- the determination unit 71c increments the count number C0 for the target region Tg when the number of pixels in the region Ov1 is equal to or greater than the threshold Th1.
- the initial value of the count number C0 is zero.
- the threshold Th1 is, for example, 1,024 [pixels]. In other words, when the number of pixels in the area Ov1 is 1,024 or more, the detection area DA1 is counted as one element contributing to the density.
- the determination unit 71c similarly detects, for the other detection areas DA, the area overlapping the target area Tg, and increments the count number C0 for the target area Tg if the number of pixels in the overlapping area is equal to or greater than the threshold Th1. The determination unit 71c determines the degree of overlap with the target area Tg for all detection areas DA whose density is to be calculated. The determination unit 71c stores the calculated count number C0 as the count number C15 for the area A15. After that, the determination unit 71c initializes the count number C0.
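The overlap counting described above can be sketched as follows, assuming axis-aligned pixel rectangles given as (x1, y1, x2, y2); the function names and the coordinate convention are illustrative.

```python
# Minimal sketch of the per-detection-area overlap test against a target
# region Tg: count a detection area when its shared pixel count with Tg
# reaches the threshold (Th1 = 1,024 px for image G1).

def overlap_pixels(a, b) -> int:
    """Number of pixels shared by two axis-aligned rectangles."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    w = min(ax2, bx2) - max(ax1, bx1)
    h = min(ay2, by2) - max(ay1, by1)
    return w * h if w > 0 and h > 0 else 0

def count_overlaps(target, boxes, threshold=1024) -> int:
    """Count detection areas overlapping `target` by at least `threshold` px."""
    return sum(1 for box in boxes if overlap_pixels(target, box) >= threshold)
```

Running `count_overlaps` over the density-target detection areas yields the count number (C15 for area A15, and analogously C25 and C28 for image G2 with Th2 = 3,072 px).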
- the determination unit 71c determines the degree of overlap for the image G2.
- the target regions Tg are area A25 and area A28.
- Area A25 and area A28 are areas in which a passenger standing near boarding gate 107 in aisle 103 is imaged.
- the determination unit 71c sets one area A25 as the target area Tg.
- the determination unit 71c determines whether the number of pixels overlapping the target area Tg is equal to or greater than a threshold Th2 for each detection area DA of the image G2.
- the threshold Th2 is, for example, 3,072 [pixels].
- the determining unit 71c increments the count number C0 for the target area Tg when the number of pixels in the area where the target area Tg and the detection area DA overlap is equal to or greater than the threshold value Th2.
- the determination unit 71c determines the degree of overlap with the target area Tg for all detection areas DA whose density is to be calculated.
- the determination unit 71c stores the calculated count number C0 as the count number C25 for the area A25. After that, the determination unit 71c initializes the count number C0.
- the determination unit 71c sets the area A28 as the target area Tg, and similarly calculates the count number C0.
- the determination unit 71c stores the calculated count number C0 as the count number C28 for the area A28.
- the determination unit 71c calculates the total number of detected passengers for the image G2.
- the total detection number CA is the number of detection areas DA generated for the image G2. In the image G2 illustrated in FIG. 3, the detection areas DA3, DA4, and DA5 are the targets for calculation of the total number of detections. That is, the total detection number CA is 3.
- the CPU 71 determines the degree of congestion based on the flowchart shown in FIG.
- the CPU 71 executes the flowchart of FIG. 5 after executing the step of acquiring the images G1 and G2.
- in step S10, the detection unit 71a performs object detection on the images G1 and G2 and generates the detection areas DA.
- Step S10 is a step of detecting a person from the images G1 and G2 and generating a rectangular detection area DA surrounding each detected person. After step S10 is executed, the process proceeds to step S20.
- in step S20, the extraction unit 71b extracts the detection areas DA for which the density is to be calculated and the detection areas DA for which the total number of detections is to be calculated. After step S20 is executed, the process proceeds to step S30.
- in step S30, the determination unit 71c calculates the overlap between the detection areas DA and the target areas Tg.
- the determination unit 71c calculates count numbers C15, C25, and C28 for areas A15, A25, and A28, respectively.
- in step S40, the determination unit 71c determines whether or not the index of the degree of congestion satisfies the condition of level II.
- the vehicle congestion degree determination system 1 classifies the congestion degree levels into three levels: level I, level II, and level III.
- Level I is the least congested level and Level III is the most congested level.
- Level I is, for example, congestion with two or fewer standing passengers. Strollers, passengers trying to sit down, and bus company employees are not considered "standing passengers”.
- Level II is, for example, a degree of congestion in which three or more passengers are standing and the central portion of the aisle 103 is unused.
- Level III is, for example, the degree of congestion in which passengers are standing in the central portion of the aisle 103 .
- the central portion of the passage 103 is, for example, an intermediate portion between the boarding door 106 and the exit door 104 in the passage 103 .
- the condition of level II is referred to as the first condition.
- the first condition is that the following formula (1) holds. That is, when “the total value of the count number C15 of the area A15 and the count number C28 of the area A28 is 2 or more", it is determined that the degree of congestion is level II or more.
- the level I condition is "the total value of the count number C15 and the count number C28 is less than 2". If the first condition is satisfied, the determination unit 71c makes an affirmative determination in step S40 and proceeds to step S50. If the determination unit 71c makes a negative determination in step S40, the process proceeds to step S80. C15+C28≧2 (1)
- in step S50, the determination unit 71c determines whether or not the index of the degree of congestion satisfies the condition of level III.
- the condition of level III is that the second condition is satisfied in addition to the first condition.
- the second condition is that the following formulas (2), (3), (4), and (5) are all satisfied. That is, when the first condition is satisfied and "the count number C15 of the area A15 is 1 or more, the count number C25 of the area A25 and the count number C28 of the area A28 are both 2 or more, and the total detection number CA is 11 or more", it is determined that the degree of congestion is level III.
- when the second condition is satisfied, the determination unit 71c makes an affirmative determination in step S50 and proceeds to step S60. If the determination unit 71c makes a negative determination in step S50, the process proceeds to step S70.
- in step S60, the determination unit 71c substitutes the value of level III for the congestion degree level Lv.
- in step S70, the determination unit 71c substitutes the value of level II for the congestion degree level Lv.
- in step S80, the determination unit 71c substitutes the value of level I for the congestion degree level Lv.
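The decision logic of steps S40 to S80 can be condensed into one small function over the count numbers C15, C25, C28 and the total detection number CA, following formulas (1) to (5). This is a sketch of the published decision tree, not the patented implementation itself.

```python
# Levels per the flowchart: (1) C15+C28 >= 2 gates level II or higher;
# (2)-(5) C15 >= 1, C25 >= 2, C28 >= 2, CA >= 11 additionally gate level III.

def congestion_level(c15: int, c25: int, c28: int, ca: int) -> str:
    if c15 + c28 >= 2:                                       # formula (1)
        if c15 >= 1 and c25 >= 2 and c28 >= 2 and ca >= 11:  # formulas (2)-(5)
            return "III"
        return "II"
    return "I"
```

For example, three standing passengers split across A15 and A28 with few total detections yield level II, while a full aisle with eleven or more detected passengers yields level III.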
- the CPU 71 may transmit information about the congestion degree level Lv to the outside.
- the congestion level Lv may be wirelessly transmitted to the stop, for example.
- the congestion level Lv of the vehicle 100 arriving from now is displayed on the display of the stop.
- the congestion degree level Lv may be transmitted to the user's mobile terminal, for example.
- the congestion degree level Lv of the vehicle 100 may be displayed by application software on the user's portable terminal.
- the level Lv of the degree of congestion may be transmitted, for example, to other fixed-route buses running on the same route as the vehicle 100 .
- the level Lv of the degree of congestion can also be used for managing the operation schedule of the route bus including the vehicle 100 .
- the vehicle congestion level determination method includes an acquisition step, a generation step, and a determination step.
- in the acquisition step, images G1 and G2 of the interior 101 of the vehicle 100 that transports passengers are acquired.
- in the generating step, a person is detected from the images G1 and G2, and a rectangular detection area DA surrounding each detected person is generated. In the flowchart of FIG. 5, step S10 corresponds to the generating step.
- in the determining step, the congestion degree of the vehicle 100 is determined based on the target area Tg of the images G1 and G2 and the detection areas DA.
- steps S30 to S80 correspond to the determination steps.
- the target area Tg is a partial area of the images G1 and G2 and an area in which a passenger standing in the aisle 103 of the vehicle 100 is imaged.
- the degree of congestion of vehicles 100 is determined based on the number of detection areas DA overlapping the target area Tg.
- the number of detection areas DA overlapping the target area Tg represents the density of passengers standing in the aisle 103. Therefore, the vehicle congestion degree determination method according to the present embodiment can appropriately determine the congestion degree of vehicles that transport passengers. Further, by detecting a person from the entire wide images G1 and G2, better detection accuracy can be realized than when detecting a person from the target area Tg.
- the above vehicle congestion degree determination method is useful, for example, from the perspective of measures against infectious diseases such as the novel coronavirus (COVID-19). For example, it allows users to grasp the congestion degree of the vehicle 100 in a timely and accurate manner and encourages them to use vehicles 100 that are not crowded. Further, in the vehicle congestion degree determination method according to the present embodiment, the congestion degree is determined based on still images. Therefore, an increase in the load on each unit involved in the determination can be suppressed. For example, the amount of communication when transmitting an image captured by the camera 3 to the Internet network NW is small, so the communication load is reduced. Moreover, when the determination of the congestion degree is performed in the vehicle-mounted device 2, the calculation load in the vehicle-mounted device 2 is reduced.
- the target area Tg of the present embodiment is an area in which a passenger standing near the entrance/exit of the vehicle 100 is imaged.
- the vehicle congestion degree determination method according to the present embodiment can appropriately determine the congestion degree of the vehicle 100 .
- the image G2 of this embodiment has a first area in which a seated passenger is imaged.
- a detection area DA whose center coordinates are located in the first area is excluded from the calculation of the count number C0 of the detection areas overlapping the target area Tg. Therefore, the vehicle congestion degree determination method according to the present embodiment can accurately calculate the congestion degree of the aisle 103 by excluding seated passengers.
- in the determining step of the present embodiment, the degree of congestion of the vehicle 100 is determined based on both the count numbers of the detection areas overlapping the target areas Tg and the total detection number CA. Therefore, it is possible to appropriately determine the degree of congestion of the vehicles 100 based on the density of the passage 103 and the total detected number CA.
- the first image and the second image are acquired in the acquisition step.
- the first image corresponds to an image G1 of a passenger standing near the exit 105 in the aisle 103 .
- the second image corresponds to an image G2 of a passenger standing near the boarding gate 107 in the aisle 103 .
- the degree of congestion of vehicles 100 is determined based on the first image and the second image. Based on the two images, it is possible to determine the congestion degree of the vehicle 100 with higher accuracy.
- the vehicle congestion degree determination system 1 includes a camera 3, a detection unit 71a, and a determination unit 71c.
- the determination unit 71c determines the congestion degree of the vehicle 100 based on the number of detection areas DA overlapping the target area Tg. Therefore, the congestion degree determination system 1 can appropriately determine the congestion degree of vehicles that transport passengers.
- the vehicle congestion degree determination method and congestion degree determination system 1 can accurately determine the congestion degree of the vehicle 100 . For example, by detecting a person not only in the target region Tg but also in the entire images G1 and G2, detection omissions and erroneous detections can be suppressed.
- suppose, as a comparative example, that the angle of view of the camera 3 is set so as to capture only the target region Tg. In this case, when passengers are densely packed, multiple people overlap in the image, making it difficult to detect people hidden behind others.
- in the present embodiment, by contrast, the angle of view of the camera 3 is determined so as to image a range wider than the target region Tg. As a result, detection omissions and erroneous detections of people are less likely to occur.
- the camera 3 of this embodiment is arranged so as to capture an overhead view of the interior 101 of the vehicle. Therefore, even if passengers are crowded in the aisle 103, it is easy to detect each passenger.
- the angle of view of the camera 3 is determined so that not only passengers standing in the aisle 103 but also passengers sitting on the seats 108 and 109 can be imaged. Therefore, highly accurate congestion degree determination based on both the density and the total number of detections is possible.
- by providing users with information on the degree of congestion of the vehicles 100 in real time, passengers are less likely to concentrate on a specific vehicle 100, so the congestion of the vehicles 100 can be alleviated.
- as a result, crowding of passengers in the vehicle 100 can be suppressed, and viral infection inside the vehicle can be suppressed.
- the determination unit 71c may collectively set a plurality of areas as one target area Tg. For example, the determination unit 71c may combine the area A25 and the area A28 in the image G2 to form one target area Tg. In this case, when the number of pixels in the area where the target area Tg and the detection area DA overlap is equal to or greater than the threshold Th3, the count numbers for areas A25 and A28 are incremented.
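The merged-target-region variant above can be sketched as follows: overlaps with the combined areas (e.g. A25 and A28) are summed per detection area before being compared against a single threshold Th3. The value of Th3 used here is an assumed placeholder, since the patent text does not specify it.

```python
# Sketch of the merged target region Tg formed from several areas.
# Th3 = 2048 below is an illustrative assumption, not a value from the text.

def overlap_pixels(a, b) -> int:
    """Number of pixels shared by two axis-aligned rectangles (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    w = min(ax2, bx2) - max(ax1, bx1)
    h = min(ay2, by2) - max(ay1, by1)
    return w * h if w > 0 and h > 0 else 0

def count_merged(target_areas, boxes, th3=2048) -> int:
    """Count boxes whose summed overlap with all merged areas reaches Th3."""
    return sum(
        1 for box in boxes
        if sum(overlap_pixels(t, box) for t in target_areas) >= th3
    )
```

A detection area straddling the boundary between A25 and A28 is thus counted once against the combined region, rather than being evaluated separately against each area.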
- the congestion degree determination system 1 of this embodiment is easily adaptable to the vehicle in which it is mounted.
- the position and angle of view of the camera 3 may differ depending on the type of vehicle 100 and the operating company.
- the congestion degree determination system 1 of the present embodiment can easily adjust the density determination accuracy by selecting the optimum target region Tg from a plurality of areas of the images G1 and G2.
- the congestion degree determination system 1 of this embodiment is a system having high versatility.
- the number of areas in the images G1 and G2 is not limited to the illustrated numbers. Also, the shapes of the areas in the images G1 and G2 are not limited to the illustrated shapes. The images G1 and G2 may not be equally divided.
- the CPU 71 may determine the degree of congestion based on either one of the images G1 and G2. For example, the CPU 71 may determine any one of level I, level II, and level III based on one of the images G1 and G2.
- At least one operation among the operation of the detection unit 71a, the operation of the extraction unit 71b, and the operation of the determination unit 71c may be performed in the onboard device 2 or the server 6 on the cloud.
- the CPU 21 of the vehicle-mounted device 2 may have a detection section 71a, an extraction section 71b, and a determination section 71c.
- the CPU 21 may transmit information about the congestion degree level Lv to the Internet network NW.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Train Traffic Observation, Control, And Security (AREA)
- Image Processing (AREA)
Abstract
Description
[Embodiment]
An embodiment will be described with reference to FIGS. 1 to 5. The present embodiment relates to a vehicle congestion degree determination method and a vehicle congestion degree determination system. FIG. 1 is a block diagram of the vehicle congestion degree determination system according to the embodiment, FIG. 2 is a diagram showing an image captured by the first camera, FIG. 3 is a diagram showing an image captured by the second camera, FIG. 4 is a diagram explaining determination of the degree of overlap, and FIG. 5 is a flowchart showing the operation of the embodiment.
The level II condition is referred to as the first condition. The first condition is that the following formula (1) holds; that is, when "the total value of the count number C15 of the area A15 and the count number C28 of the area A28 is 2 or more", the degree of congestion is determined to be level II or higher. The level I condition is that "the total value of the count number C15 and the count number C28 is less than 2".
C15+C28≧2 (1)
C15≧1 (2)
C25≧2 (3)
C28≧2 (4)
CA≧11 (5)
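The inequalities above can be evaluated directly from the per-area counts. Note that only condition (1) and its level I / level II interpretation are spelled out in this excerpt, so the mapping of conditions (2) to (5) onto specific congestion levels is deliberately left out of this sketch; the function names and return values below are illustrative assumptions.

```python
def check_conditions(c15, c25, c28, ca):
    """Evaluate conditions (1)-(5) from the per-area count numbers
    C15, C25, C28 and the total detection count CA."""
    return {
        1: c15 + c28 >= 2,  # (1) the first (level II) condition
        2: c15 >= 1,        # (2)
        3: c25 >= 2,        # (3)
        4: c28 >= 2,        # (4)
        5: ca >= 11,        # (5)
    }

def level_from_first_condition(c15, c28):
    """Level I when C15 + C28 < 2; otherwise level II or higher,
    following the description of condition (1) above."""
    return "II or higher" if c15 + c28 >= 2 else "I"
```

For example, with one person counted in area A15 and one in area A28, condition (1) holds and the congestion degree is judged to be level II or higher.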
1: Congestion degree determination system
2: Onboard device
3: Camera
4: Door sensor
5: Wireless base station
6: Server
7: Office PC
21: CPU, 22: Memory, 23: GPS receiver, 24: Communication unit
31: First camera, 32: Second camera
71: CPU, 72: Memory, 73: Communication unit, 74: External input interface
100: Vehicle, 101: Vehicle interior, 102: Driver's seat, 103: Aisle
104: Exit door, 105: Exit, 106: Boarding door, 107: Boarding entrance
108: Rear seat, 109: Front seat
A11, A12, A13, A14, A15, A16, A17, A18, A19: Areas
A21, A22, A23, A24, A25, A26, A27, A28, A29: Areas
C0: Count number of target area, C15, C25, C28: Count numbers of areas
CA: Total number of detections
DA: Detection area
G1, G2: Images
P: Person
Claims (6)
- A vehicle congestion degree determination method comprising:
obtaining an image of the interior of a vehicle that transports passengers;
detecting persons from the image and generating, for each detected person, a rectangular detection area surrounding the person; and
determining the congestion degree of the vehicle based on a target area of the image and the detection areas,
wherein the target area is a partial area of the image in which a passenger standing in the aisle of the vehicle is imaged, and
in the determining step, the congestion degree of the vehicle is determined based on the number of the detection areas overlapping the target area.
- The vehicle congestion degree determination method according to claim 1, wherein the target area is an area in which a passenger standing in the vicinity of an entrance/exit of the vehicle is imaged.
- The vehicle congestion degree determination method according to claim 1 or 2, wherein the image has a first region in which seated passengers are imaged, and a detection area having its center coordinates in the first region is excluded from the count of the detection areas overlapping the target area.
- The vehicle congestion degree determination method according to any one of claims 1 to 3, wherein, in the determining step, the congestion degree of the vehicle is determined based on the number of the detection areas overlapping the target area and the total number of the persons detected from the image.
- The vehicle congestion degree determination method according to any one of claims 1 to 4, wherein, in the obtaining step, a first image of a passenger standing near the exit in the aisle and a second image of a passenger standing near the boarding entrance in the aisle are obtained, and in the determining step, the congestion degree of the vehicle is determined based on the first image and the second image.
- A vehicle congestion degree determination system comprising:
a camera that captures an image of the interior of a vehicle that transports passengers and outputs the image;
a detection unit that detects persons from the image and generates, for each detected person, a rectangular detection area surrounding the person; and
a determination unit that determines the congestion degree of the vehicle based on a target area of the image and the detection areas,
wherein the target area is a partial area of the image in which a passenger standing in the aisle of the vehicle is imaged, and
the determination unit determines the congestion degree of the vehicle based on the number of the detection areas overlapping the target area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MX2023010408A MX2023010408A (en) | 2021-04-06 | 2022-03-02 | Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle. |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021064498A JP7305698B2 (en) | 2021-04-06 | 2021-04-06 | Vehicle congestion determination method and vehicle congestion determination system |
JP2021-064498 | 2021-04-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022215394A1 true WO2022215394A1 (en) | 2022-10-13 |
Family
ID=83545842
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/008921 WO2022215394A1 (en) | 2021-04-06 | 2022-03-02 | Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP7305698B2 (en) |
MX (1) | MX2023010408A (en) |
TW (1) | TWI781069B (en) |
WO (1) | WO2022215394A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015220507A (en) * | 2014-05-14 | 2015-12-07 | 富士通株式会社 | Monitoring device, monitoring method and monitoring program |
CN111079696A (en) * | 2019-12-30 | 2020-04-28 | 深圳市昊岳电子有限公司 | Detection method based on vehicle monitoring personnel crowding degree |
JP2021003972A (en) * | 2019-06-26 | 2021-01-14 | 株式会社東芝 | Information processor, station management system, station management equipment and program |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010003110A (en) * | 2008-06-20 | 2010-01-07 | Panasonic Corp | On-vehicle moving image data recording device |
DE102009039162A1 (en) * | 2009-08-27 | 2011-03-17 | Knorr-Bremse Gmbh | Monitoring device and method for monitoring an entry or exit area from an access opening of a vehicle to a building part |
CN111079474A (en) * | 2018-10-19 | 2020-04-28 | 上海商汤智能科技有限公司 | Passenger state analysis method and device, vehicle, electronic device, and storage medium |
US20200273345A1 (en) * | 2019-02-26 | 2020-08-27 | Aptiv Technologies Limited | Transportation system and method |
- 2021-04-06 JP JP2021064498A patent/JP7305698B2/en active Active
- 2022-03-02 WO PCT/JP2022/008921 patent/WO2022215394A1/en active Application Filing
- 2022-03-02 MX MX2023010408A patent/MX2023010408A/en unknown
- 2022-03-24 TW TW111111019A patent/TWI781069B/en active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015220507A (en) * | 2014-05-14 | 2015-12-07 | 富士通株式会社 | Monitoring device, monitoring method and monitoring program |
JP2021003972A (en) * | 2019-06-26 | 2021-01-14 | 株式会社東芝 | Information processor, station management system, station management equipment and program |
CN111079696A (en) * | 2019-12-30 | 2020-04-28 | 深圳市昊岳电子有限公司 | Detection method based on vehicle monitoring personnel crowding degree |
Also Published As
Publication number | Publication date |
---|---|
JP2022160020A (en) | 2022-10-19 |
TW202241114A (en) | 2022-10-16 |
MX2023010408A (en) | 2023-09-18 |
JP7305698B2 (en) | 2023-07-10 |
TWI781069B (en) | 2022-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11747809B1 (en) | System and method for evaluating the perception system of an autonomous vehicle | |
CN111310994B (en) | Bus route prediction method and system based on data calibration | |
CN108898044B (en) | Loading rate obtaining method, device and system and storage medium | |
JP5988472B2 (en) | Monitoring system and congestion rate calculation method | |
WO2013088620A1 (en) | Electronic device | |
JP4845580B2 (en) | Train congestion notification system | |
US20130195364A1 (en) | Situation determining apparatus, situation determining method, situation determining program, abnormality determining apparatus, abnormality determining method, abnormality determining program, and congestion estimating apparatus | |
KR101159230B1 (en) | System and method for providing bus board information | |
Nakashima et al. | Passenger counter based on random forest regressor using drive recorder and sensors in buses | |
US11288886B2 (en) | People-gathering analysis device, movement destination prediction creation device, people-gathering analysis system, vehicle, and recording medium | |
US20190156672A1 (en) | Operation management system and operation management program | |
JP5971045B2 (en) | Surveillance system, imaging device, and data management device | |
CN108389392A (en) | A kind of traffic accident responsibility identification system based on machine learning | |
WO2022215394A1 (en) | Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle | |
CN110992678A (en) | Bus passenger flow statistical method based on big data face recognition | |
JP7186749B2 (en) | Management system, management method, management device, program and communication terminal | |
US11262205B2 (en) | Traffic control apparatus, traffic control system, traffic control method, and non-transitory computer recording medium | |
KR20150067018A (en) | Method and service server for providing crowd density information | |
CN110430120A (en) | A kind of Bus information reminding method and system based on SNS | |
RU121628U1 (en) | INTELLIGENT PASSENGER FLOW ANALYSIS SYSTEM USING TECHNICAL VISION | |
CN110188645A (en) | For the method for detecting human face of vehicle-mounted scene, device, vehicle and storage medium | |
JP7347787B2 (en) | Crowd situation management device, congestion situation management system, congestion situation management method and program | |
EP4068201A1 (en) | Generation device, data analysis system, generation method and generation program | |
JP2022128660A (en) | Congestion rate measuring system and program | |
JP7342054B2 (en) | Boarding and alighting person counting system, boarding and alighting person counting method, and boarding and alighting person counting program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22784379 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2023/010408 Country of ref document: MX |
WWE | Wipo information: entry into national phase |
Ref document number: 2301006449 Country of ref document: TH |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 22784379 Country of ref document: EP Kind code of ref document: A1 |