WO2022215394A1 - Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle - Google Patents

Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle

Info

Publication number
WO2022215394A1
WO2022215394A1 · PCT/JP2022/008921 · JP2022008921W
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
area
detection
congestion degree
Application number
PCT/JP2022/008921
Other languages
French (fr)
Japanese (ja)
Inventor
大輝 齊藤
隆司 塩田
正紀 矢農
Original Assignee
矢崎総業株式会社 (Yazaki Corporation)
Application filed by 矢崎総業株式会社 (Yazaki Corporation)
Priority to MX2023010408A
Publication of WO2022215394A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G — PHYSICS
    • G08 — SIGNALLING
    • G08G — TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 — Traffic control systems for road vehicles
    • G08G 1/01 — Detecting movement of traffic to be counted or controlled
    • G08G 1/04 — Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Definitions

  • The present invention relates to a vehicle congestion degree determination method and a vehicle congestion degree determination system.
  • Patent Literature 1 discloses a monitoring device for the number of passengers in a vehicle, which includes a first imaging device and a processing unit and is attached inside the vehicle to monitor the number of passengers in the vehicle.
  • The dynamic detection module of Patent Literature 1 determines that there is a person in a recognition block when the image variation value is greater than a threshold.
  • An object of the present invention is to provide a vehicle congestion level determination method and a vehicle congestion level determination system that can appropriately determine the congestion level of vehicles that transport passengers.
  • A vehicle congestion degree determination method according to the present invention includes the steps of: acquiring an image of the interior of a vehicle that transports passengers; detecting people from the image and generating, for each detected person, a rectangular detection area surrounding that person; and determining the congestion degree of the vehicle based on a target area of the image and the detection areas. The target area is a partial area of the image in which a passenger standing in the aisle of the vehicle is imaged. In the determining step, the congestion degree of the vehicle is determined based on the number of detection areas overlapping the target area.
  • A vehicle congestion degree determination method according to the present invention comprises the steps of: detecting people from an image; generating a rectangular detection area surrounding each detected person; and determining the degree of congestion of the vehicle based on a target area of the image and the detection areas.
  • The target area is the portion of the image in which a passenger standing in the aisle of the vehicle is imaged.
  • In the determining step, the degree of vehicle congestion is determined based on the number of detection areas overlapping the target area. The vehicle congestion degree determination method according to the present invention can therefore appropriately determine the congestion degree of vehicles that transport passengers.
  • FIG. 1 is a block diagram of a vehicle congestion degree determination system according to an embodiment.
  • FIG. 2 is a diagram showing an image captured by the first camera.
  • FIG. 3 is a diagram showing an image captured by the second camera.
  • FIG. 4 is a diagram for explaining determination of the degree of overlap.
  • FIG. 5 is a flow chart showing the operation of the embodiment.
  • As shown in FIG. 1, the vehicle congestion degree determination system 1 includes a vehicle-mounted device 2, a camera 3, and an office PC 7.
  • The vehicle-mounted device 2 and the camera 3 are mounted on a vehicle 100 that transports passengers.
  • The vehicle 100 of the present embodiment is a shared bus, for example, a route bus that travels on a predetermined route.
  • The vehicle 100 allows passengers to get on and off at each stop on the route.
  • The vehicle 100 is equipped with a door sensor 4 in addition to the vehicle-mounted device 2 and the camera 3.
  • The vehicle-mounted device 2 is, for example, a drive recorder or a digital tachograph that records the operation status of the vehicle 100.
  • The vehicle-mounted device 2 has a CPU 21, a memory 22, a GPS receiver 23, and a communication unit 24.
  • The CPU 21 is a computing device that performs various computations.
  • The CPU 21 executes the operations of this embodiment according to, for example, a program stored in the memory 22.
  • The memory 22 is a storage unit that includes volatile memory and nonvolatile memory.
  • The GPS receiver 23 calculates the current position of the vehicle 100 based on signals transmitted from satellites.
  • The communication unit 24 is a communication module that communicates with the radio base station 5. The communication unit 24 performs radio communication with the radio base station 5 via the antenna of the vehicle 100 in accordance with instructions from the CPU 21.
  • The camera 3 is an imaging device that captures the interior of the vehicle 100 and outputs images G1 and G2.
  • The position and the angle of view of the camera 3 are set so that passengers in the vehicle can be imaged.
  • The vehicle 100 of this embodiment has a first camera 31 and a second camera 32 as the cameras 3.
  • The first camera 31 captures the vicinity of the exit inside the vehicle.
  • The second camera 32 captures the vicinity of the boarding gate inside the vehicle.
  • The vehicle-mounted device 2 acquires the images captured by the cameras 3 and stores them in the memory 22.
  • The door sensor 4 is a sensor that detects the state of the doors that open and close the entrances/exits of the vehicle 100.
  • The door sensor 4 detects, for example, whether a door is fully closed.
  • A door sensor 4 is arranged, for example, at each of the boarding door and the exit door. The detection results of the door sensors 4 are sent to the CPU 21.
  • The vehicle-mounted device 2 transmits data for determining the degree of congestion to the outside via the communication unit 24.
  • The transmitted data includes, for example, the image captured by the camera 3, the capture time of the image, the position of the vehicle 100 at the capture time, and the identification code of the vehicle 100.
  • The image data transmitted by the communication unit 24 is, for example, still image data.
  • The transmitted data is stored, for example, in a server 6 connected to the Internet network NW.
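  • As a rough illustration, one transmitted record might be structured as in the following sketch. This is a minimal sketch only: the field names, the ISO 8601 timestamp, and the JSON/Base64 encoding are assumptions for illustration, not the actual format used by the vehicle-mounted device 2.

```python
import base64
import json
from dataclasses import dataclass, asdict

@dataclass
class CongestionRecord:
    """One still image plus the metadata sent with it for congestion determination."""
    vehicle_id: str    # identification code of the vehicle 100
    captured_at: str   # capture time of the image (ISO 8601 assumed here)
    latitude: float    # position of the vehicle 100 at the capture time (GPS receiver 23)
    longitude: float
    camera: str        # "G1" (first camera 31) or "G2" (second camera 32)
    jpeg_bytes: bytes  # still image data

    def to_json(self) -> str:
        body = asdict(self)
        body["jpeg_bytes"] = base64.b64encode(self.jpeg_bytes).decode("ascii")
        return json.dumps(body)

record = CongestionRecord("bus-0042", "2022-03-01T08:15:30+09:00",
                          35.6586, 139.7454, "G1", b"\xff\xd8\xff\xe0")
print(record.to_json())
```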
  • The office PC 7 is, for example, a general-purpose computer installed in an office of a company or the like.
  • The office PC 7 has a function of managing the operation status of the vehicles 100 to be managed and a function of determining the degree of congestion of the vehicles 100.
  • The office PC 7 has a CPU 71, a memory 72, a communication unit 73, and an external input interface 74.
  • The office PC 7 communicates with the server 6 and the wireless base station 5 via the communication unit 73 and the Internet network NW.
  • The office PC 7 manages the operation status and determines the degree of congestion according to, for example, a program read from the memory 72 into the CPU 71.
  • An example of the image G1 captured by the first camera 31 is shown in FIG. 2.
  • The interior 101 of the vehicle 100 is imaged in the image G1. More specifically, the image G1 includes a driver's seat 102, an aisle 103, and an exit 105.
  • The exit 105 is arranged in the front part of the vehicle 100 and is opened and closed by the exit door 104.
  • An example of the image G2 captured by the second camera 32 is shown in FIG. 3.
  • The interior 101 of the vehicle 100 is imaged in the image G2. More specifically, the image G2 includes the aisle 103, a boarding gate 107, a rear seat 108, and a front seat 109.
  • The boarding gate 107 is arranged in the middle part of the vehicle 100 and is opened and closed by the boarding door 106.
  • The rear seat 108 is a seat arranged behind the boarding gate 107 in the vehicle.
  • The rear seat 108 faces the front of the vehicle.
  • The front seat 109 is a seat arranged forward of the boarding gate 107 in the vehicle.
  • The front seat 109 faces the vehicle width direction.
  • The vehicle-mounted device 2 transmits images G1 and G2 captured after passengers have finished boarding and alighting. For example, the vehicle-mounted device 2 transmits to the Internet network NW the images G1 and G2 captured when both the exit door 104 and the boarding door 106 are fully closed. Images G1 and G2 captured when passenger movement is small make it possible to accurately estimate the degree of congestion in the vehicle interior 101.
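  • A minimal sketch of this capture gating, assuming each door sensor 4 provides a boolean fully-closed signal (the function and parameter names are illustrative):

```python
def should_capture(exit_door_closed: bool, boarding_door_closed: bool) -> bool:
    # Transmit images only after boarding/alighting is finished, i.e. when both
    # the exit door 104 and the boarding door 106 are fully closed.
    return exit_door_closed and boarding_door_closed

assert should_capture(True, True) is True
assert should_capture(True, False) is False  # boarding door still open: wait
```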
  • The office PC 7 acquires the data for determining the degree of congestion of the vehicles 100 from the server 6. The office PC 7 acquires the data via the communication unit 73 and stores it in the memory 72.
  • The CPU 71 has a detection unit 71a, an extraction unit 71b, and a determination unit 71c.
  • The detection unit 71a, the extraction unit 71b, and the determination unit 71c may be part of a program executed by the CPU 71, part of a circuit of the CPU 71, or a chip or the like mounted on the CPU 71. That is, the CPU 71 only needs to be configured to be able to execute the function of the detection unit 71a, the function of the extraction unit 71b, and the function of the determination unit 71c.
  • The vehicle congestion degree determination system 1 determines the congestion degree of the vehicle 100 based on two indices.
  • The first index is the passenger density.
  • The CPU 71 detects people from the images G1 and G2 and calculates the density of passengers in the aisle 103.
  • The second index is the total number of detected passengers.
  • The CPU 71 calculates the total number of passengers detected from the image G2. That is, the density is calculated from both images G1 and G2, while the total detection number is calculated from the image G2 alone.
  • The vehicle congestion degree determination system 1 according to the present embodiment can accurately determine the congestion degree of the vehicle 100 based on the passenger density and the total number of detected passengers.
  • The detection unit 71a detects people from the images G1 and G2 and generates a detection area DA for each detected person, as shown in FIG. 2.
  • The detection area DA is a so-called bounding box. In FIG. 2, three persons P0, P1, and P2 are detected.
  • The detection unit 71a generates a detection area DA0 surrounding the person P0, a detection area DA1 surrounding the person P1, and a detection area DA2 surrounding the person P2.
  • The detection unit 71a detects all persons, whether standing or sitting, and generates a detection area DA for every detected person.
  • The detection unit 71a likewise generates detection areas DA3, DA4, and DA5 for the persons P3, P4, and P5 detected from the image G2, as shown in FIG. 3.
  • Note that a detection area DA is the area in which a person is recognized in the image and may not surround the entire person.
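  • A detection area is an axis-aligned rectangle in pixel coordinates. The sketch below shows one way to represent it; the class and the example coordinates are assumptions for illustration, and the patent does not specify a particular detection algorithm for producing such boxes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DetectionArea:
    """Rectangular detection area DA (bounding box) around one detected person."""
    x0: int  # left edge, in pixels
    y0: int  # top edge, in pixels
    x1: int  # right edge (exclusive)
    y1: int  # bottom edge (exclusive)

    @property
    def center(self) -> tuple[float, float]:
        # Center coordinates, used below to decide which grid area a person falls in.
        return ((self.x0 + self.x1) / 2, (self.y0 + self.y1) / 2)

# Hypothetical boxes for the three persons of FIG. 2 (coordinates invented):
da0 = DetectionArea(20, 180, 200, 460)    # person P0 (the driver)
da1 = DetectionArea(300, 170, 430, 470)   # person P1
da2 = DetectionArea(470, 160, 590, 450)   # person P2
print(da1.center)  # (365.0, 320.0)
```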
  • The extraction unit 71b extracts, from all the generated detection areas DA, the detection areas DA subject to the density calculation and the detection areas DA subject to the total-detection-count calculation.
  • The driver of the vehicle 100 is excluded from both the density calculation and the total-detection-count calculation.
  • The person P0 detected in FIG. 2 is the driver of the vehicle 100.
  • The extraction unit 71b recognizes the driver as described below and excludes the driver from the calculation targets.
  • As shown in FIG. 2, the image G1 is equally divided into nine rectangular areas A11 to A19. Each area is 240 pixels wide and 160 pixels high, that is, 38,400 pixels.
  • In the image G1, areas A11, A14, and A17 are set as the areas in which the driver is imaged.
  • When the center coordinates of a detection area DA are located in any of the areas A11, A14, and A17, the extraction unit 71b determines that the detection area DA corresponds to the driver.
  • In FIG. 2, the center of the detection area DA0 is located in area A14. That is, the person P0 corresponding to the detection area DA0 can be determined to be the driver. Therefore, the extraction unit 71b excludes the detection area DA0 from the calculation of the density and the total detection count.
  • The centers of the detection areas DA1 and DA2 are not located in any of the areas A11, A14, and A17.
  • The extraction unit 71b therefore adopts the detection areas DA1 and DA2 as targets of the density calculation. Since the detection areas DA1 and DA2 belong to the image G1, they are not subject to the total-detection-count calculation.
  • As shown in FIG. 3, the image G2 is likewise equally divided into nine rectangular areas A21 to A29, each 240 pixels wide and 160 pixels high (38,400 pixels).
  • The image G2 does not include an area in which the driver is imaged. Therefore, for the image G2, the extraction unit 71b does not determine whether a detection area DA corresponds to the driver.
  • In the image G2, areas A23, A26, and A29 are the areas in which seated passengers are imaged. In the following description, an area in which a seated passenger is imaged is referred to as a "first area".
  • When the center coordinates of a detection area DA are located in a first area, the extraction unit 71b determines that the detection area DA corresponds to a seated passenger.
  • In FIG. 3, the center of the detection area DA5 is located in area A29. That is, the detected person P5 can be determined to be a seated passenger.
  • Seated passengers have little bearing on the density of the aisle 103, so the extraction unit 71b excludes the detection area DA5 from the density calculation. On the other hand, the extraction unit 71b adopts the detection area DA5 as a target of the total-detection-count calculation.
  • The centers of the detection areas DA3 and DA4 are not located in any of the areas A23, A26, and A29. Therefore, the extraction unit 71b adopts the detection areas DA3 and DA4 as targets of both the density and total-detection-count calculations.
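  • Combining the 3x3 grid with the center test gives the extraction step a compact form. The following sketch reuses the DetectionArea class from the earlier sketch and assumes 720x480 images divided row by row into areas A11 to A19 (or A21 to A29); this row-major numbering, under which A11/A14/A17 and A23/A26/A29 form vertical strips, is an assumption for illustration.

```python
AREA_W, AREA_H = 240, 160                  # each area is 240 x 160 = 38,400 pixels
DRIVER_AREAS_G1 = {"A11", "A14", "A17"}    # areas of image G1 where the driver is imaged
SEAT_AREAS_G2 = {"A23", "A26", "A29"}      # "first areas" of image G2 (seated passengers)

def area_of(center: tuple[float, float], prefix: str) -> str:
    # Map center coordinates to an area name such as "A14" or "A29" (row-major grid).
    col = min(int(center[0] // AREA_W), 2)
    row = min(int(center[1] // AREA_H), 2)
    return f"{prefix}{row * 3 + col + 1}"

def extract(boxes: list, image: str) -> tuple[list, list]:
    """Return (density targets, total-count targets) for one image, 'G1' or 'G2'."""
    density, total = [], []
    prefix = "A1" if image == "G1" else "A2"
    for da in boxes:
        area = area_of(da.center, prefix)
        if image == "G1":
            if area in DRIVER_AREAS_G1:
                continue                    # driver: excluded from both calculations
            density.append(da)              # G1 boxes never count toward the total CA
        else:
            total.append(da)                # every G2 box counts toward the total CA
            if area not in SEAT_AREAS_G2:
                density.append(da)          # seated passengers excluded from density
    return density, total
```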
  • The determination unit 71c calculates the density and the total detection count and determines the degree of congestion of the vehicle 100 based on the calculation results.
  • The density of this embodiment is calculated as follows. In the image G1 shown in FIG. 2, the area for which the density is calculated is area A15, the area in which a passenger standing at the exit-105-side end of the aisle 103 is imaged.
  • The determination unit 71c sets area A15 as the target area Tg.
  • The determination unit 71c determines, for each detection area DA, whether the number of pixels overlapping the target area Tg is equal to or greater than a threshold Th1. For example, the determination unit 71c detects the region Ov1 in which the target area Tg and the detection area DA1 overlap, as shown in FIG. 4.
  • The determination unit 71c increments the count number C0 for the target area Tg when the number of pixels in the region Ov1 is equal to or greater than the threshold Th1.
  • The initial value of the count number C0 is zero.
  • The threshold Th1 is, for example, 1,024 pixels. In other words, when the region Ov1 contains 1,024 pixels or more, the detection area DA1 is counted as one element of the crowd.
  • The determination unit 71c similarly detects, for each of the other detection areas DA, the region overlapping the target area Tg, and increments the count number C0 for the target area Tg if the number of pixels in the overlapping region is equal to or greater than the threshold Th1. The determination unit 71c determines the degree of overlap with the target area Tg for all detection areas DA subject to the density calculation, stores the resulting count number C0 as the count number C15 for area A15, and then initializes the count number C0.
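  • The overlap test is plain axis-aligned rectangle intersection measured in pixels. A sketch, continuing the example above (a target area Tg is just a rectangle, so the same DetectionArea class is reused; the A15 coordinates follow the row-major grid assumed earlier):

```python
def overlap_pixels(tg: DetectionArea, da: DetectionArea) -> int:
    # Number of pixels in the intersection of target area Tg and detection area DA.
    w = min(tg.x1, da.x1) - max(tg.x0, da.x0)
    h = min(tg.y1, da.y1) - max(tg.y0, da.y0)
    return w * h if w > 0 and h > 0 else 0

def count_for_target(tg: DetectionArea, boxes: list, threshold: int) -> int:
    # Count C0: how many density-target boxes overlap Tg by at least `threshold` pixels.
    return sum(1 for da in boxes if overlap_pixels(tg, da) >= threshold)

tg_a15 = DetectionArea(240, 160, 480, 320)                   # area A15: center cell of G1
c15 = count_for_target(tg_a15, [da1, da2], threshold=1024)   # Th1 = 1,024 pixels
```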
  • Further, the determination unit 71c determines the degree of overlap for the image G2.
  • In the image G2 shown in FIG. 3, the target areas Tg are areas A25 and A28.
  • Areas A25 and A28 are the areas in which passengers standing near the boarding gate 107 in the aisle 103 are imaged.
  • The determination unit 71c first sets area A25 as the target area Tg.
  • The determination unit 71c determines, for each detection area DA of the image G2, whether the number of pixels overlapping the target area Tg is equal to or greater than a threshold Th2.
  • The threshold Th2 is, for example, 3,072 pixels.
  • The determination unit 71c increments the count number C0 for the target area Tg when the number of pixels in the region where the target area Tg and a detection area DA overlap is equal to or greater than the threshold Th2.
  • The determination unit 71c determines the degree of overlap with the target area Tg for all detection areas DA subject to the density calculation.
  • The determination unit 71c stores the resulting count number C0 as the count number C25 for area A25. After that, the determination unit 71c initializes the count number C0.
  • The determination unit 71c then sets area A28 as the target area Tg and calculates the count number C0 in the same way.
  • The determination unit 71c stores the resulting count number C0 as the count number C28 for area A28.
  • The determination unit 71c also calculates the total number of detected passengers for the image G2.
  • The total detection number CA is the number of detection areas DA generated for the image G2. In the image G2 illustrated in FIG. 3, the detection areas DA3, DA4, and DA5 are the targets of the total-detection-count calculation; that is, the total detection number CA is 3.
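  • Continuing the sketches above, the four quantities the level decision needs (C15 from image G1; C25, C28, and CA from image G2) could be computed as follows. The A25/A28 coordinates again follow the assumed row-major grid, and Th2 = 3,072 pixels follows the example in the text.

```python
def congestion_indices(boxes_g1: list, boxes_g2: list) -> tuple[int, int, int, int]:
    density_g1, _ = extract(boxes_g1, "G1")
    density_g2, total_g2 = extract(boxes_g2, "G2")
    tg_a15 = DetectionArea(240, 160, 480, 320)   # center cell of image G1
    tg_a25 = DetectionArea(240, 160, 480, 320)   # center cell of image G2
    tg_a28 = DetectionArea(240, 320, 480, 480)   # bottom-center cell of image G2
    c15 = count_for_target(tg_a15, density_g1, threshold=1024)   # Th1
    c25 = count_for_target(tg_a25, density_g2, threshold=3072)   # Th2
    c28 = count_for_target(tg_a28, density_g2, threshold=3072)   # Th2
    ca = len(total_g2)                           # total detection number CA
    return c15, c25, c28, ca
```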
  • The CPU 71 determines the degree of congestion according to the flowchart shown in FIG. 5.
  • The CPU 71 executes the flowchart of FIG. 5 after executing the step of acquiring the images G1 and G2.
  • In step S10, the detection unit 71a performs object detection on the images G1 and G2 and computes the detection areas DA.
  • Step S10 is the step of detecting people from the images G1 and G2 and generating a rectangular detection area DA surrounding each detected person. After step S10 is executed, the process proceeds to step S20.
  • In step S20, the extraction unit 71b extracts the detection areas DA subject to the density calculation and the detection areas DA subject to the total-detection-count calculation. After step S20 is executed, the process proceeds to step S30.
  • In step S30, the determination unit 71c calculates the overlap between the detection areas DA and the target areas Tg.
  • The determination unit 71c calculates the count numbers C15, C25, and C28 for areas A15, A25, and A28, respectively. After step S30 is executed, the process proceeds to step S40.
  • In step S40, the determination unit 71c determines whether the congestion index satisfies the level II condition.
  • The vehicle congestion degree determination system 1 classifies the degree of congestion into three levels: level I, level II, and level III.
  • Level I is the least congested level, and level III is the most congested level.
  • Level I is, for example, a degree of congestion with two or fewer standing passengers. Strollers, passengers about to sit down, and bus company employees are not counted as "standing passengers".
  • Level II is, for example, a degree of congestion in which three or more passengers are standing and the central portion of the aisle 103 is unused.
  • Level III is, for example, a degree of congestion in which passengers are standing in the central portion of the aisle 103.
  • The central portion of the aisle 103 is, for example, the intermediate portion of the aisle 103 between the boarding door 106 and the exit door 104.
  • The level II condition is referred to as the first condition.
  • The first condition is that the following formula (1) holds. That is, when the total of the count number C15 of area A15 and the count number C28 of area A28 is 2 or more, the degree of congestion is determined to be level II or higher: C15 + C28 ≥ 2 ... (1)
  • The level I condition is that the total of the count numbers C15 and C28 is less than 2. If the first condition is satisfied, the determination unit 71c makes an affirmative determination in step S40 and proceeds to step S50. If the determination unit 71c makes a negative determination in step S40, the process proceeds to step S80.
  • In step S50, the determination unit 71c determines whether the congestion index satisfies the level III condition.
  • The level III condition is that the second condition is satisfied in addition to the first condition.
  • The second condition is that the following formulas (2) to (5) all hold: C15 ≥ 1 ... (2), C25 ≥ 2 ... (3), C28 ≥ 2 ... (4), CA ≥ 11 ... (5). That is, when the first condition is satisfied and, in addition, the count number C15 of area A15 is 1 or more, the count number C25 of area A25 and the count number C28 of area A28 are both 2 or more, and the total detection number CA is 11 or more, the degree of congestion is determined to be level III.
  • When the second condition is satisfied, the determination unit 71c makes an affirmative determination in step S50 and proceeds to step S60. If the determination unit 71c makes a negative determination in step S50, the process proceeds to step S70.
  • In step S60, the determination unit 71c sets the congestion degree level Lv to level III.
  • In step S70, the determination unit 71c sets the congestion degree level Lv to level II.
  • In step S80, the determination unit 71c sets the congestion degree level Lv to level I.
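  • The decision in steps S40 to S80 then reduces to checking formulas (1) to (5). A direct transcription as a sketch:

```python
def congestion_level(c15: int, c25: int, c28: int, ca: int) -> str:
    first = c15 + c28 >= 2                                     # formula (1)
    second = c15 >= 1 and c25 >= 2 and c28 >= 2 and ca >= 11   # formulas (2)-(5)
    if first and second:
        return "III"   # step S60: passengers standing up to the center of the aisle
    if first:
        return "II"    # step S70: three or more standing, aisle center unused
    return "I"         # step S80: two or fewer standing passengers

assert congestion_level(0, 0, 1, 3) == "I"
assert congestion_level(1, 1, 1, 5) == "II"
assert congestion_level(2, 3, 3, 12) == "III"
```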
  • The CPU 71 may transmit information about the congestion degree level Lv to the outside.
  • The congestion degree level Lv may, for example, be wirelessly transmitted to a stop.
  • The congestion degree level Lv of the vehicle 100 about to arrive is then displayed on a display at the stop.
  • The congestion degree level Lv may also be transmitted, for example, to a user's mobile terminal.
  • The congestion degree level Lv of the vehicle 100 may be displayed by application software on the user's mobile terminal.
  • The congestion degree level Lv may also be transmitted, for example, to other route buses running on the same route as the vehicle 100.
  • The congestion degree level Lv can also be used for managing the operation schedule of the route buses, including the vehicle 100.
  • The vehicle congestion degree determination method according to the present embodiment includes an acquisition step, a generation step, and a determination step.
  • In the acquisition step, images G1 and G2 of the interior 101 of the vehicle 100 that transports passengers are acquired.
  • In the generation step, people are detected from the images G1 and G2, and a rectangular detection area DA surrounding each detected person is generated. In the flowchart of FIG. 5, step S10 corresponds to the generation step.
  • In the determination step, the congestion degree of the vehicle 100 is determined based on the target areas Tg of the images G1 and G2 and the detection areas DA.
  • Steps S30 to S80 correspond to the determination step.
  • The target area Tg is a partial area of the images G1 and G2 in which a passenger standing in the aisle 103 of the vehicle 100 is imaged.
  • In the determination step, the degree of congestion of the vehicle 100 is determined based on the number of detection areas DA overlapping the target area Tg.
  • The number of detection areas DA overlapping the target area Tg represents the density of passengers standing in the aisle 103. Therefore, the vehicle congestion degree determination method according to the present embodiment can appropriately determine the congestion degree of vehicles that transport passengers. Further, detecting people in the entire wide images G1 and G2 achieves better detection accuracy than detecting people only within the target area Tg.
  • The above vehicle congestion degree determination method is useful, for example, from the perspective of measures against infectious diseases such as the recent novel coronavirus (COVID-19). For example, it enables users to grasp the congestion degree of the vehicle 100 in a timely and accurate manner and encourages them to use vehicles 100 that are not crowded. Further, in the vehicle congestion degree determination method according to the present embodiment, the congestion degree is determined based on still images, so the increase in the load on each unit involved in the determination can be suppressed. For example, the amount of communication when transmitting an image captured by the camera 3 to the Internet network NW is small, so the communication load is reduced. Moreover, when the congestion degree determination is performed in the vehicle-mounted device 2, the computational load on the vehicle-mounted device 2 is reduced.
  • The target area Tg of the present embodiment is an area in which a passenger standing near an entrance/exit of the vehicle 100 is imaged.
  • The vehicle congestion degree determination method according to the present embodiment can therefore appropriately determine the congestion degree of the vehicle 100.
  • The image G2 of this embodiment has first areas in which seated passengers are imaged.
  • A detection area DA whose center coordinates lie in a first area is excluded from the calculation of the count number C0 of detection areas overlapping the target area Tg. Therefore, the vehicle congestion degree determination method according to the present embodiment can accurately calculate the congestion degree of the aisle 103 by excluding seated passengers.
  • In the present embodiment, the degree of congestion of the vehicle 100 is determined based on both the count numbers and the total detection number CA. Therefore, it is possible to appropriately determine the degree of congestion of the vehicle 100 based on the density in the aisle 103 and the total detection number CA.
  • In the present embodiment, a first image and a second image are acquired in the acquisition step.
  • The first image corresponds to the image G1, in which a passenger standing near the exit 105 in the aisle 103 is imaged.
  • The second image corresponds to the image G2, in which a passenger standing near the boarding gate 107 in the aisle 103 is imaged.
  • In the determination step, the degree of congestion of the vehicle 100 is determined based on the first image and the second image. Based on the two images, the congestion degree of the vehicle 100 can be determined with higher accuracy.
  • The vehicle congestion degree determination system 1 according to the present embodiment includes the camera 3, the detection unit 71a, and the determination unit 71c.
  • The determination unit 71c determines the congestion degree of the vehicle 100 based on the number of detection areas DA overlapping the target area Tg. Therefore, the vehicle congestion degree determination system 1 can appropriately determine the congestion degree of vehicles that transport passengers.
  • The vehicle congestion degree determination method and the congestion degree determination system 1 described above can accurately determine the congestion degree of the vehicle 100. For example, detecting people not only in the target area Tg but in the entire images G1 and G2 suppresses detection omissions and erroneous detections.
  • Suppose, by contrast, that the angle of view of the camera 3 were set so as to capture only the target area Tg. In that case, when passengers are densely packed, multiple people overlap in the image, making it difficult to detect people hidden behind others.
  • In the present embodiment, the angle of view of the camera 3 is set so as to image a range wider than the target area Tg. As a result, detection omissions and erroneous detections of people are less likely to occur.
  • The camera 3 of this embodiment is arranged so as to capture an overhead view of the vehicle interior 101. Therefore, even if passengers are crowded in the aisle 103, each passenger is easy to detect.
  • The angle of view of the camera 3 is set so that not only passengers standing in the aisle 103 but also passengers sitting on the seats 108 and 109 can be imaged. Therefore, a highly accurate congestion degree determination based on both the density and the total detection count is possible.
  • The congestion degree determination described above can also help alleviate congestion of the vehicle 100.
  • Providing users with information on the degree of congestion of the vehicles 100 in real time makes it less likely that passengers concentrate in a specific vehicle 100.
  • As a result, crowding of passengers in the vehicle 100 can be suppressed, and virus transmission inside the vehicle can be suppressed.
  • The determination unit 71c may collectively set a plurality of areas as one target area Tg. For example, the determination unit 71c may combine areas A25 and A28 of the image G2 into one target area Tg. In this case, when the number of pixels in the region where the combined target area Tg and a detection area DA overlap is equal to or greater than a threshold Th3, the count numbers for areas A25 and A28 are incremented.
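  • A sketch of this combined-area variant, continuing the code above (the value of Th3 is not given in the text, so it is left as a parameter):

```python
def combined_count(boxes: list, th3: int) -> int:
    # Areas A25 and A28 merged into one target area Tg (mid- and bottom-center of G2).
    tg = DetectionArea(240, 160, 480, 480)
    return count_for_target(tg, boxes, threshold=th3)
```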
  • The congestion degree determination system 1 of this embodiment can easily be adapted to the vehicle on which it is mounted.
  • The position and the angle of view of the camera 3 may differ depending on the type of the vehicle 100 and the operating company.
  • The congestion degree determination system 1 of the present embodiment can easily adjust the accuracy of the density determination by selecting the optimum target area Tg from the plurality of areas of the images G1 and G2.
  • The congestion degree determination system 1 of this embodiment is thus a highly versatile system.
  • The number of areas in the images G1 and G2 is not limited to the nine illustrated, the shapes of the areas are not limited to the illustrated shapes, and the images G1 and G2 need not be divided equally.
  • The CPU 71 may determine the degree of congestion based on either one of the images G1 and G2. For example, the CPU 71 may determine one of level I, level II, and level III based on one of the images G1 and G2.
  • At least one of the operations of the detection unit 71a, the extraction unit 71b, and the determination unit 71c may be performed in the vehicle-mounted device 2 or in the server 6 on the cloud.
  • For example, the CPU 21 of the vehicle-mounted device 2 may have the detection unit 71a, the extraction unit 71b, and the determination unit 71c.
  • In that case, the CPU 21 may transmit information about the congestion degree level Lv to the Internet network NW.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Image Processing (AREA)

Abstract

This method for determining the degree of crowding in a vehicle comprises: a step for acquiring an image obtained by capturing the interior 101 of a vehicle transporting passengers; a step for detecting people P3, P4, P5 from the image and generating, for the detected people, rectangular detection regions DA3, DA4, DA5 that surround the people, respectively; and a step for determining the degree of crowding in the vehicle on the basis of a subject region Tg of the image and the detection regions DA3, DA4, DA5. The subject region is a partial region of the image, and a region in which passengers standing in an aisle 103 of the vehicle are imaged. In the determining step, the degree of crowding in the vehicle is determined on the basis of the number of detection regions that overlap with the subject region.

Description

Vehicle congestion degree determination method and vehicle congestion degree determination system

The present invention relates to a vehicle congestion degree determination method and a vehicle congestion degree determination system.

Conventionally, there are devices that monitor the number of passengers in a vehicle. Patent Literature 1 (JP 2015-7953 A) discloses a monitoring device for the number of passengers in a vehicle, which includes a first imaging device and a processing unit and is attached inside the vehicle body to monitor the number of passengers in the vehicle. The dynamic detection module of Patent Literature 1 determines that there is a person in a recognition block when the image variation value is greater than a threshold.

From the perspective of measures against infectious diseases such as the recent novel coronavirus (COVID-19), there is a demand for technology that can appropriately determine the degree of congestion of vehicles that transport passengers. For example, even when a plurality of passengers are standing in an aisle, it is preferable that the degree of congestion can be appropriately determined based on an image of the vehicle interior.

An object of the present invention is to provide a vehicle congestion degree determination method and a vehicle congestion degree determination system that can appropriately determine the degree of congestion of vehicles that transport passengers.

[Embodiment]
An embodiment is described in detail below with reference to FIGS. 1 to 5. The present embodiment relates to a vehicle congestion degree determination method and a vehicle congestion degree determination system. The invention is not limited by this embodiment, and the components of the embodiment include those that a person skilled in the art could easily conceive of and those that are substantially the same.
 図1に示すように、車両の混雑度判定システム1は、車載器2、カメラ3、および事務所PC7を含む。車載器2およびカメラ3は、旅客を輸送する車両100に搭載されている。本実施形態の車両100は、乗合いバスであり、例えば、予め定められた路線を走行する路線バスである。車両100は、路線上の各停留所において旅客を乗降させる。車両100には、車載器2およびカメラ3に加えて、ドアセンサ4が搭載されている。 As shown in FIG. 1, the vehicle congestion degree determination system 1 includes an on-vehicle device 2, a camera 3, and an office PC 7. The vehicle-mounted device 2 and the camera 3 are mounted on a vehicle 100 that transports passengers. Vehicle 100 of the present embodiment is a shared bus, for example, a route bus that travels on a predetermined route. The vehicle 100 allows passengers to get on and off at each stop on the route. The vehicle 100 is equipped with a door sensor 4 in addition to the vehicle-mounted device 2 and the camera 3 .
 車載器2は、例えば、車両100の運行状況を記録するドライブレコーダやデジタルタコグラフである。車載器2は、CPU21、メモリ22、GPS受信部23、および通信部24を有する。CPU21は、各種の演算を行なう演算装置である。CPU21は、例えば、メモリ22に記憶されているプログラムに従って本実施形態の動作を実行する。 The vehicle-mounted device 2 is, for example, a drive recorder or digital tachograph that records the operation status of the vehicle 100. The vehicle-mounted device 2 has a CPU 21 , a memory 22 , a GPS receiver 23 and a communication unit 24 . The CPU 21 is a computing device that performs various computations. The CPU 21 executes the operations of this embodiment according to a program stored in the memory 22, for example.
 メモリ22は、揮発性メモリおよび不揮発性メモリを含む記憶部である。GPS受信部23は、衛星から送信される信号に基づいて車両100の現在位置を算出する。通信部24は、無線基地局5との間で通信を行なう通信モジュールである。通信部24は、CPU21の指令に従い、車両100のアンテナを介して無線基地局5と無線通信を行なう。 The memory 22 is a storage unit that includes volatile memory and nonvolatile memory. The GPS receiver 23 calculates the current position of the vehicle 100 based on signals transmitted from satellites. The communication unit 24 is a communication module that communicates with the radio base station 5 . Communication unit 24 performs radio communication with radio base station 5 via the antenna of vehicle 100 in accordance with instructions from CPU 21 .
 カメラ3は、車両100の車内を撮像して画像G1,G2を出力する撮像装置である。カメラ3の位置およびカメラ3の画角は、車内の旅客を撮像できるように設定されている。本実施形態の車両100は、カメラ3として、第一カメラ31および第二カメラ32を有する。第一カメラ31は、車内における降車口の近傍を撮像する。第二カメラ32は、車内における乗車口の近傍を撮像する。車載器2は、カメラ3によって撮像された画像を取得してメモリ22に記憶する。 The camera 3 is an imaging device that captures the interior of the vehicle 100 and outputs images G1 and G2. The position of the camera 3 and the angle of view of the camera 3 are set so that the passenger in the vehicle can be imaged. The vehicle 100 of this embodiment has a first camera 31 and a second camera 32 as the cameras 3 . The first camera 31 captures an image of the vicinity of the exit in the vehicle. The second camera 32 images the vicinity of the boarding gate inside the vehicle. The vehicle-mounted device 2 acquires the image captured by the camera 3 and stores it in the memory 22 .
 ドアセンサ4は、車両100の乗降口を開閉するドアの状態を検出するセンサである。ドアセンサ4は、例えば、ドアが全閉状態であるか否かを検出する。ドアセンサ4は、例えば、乗車口ドアおよび降車口ドアのそれぞれに配置されている。ドアセンサ4の検出結果は、CPU21に送られる。 The door sensor 4 is a sensor that detects the state of the door that opens and closes the entrance/exit of the vehicle 100 . The door sensor 4 detects, for example, whether the door is fully closed. The door sensor 4 is arranged, for example, at each of the entrance door and the exit door. A detection result of the door sensor 4 is sent to the CPU 21 .
 車載器2は、通信部24を介して混雑度判定のためのデータを外部に送信する。送信されるデータは、例えば、カメラ3によって撮像された画像、その画像の撮像時刻、撮像時刻における車両100の位置、および車両100の識別コード等を含む。通信部24が送信する画像データは、例えば、静止画のデータである。送信されたデータは、例えば、インターネット網NWに接続されたサーバ6に保存される。 The vehicle-mounted device 2 transmits data for determining the degree of congestion to the outside via the communication unit 24 . The data to be transmitted includes, for example, the image captured by the camera 3, the image capturing time of the image, the position of the vehicle 100 at the image capturing time, the identification code of the vehicle 100, and the like. The image data transmitted by the communication unit 24 is, for example, still image data. The transmitted data is stored, for example, in the server 6 connected to the Internet network NW.
 事務所PC7は、例えば、企業等の事務所に設置された汎用のコンピュータ装置で構成される。事務所PC7は、管理対象である車両100の運行状況を管理する機能、および車両100の混雑度を判定する機能を有している。事務所PC7は、CPU71、メモリ72、通信部73、および外部入力インタフェース74を有する。事務所PC7は、通信部73およびインターネット網NWを介してサーバ6および無線基地局5のそれぞれと通信を行なう。事務所PC7は、例えば、メモリ72からCPU71に読み込んだプログラムに従って、運行状況の管理および混雑度の判定を実行する。 The office PC 7 is composed of, for example, a general-purpose computer installed in an office of a company or the like. The office PC 7 has a function of managing the operation status of the vehicles 100 to be managed and a function of determining the degree of congestion of the vehicles 100 . Office PC 7 has CPU 71 , memory 72 , communication section 73 , and external input interface 74 . Office PC 7 communicates with server 6 and wireless base station 5 via communication unit 73 and Internet network NW. The office PC 7 manages the operation status and determines the degree of congestion, for example, according to a program read from the memory 72 to the CPU 71 .
 図2には、第一カメラ31によって撮像された画像G1の一例が示されている。画像G1には、車両100の車内101が撮像されている。より詳しくは、画像G1には、運転席102、通路103、および降車口105が撮像されている。降車口105は、車両100の前部に配置されており、降車ドア104によって開閉される。 An example of the image G1 captured by the first camera 31 is shown in FIG. The inside 101 of the vehicle 100 is imaged in the image G1. More specifically, the image G1 includes a driver's seat 102, an aisle 103, and an exit 105. FIG. The exit 105 is arranged in the front part of the vehicle 100 and is opened and closed by the exit door 104 .
 図3には、第二カメラ32によって撮像された画像G2の一例が示されている。画像G2には、車両100の車内101が撮像されている。より詳しくは、画像G2には、通路103、乗車口107、後部座席108、および前部座席109が撮像されている。乗車口107は、車両100の中間部に配置されており、乗車ドア106によって開閉される。後部座席108は、乗車口107よりも車両後方に配置された座席である。後部座席108は、車両前方に向いている。前部座席109は、乗車口107よりも車両前方に配置された座席である。前部座席109は、車幅方向に向いている。 An example of the image G2 captured by the second camera 32 is shown in FIG. The inside 101 of the vehicle 100 is imaged in the image G2. More specifically, the image G2 includes the aisle 103, the boarding gate 107, the rear seat 108, and the front seat 109. As shown in FIG. The boarding gate 107 is arranged in the middle part of the vehicle 100 and is opened and closed by the boarding door 106 . The rear seat 108 is a seat arranged behind the entrance 107 of the vehicle. The rear seat 108 faces forward of the vehicle. The front seat 109 is a seat arranged in front of the vehicle from the boarding gate 107 . The front seat 109 faces in the vehicle width direction.
 車載器2は、旅客の乗降が完了した後に撮像された画像G1,G2を送信する。例えば、車載器2は、降車ドア104および乗車ドア106の両方が全閉状態であるときに撮像された画像G1,G2をインターネット網NWに送信する。旅客の動きが少ないときに撮像された画像G1,G2により、車内101の混雑度合いを精度よく推定することが可能となる。 The vehicle-mounted device 2 transmits the captured images G1 and G2 after the passenger has completed boarding and alighting. For example, the vehicle-mounted device 2 transmits images G1 and G2 captured when both the exit door 104 and the boarding door 106 are fully closed to the Internet network NW. It is possible to accurately estimate the degree of congestion in the vehicle interior 101 from the images G1 and G2 captured when the movement of passengers is small.
 事務所PC7は、車両100の混雑度を判定するためのデータをサーバ6から取得する。事務所PC7は、通信部73を介してデータを取得し、取得したデータをメモリ72に保存する。CPU71は、検出部71a、抽出部71b、および判定部71cを有する。 The office PC 7 acquires data for determining the degree of congestion of the vehicles 100 from the server 6. Office PC 7 acquires data via communication unit 73 and stores the acquired data in memory 72 . The CPU 71 has a detection section 71a, an extraction section 71b, and a determination section 71c.
 検出部71a、抽出部71b、および判定部71cは、CPU71において実行されるプログラムの一部であってもよく、CPU71が有する回路の一部であってもよく、CPU71に実装されたチップ等であってもよい。すなわち、CPU71は、検出部71aとしての機能、抽出部71bとしての機能、および判定部71cとしての機能を実行できるように構成されていればよい。 The detection unit 71a, the extraction unit 71b, and the determination unit 71c may be part of a program executed by the CPU 71, may be part of a circuit that the CPU 71 has, or may be a chip or the like mounted on the CPU 71. There may be. That is, the CPU 71 may be configured so as to be capable of executing the function as the detection section 71a, the function as the extraction section 71b, and the function as the determination section 71c.
 本実施形態に係る車両の混雑度判定システム1は、二つの指標に基づいて車両100の混雑度を判定する。一つ目の指標は、旅客の密集度である。CPU71は、画像G1,G2から人を検出し、通路103における旅客の密集度を算出する。二つ目の指標は、旅客の総検出数である。CPU71は、画像G2から検出された旅客の総数を算出する。つまり、密集度は画像G1,G2の両方から算出され、総検出数は、画像G2から算出される。本実施形態に係る車両の混雑度判定システム1は、旅客の密集度および旅客の総検出数に基づいて精度よく車両100の混雑度を判定することができる。 The vehicle congestion degree determination system 1 according to the present embodiment determines the congestion degree of the vehicle 100 based on two indices. The first indicator is passenger density. The CPU 71 detects people from the images G<b>1 and G<b>2 and calculates the density of passengers in the aisle 103 . The second indicator is the total number of detected passengers. The CPU 71 calculates the total number of passengers detected from the image G2. That is, the density is calculated from both images G1 and G2, and the total number of detections is calculated from image G2. The vehicle congestion degree determination system 1 according to the present embodiment can accurately determine the congestion degree of the vehicle 100 based on the passenger density and the total number of detected passengers.
 検出部71aは、画像G1,G2から人を検出し、検出されたそれぞれの人に対して検出領域DAを生成する。例えば、検出部71aは、図2に示すように、検出された人のそれぞれに対して検出領域DAを生成する。検出領域DAは、所謂バウンディングボックスである。図2では、三人の人P0,P1,P2が検出されている。検出部71aは、人P0を囲む検出領域DA0、人P1を囲む検出領域DA1、および人P2を囲む検出領域DA2を生成する。検出部71aは、立っている人、および座っている人を含む全ての人を検出し、検出された全ての人に対して検出領域DAを生成する。 The detection unit 71a detects people from the images G1 and G2, and generates a detection area DA for each detected person. For example, the detection unit 71a generates a detection area DA for each detected person, as shown in FIG. The detection area DA is a so-called bounding box. In FIG. 2, three persons P0, P1, P2 are detected. The detection unit 71a generates a detection area DA0 surrounding the person P0, a detection area DA1 surrounding the person P1, and a detection area DA2 surrounding the person P2. The detection unit 71a detects all persons including standing persons and sitting persons, and generates a detection area DA for all detected persons.
 検出部71aは、図3に示すように、画像G2から検出された人P3,P4,P5に対して検出領域DA3,DA4,DA5を生成する。なお、検出領域DAは、画像において人が写っていると認識された領域であり、人の全体を囲む領域とはならない場合がある。 The detection unit 71a generates detection areas DA3, DA4, and DA5 for persons P3, P4, and P5 detected from the image G2, as shown in FIG. Note that the detection area DA is an area in which a person is recognized in the image, and may not be an area surrounding the entire person.
 抽出部71bは、生成された全ての検出領域DAから、密集度の算出対象とする検出領域DA、および総検出数の算出対象とする検出領域DAを抽出する。車両100の運転手は、密集度の算出対象から除外され、かつ総検出数の算出対象からも除外される。図2において検出された人P0は、車両100の運転手である。抽出部71bは、以下に説明するように運転手を認識し、運転手を算出対象から除外する。 The extraction unit 71b extracts the detection area DA for which the density is to be calculated and the detection area DA for which the total number of detections is to be calculated from all the generated detection areas DA. Drivers of the vehicle 100 are excluded from the calculation target of the density and also excluded from the calculation target of the total detection number. The person P0 detected in FIG. 2 is the driver of the vehicle 100. FIG. The extraction unit 71b recognizes the driver as described below and excludes the driver from the calculation target.
 図2に示すように、画像G1は、A11からA19までの九個のエリアに等分されている。各エリアの形状は、長方形である。例示された画像G1では、各エリアにおける画像横方向の画素数が240、画像縦方向の画素数が160である。すなわち、各エリアは、38,400画素で構成されている。 As shown in FIG. 2, the image G1 is equally divided into nine areas from A11 to A19. The shape of each area is a rectangle. In the illustrated image G1, each area has 240 pixels in the horizontal direction and 160 pixels in the vertical direction. That is, each area consists of 38,400 pixels.
 画像G1において、エリアA11,A14,A17は、運転手が撮像されるエリアとして設定されている。抽出部71bは、検出領域DAの中心座標がエリアA11,A14,A17の何れかに位置する場合、その検出領域DAが運転手に対応すると判断する。図2において、検出領域DA0の中心は、エリアA14に位置している。つまり、検出領域DA0に対応する人A0は、運転手であると判断できる。従って、抽出部71bは、密集度および総検出数の算出対象から検出領域DA0を除外する。一方、検出領域DA1,DA2の中心は、エリアA11,A14,A17の何れにも位置していない。抽出部71bは、検出領域DA1,DA2を密集度の算出対象として採用する。検出領域DA1,DA2は、画像G1の領域であるため、総検出数の算出対象ではない。 In the image G1, areas A11, A14, and A17 are set as areas where the driver is imaged. When the center coordinates of the detection area DA are located in any of the areas A11, A14, and A17, the extraction unit 71b determines that the detection area DA corresponds to the driver. In FIG. 2, the center of detection area DA0 is located in area A14. That is, it can be determined that the person A0 corresponding to the detection area DA0 is the driver. Therefore, the extracting unit 71b excludes the detection area DA0 from the calculation target of the density and the total number of detections. On the other hand, the centers of the detection areas DA1 and DA2 are not located in any of the areas A11, A14 and A17. The extraction unit 71b employs the detection areas DA1 and DA2 as targets for density calculation. Since the detection areas DA1 and DA2 are areas of the image G1, they are not subject to calculation of the total number of detections.
 図3に示すように、画像G2は、A21からA29までの九個のエリアに等分されている。各エリアの形状は、長方形である。例示された画像G2では、各エリアにおける画像横方向の画素数が240、画像縦方向の画素数が160である。すなわち、各エリアは、38,400画素で構成されている。 As shown in FIG. 3, the image G2 is equally divided into nine areas from A21 to A29. The shape of each area is a rectangle. In the illustrated image G2, each area has 240 pixels in the horizontal direction and 160 pixels in the vertical direction. That is, each area consists of 38,400 pixels.
 画像G2は、運転手が撮像されるエリアを含まない。従って、抽出部71bは、画像G2に関しては検出領域DAが運転手に対応するか否かの判定を実行しない。画像G2において、エリアA23,A26,A29は、着席した旅客が撮像されるエリアである。以下の説明では、着席している旅客が撮像されるエリアを「第一領域」と称する。抽出部71bは、検出領域DAの中心座標が第一領域に位置する場合、その検出領域DAが着席している旅客に対応する領域であると判断する。図3において、検出領域DA5の中心は、エリアA29に位置している。つまり、検出された人P5は、着席している旅客であると判断できる。 The image G2 does not include the area where the driver is imaged. Therefore, the extraction unit 71b does not determine whether or not the detection area DA corresponds to the driver with respect to the image G2. In the image G2, areas A23, A26, and A29 are areas where seated passengers are imaged. In the following description, an area in which a seated passenger is imaged is referred to as a "first area". When the center coordinates of the detection area DA are located in the first area, the extraction unit 71b determines that the detection area DA is an area corresponding to the passenger seated. In FIG. 3, the center of detection area DA5 is located in area A29. That is, it can be determined that the detected person P5 is a passenger seated.
 着席している旅客は、通路103の密集度との関連性が低い。抽出部71bは、密集度の算出対象から検出領域DA5を除外する。一方、抽出部71bは、検出領域DA5を総検出数の算出対象として採用する。 Seated passengers have a low correlation with the density of the aisle 103. The extraction unit 71b excludes the detection area DA5 from the density calculation target. On the other hand, the extraction unit 71b adopts the detection area DA5 as a calculation target for the total number of detections.
 検出領域DA3,DA4の中心は、エリアA23,A26,A29の何れにも位置していない。従って、抽出部71bは、検出領域DA3,DA4を密集度および総検出数の算出対象として採用する。 The centers of the detection areas DA3 and DA4 are not located in any of the areas A23, A26 and A29. Therefore, the extraction unit 71b adopts the detection areas DA3 and DA4 as targets for calculating the density and the total number of detections.
 判定部71cは、密集度および総検出数を算出し、算出結果に基づいて車両100の混雑度を判定する。本実施形態の密集度は、以下のように算出される。図2に示す画像G1において、密集度が算出されるエリアは、エリアA15である。エリアA15は、通路103における降車口105の側の端部に立っている旅客が撮像されるエリアである。 The determination unit 71c calculates the density and the total number of detections, and determines the degree of congestion of the vehicles 100 based on the calculation results. The density of this embodiment is calculated as follows. In the image G1 shown in FIG. 2, the area for which the density is calculated is area A15. Area A15 is an area where a passenger standing at the end of aisle 103 on the exit 105 side is imaged.
 判定部71cは、エリアA15を対象領域Tgに設定する。判定部71cは、各検出領域DAについて、対象領域Tgと重畳している画素数が閾値Th1以上であるか、を判定する。例えば、判定部71cは、図4に示すように、対象領域Tgと検出領域DA1とが重なっている領域Ov1を検出する。判定部71cは、領域Ov1の画素数が閾値Th1以上である場合に対象領域Tgについてのカウント数C0をインクリメントする。カウント数C0の初期値は0である。閾値Th1は、例えば、1,024[画素]である。つまり、領域Ov1の画素数が1,024以上である場合、検出領域DA1が密集の構成要素の一つとしてカウントされる。 The determination unit 71c sets the area A15 as the target region Tg. The determination unit 71c determines whether the number of pixels overlapping the target area Tg is equal to or greater than a threshold Th1 for each detection area DA. For example, the determination unit 71c detects an area Ov1 where the target area Tg and the detection area DA1 overlap, as shown in FIG. The determination unit 71c increments the count number C0 for the target region Tg when the number of pixels in the region Ov1 is equal to or greater than the threshold Th1. The initial value of the count number C0 is zero. The threshold Th1 is, for example, 1,024 [pixels]. In other words, when the number of pixels in the area Ov1 is 1,024 or more, the detection area DA1 is counted as one of the dense constituent elements.
 判定部71cは、他の検出領域DAについても同様に対象領域Tgと重なっている領域を検出し、かつ重なっている領域の画素数が閾値Th1以上であれば対象領域Tgについてのカウント数C0をインクリメントする。判定部71cは、密集度の算出対象である全ての検出領域DAについて、対象領域Tgとの重なり度合いを判定する。判定部71cは、算出されたカウント数C0をエリアA15についてのカウント数C15として記憶する。その後、判定部71cは、カウント数C0を初期化する。 The determination unit 71c similarly detects areas overlapping the target area Tg for other detection areas DA, and if the number of pixels in the overlapping area is equal to or greater than the threshold value Th1, the count number C0 for the target area Tg is calculated. Increment. The determination unit 71c determines the degree of overlap with the target area Tg for all detection areas DA whose density is to be calculated. The determination unit 71c stores the calculated count number C0 as the count number C15 for the area A15. After that, the determination unit 71c initializes the count number C0.
Furthermore, the determination unit 71c determines the degree of overlap for the image G2. In the image G2 shown in FIG. 3, the target regions Tg are areas A25 and A28. Areas A25 and A28 are areas in which passengers standing near the boarding entrance 107 in the aisle 103 are imaged.

The determination unit 71c first sets area A25 as the target region Tg. For each detection area DA of the image G2, the determination unit 71c determines whether the number of pixels overlapping the target region Tg is equal to or greater than a threshold Th2. The threshold Th2 is, for example, 3,072 [pixels]. When the number of pixels in the region where the target region Tg and a detection area DA overlap is equal to or greater than the threshold Th2, the determination unit 71c increments the count C0 for the target region Tg. The determination unit 71c evaluates the degree of overlap with the target region Tg for all detection areas DA subject to the density calculation, stores the resulting count C0 as the count C25 for area A25, and then resets the count C0.

The determination unit 71c then sets area A28 as the target region Tg and calculates the count C0 in the same way, storing the result as the count C28 for area A28. The determination unit 71c also calculates the total number of detected passengers for the image G2. The total detection number CA is the number of detection areas DA generated for the image G2. In the image G2 illustrated in FIG. 3, the detection areas DA3, DA4, and DA5 are subject to the total-count calculation, so the total detection number CA is 3.
The CPU 71 determines the congestion degree according to the flowchart shown in FIG. 5, which it executes after the step of acquiring the images G1 and G2.

In step S10, the detection unit 71a performs object detection on the images G1 and G2 and computes the detection areas DA. Step S10 is the step of detecting people from the images G1 and G2 and generating, for each detected person, a rectangular detection area DA surrounding that person. After step S10, the process proceeds to step S20.

In step S20, the extraction unit 71b extracts the detection areas DA subject to the density calculation and the detection areas DA subject to the total-count calculation. After step S20, the process proceeds to step S30.

In step S30, the determination unit 71c calculates the overlap between the detection areas DA and the target regions Tg, computing the counts C15, C25, and C28 for areas A15, A25, and A28, respectively. After step S30, the process proceeds to step S40.
In step S40, the determination unit 71c determines whether the congestion index satisfies the level II condition. The vehicle congestion degree determination system 1 according to the present embodiment classifies the congestion degree into three levels: level I, level II, and level III. Level I is the least congested level, and level III is the most congested. Level I corresponds, for example, to two or fewer standing passengers; strollers, passengers about to sit down, and bus company employees are not counted as standing passengers. Level II corresponds, for example, to three or more standing passengers with the central portion of the aisle 103 unoccupied. Level III corresponds, for example, to passengers standing in the central portion of the aisle 103. The central portion of the aisle 103 is, for example, the intermediate portion between the boarding door 106 and the exit door 104.
The level II condition is referred to as the first condition. The first condition is that expression (1) below holds. That is, when the sum of the count C15 for area A15 and the count C28 for area A28 is 2 or more, the congestion degree is determined to be level II or higher. The level I condition is that the sum of the counts C15 and C28 is less than 2. When the first condition is satisfied, the determination unit 71c makes an affirmative determination in step S40 and proceeds to step S50; when it makes a negative determination in step S40, it proceeds to step S80.

C15 + C28 ≥ 2 (1)
In step S50, the determination unit 71c determines whether the congestion index satisfies the level III condition. The level III condition is that the second condition is satisfied in addition to the first condition. The second condition is that expressions (2), (3), (4), and (5) below all hold. That is, the congestion degree is determined to be level III when the first condition is satisfied and, in addition, the count C15 for area A15 is 1 or more, the counts C25 for area A25 and C28 for area A28 are both 2 or more, and the total detection number CA is 11 or more.

C15 ≥ 1 (2)
C25 ≥ 2 (3)
C28 ≥ 2 (4)
CA ≥ 11 (5)
When the second condition is satisfied, the determination unit 71c makes an affirmative determination in step S50 and proceeds to step S60; when it makes a negative determination in step S50, it proceeds to step S70.

In step S60, the determination unit 71c assigns the value of level III to the congestion degree level Lv, and the flowchart ends.

In step S70, the determination unit 71c assigns the value of level II to the congestion degree level Lv, and the flowchart ends.

In step S80, the determination unit 71c assigns the value of level I to the congestion degree level Lv, and the flowchart ends.
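Taken together, steps S40 to S80 reduce to a short decision over the counts C15, C25, C28 and the total detection number CA. The following is a minimal sketch under the example conditions (1) to (5) above; the function name and the string return values are illustrative, not part of this embodiment.

    def congestion_level(c15: int, c25: int, c28: int, ca: int) -> str:
        """Three-level congestion decision following steps S40 to S80 of FIG. 5."""
        # Step S40, first condition, expression (1): level II or higher?
        if c15 + c28 < 2:
            return "I"    # step S80: level I
        # Step S50, second condition, expressions (2) to (5): level III?
        if c15 >= 1 and c25 >= 2 and c28 >= 2 and ca >= 11:
            return "III"  # step S60: level III
        return "II"       # step S70: level II

For example, congestion_level(1, 2, 2, 11) yields level III, while congestion_level(1, 0, 1, 5) satisfies only the first condition and yields level II.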
The CPU 71 may transmit information about the congestion degree level Lv to external destinations. The level Lv may, for example, be wirelessly transmitted to a bus stop, in which case the display at the stop shows the congestion degree level Lv of the vehicle 100 about to arrive. The level Lv may also be transmitted to a user's mobile terminal, where application software may display the congestion degree level Lv of the vehicle 100. The level Lv may further be transmitted to other fixed-route buses running on the same route as the vehicle 100, and it can also be used to manage the operation schedule of the route buses, including the vehicle 100.
As described above, the vehicle congestion degree determination method according to the present embodiment includes an acquisition step, a generation step, and a determination step. In the acquisition step, images G1 and G2 of the interior 101 of the vehicle 100 transporting passengers are acquired. In the generation step, people are detected from the images G1 and G2, and a rectangular detection area DA surrounding each detected person is generated. In the flowchart of FIG. 5, step S10 corresponds to the generation step.

In the determination step, the congestion degree of the vehicle 100 is determined based on the target regions Tg of the images G1 and G2 and the detection areas DA. In the flowchart of FIG. 5, steps S30 through S80 correspond to the determination step. A target region Tg is a partial region of the image G1 or G2 in which a passenger standing in the aisle 103 of the vehicle 100 is imaged. In the determination step, the congestion degree of the vehicle 100 is determined based on the number of detection areas DA overlapping the target region Tg.

The number of detection areas DA overlapping the target region Tg represents the density of passengers standing in the aisle 103. The vehicle congestion degree determination method according to the present embodiment can therefore appropriately determine the congestion degree of a vehicle transporting passengers. Moreover, detecting people across the entire wide images G1 and G2 achieves better detection accuracy than detecting people only within the target region Tg.
The vehicle congestion degree determination method described above is also useful from the standpoint of countermeasures against infectious diseases such as the recent novel coronavirus (COVID-19). For example, it allows users to grasp the congestion degree of the vehicle 100 in a timely and accurate manner and encourages them to avoid crowded vehicles, and it allows operators to use the information to improve operation plans so that crowding does not occur. Furthermore, in the vehicle congestion degree determination method according to the present embodiment, the congestion degree is determined from still images, which suppresses the load on each component involved in the determination. For example, the amount of data transmitted when sending images captured by the camera 3 to the Internet network NW is small, reducing the communication load, and when the congestion determination is executed on the on-board device 2, the computational load on the on-board device 2 is likewise reduced.

The target region Tg of the present embodiment is a region in which a passenger standing near an entrance/exit of the vehicle 100 is imaged. When the vehicle 100 is congested, many passengers are expected to stand near the entrances/exits. The vehicle congestion degree determination method according to the present embodiment can therefore appropriately determine the congestion degree of the vehicle 100.

The image G2 of the present embodiment has a first region in which seated passengers are imaged. A detection area DA whose center coordinates lie in the first region is excluded from the calculation of the count C0 of detection areas overlapping the target region Tg. The vehicle congestion degree determination method according to the present embodiment can therefore calculate the density of the aisle 103 accurately by excluding seated passengers.
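This exclusion of seated passengers can be sketched as a filter applied before the overlap count, reusing the Rect type from the earlier sketch; the seat-region list and the function name are illustrative assumptions, not names from this embodiment.

    def exclude_seated(detections: list[Rect], seat_regions: list[Rect]) -> list[Rect]:
        """Drop detection areas DA whose center coordinates lie in a seat
        ("first") region, so seated passengers do not inflate the aisle count."""
        def center_in(da: Rect, region: Rect) -> bool:
            cx = (da.x0 + da.x1) / 2
            cy = (da.y0 + da.y1) / 2
            return region.x0 <= cx < region.x1 and region.y0 <= cy < region.y1

        return [da for da in detections
                if not any(center_in(da, region) for region in seat_regions)]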
In the vehicle congestion degree determination method according to the present embodiment, the determination step determines the congestion degree of the vehicle 100 based on the count of detection areas DA overlapping the target region Tg and the total number CA of people detected from the image G2. The congestion degree of the vehicle 100 can thus be appropriately determined from both the density of the aisle 103 and the total detection number CA.

In the vehicle congestion degree determination method according to the present embodiment, a first image and a second image are acquired in the acquisition step. The first image corresponds to the image G1, which captures passengers standing near the exit 105 in the aisle 103. The second image corresponds to the image G2, which captures passengers standing near the boarding entrance 107 in the aisle 103. In the determination step, the congestion degree of the vehicle 100 is determined based on the first image and the second image, allowing a more accurate determination from the two images.

The vehicle congestion degree determination system 1 according to the present embodiment includes the camera 3, the detection unit 71a, and the determination unit 71c. The determination unit 71c determines the congestion degree of the vehicle 100 based on the number of detection areas DA overlapping the target region Tg. The congestion degree determination system 1 can therefore appropriately determine the congestion degree of a vehicle transporting passengers.
The vehicle congestion degree determination method and congestion degree determination system 1 according to the present embodiment can determine the congestion degree of the vehicle 100 accurately. For example, detecting people across the entire images G1 and G2 rather than only within the target regions Tg suppresses missed and false detections. As a comparative example, suppose the angle of view of the camera 3 were set to capture only the target region Tg. In that case, when passengers are densely packed, several people overlap in the image, and a person hidden behind another is difficult to detect. In the present embodiment, by contrast, the angle of view of the camera 3 is set to capture a range wider than the target region Tg, so missed and false detections of people are less likely to occur.

The camera 3 of the present embodiment is arranged to capture an overhead view of the vehicle interior 101. Even when passengers are crowded in the aisle 103, each passenger is therefore easy to detect.

The angle of view of the camera 3 is also set so that not only passengers standing in the aisle 103 but also passengers sitting in the seats 108 and 109 can be imaged, enabling a highly accurate congestion determination based on both the density and the total detection number.

According to the vehicle congestion degree determination system 1 and the vehicle congestion degree determination method of the present embodiment, congestion in the vehicle 100 can be alleviated. For example, providing users with real-time information on the congestion degree of the vehicle 100 makes passengers less likely to concentrate on a specific vehicle 100. As a result, crowding of passengers in the vehicle 100 can be prevented, and virus transmission inside the vehicle can be suppressed.
The determination unit 71c may also set a plurality of areas together as a single target region Tg. For example, in the image G2, the determination unit 71c may treat areas A25 and A28 together as one target region Tg. In that case, when the number of pixels in the region where the merged target region Tg and a detection area DA overlap is equal to or greater than a threshold Th3, the count for areas A25 and A28 is incremented.
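A minimal sketch of this merged-region variant follows, assuming the constituent areas are disjoint grid cells (as in the illustrated images) so that a detection area's overlap with the merged region equals the sum of its overlaps with the individual areas; the function name is illustrative, Th3's value is not specified here, and overlap_pixels is the helper from the earlier sketch.

    def count_overlapping_merged(detections: list[Rect],
                                 merged_areas: list[Rect],
                                 threshold: int) -> int:
        """Count detection areas DA whose total overlap with a merged target
        region Tg (e.g. areas A25 and A28 together) meets threshold Th3."""
        return sum(
            1 for da in detections
            if sum(overlap_pixels(da, area) for area in merged_areas) >= threshold
        )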
The congestion degree determination system 1 of the present embodiment is easy to adapt to the vehicle in which it is installed. For example, the position and angle of view of the camera 3 may differ depending on the model of the vehicle 100 or on the operating company. The congestion degree determination system 1 of the present embodiment can easily tune the accuracy of the density determination by selecting the optimal target regions Tg from the multiple areas of the images G1 and G2. The accuracy can likewise be tuned through the thresholds Th1 and Th2, and by setting areas to be excluded as the driver or as seated passengers. The congestion degree determination system 1 of the present embodiment is thus a highly versatile system.

The number of areas in the images G1 and G2 is not limited to the illustrated number, and the shapes of the areas are not limited to the illustrated shapes. The images G1 and G2 need not be divided into equal parts.

The CPU 71 may determine the congestion degree based on only one of the images G1 and G2. For example, the CPU 71 may determine level I, level II, or level III based on one of the two images.

At least one of the operations of the detection unit 71a, the extraction unit 71b, and the determination unit 71c may be executed on the on-board device 2 or on the server 6 in the cloud. For example, the CPU 21 of the on-board device 2 may include the detection unit 71a, the extraction unit 71b, and the determination unit 71c. In this case, the CPU 21 may transmit information about the congestion degree level Lv to the Internet network NW.

The contents disclosed in the above embodiment can be combined and implemented as appropriate.
1: Vehicle congestion degree determination system
2: On-board device
3: Camera
4: Door sensor
5: Wireless base station
6: Server
7: Office PC
21: CPU, 22: Memory, 23: GPS receiver, 24: Communication unit
31: First camera, 32: Second camera
71: CPU, 72: Memory, 73: Communication unit, 74: External input interface
100: Vehicle, 101: Vehicle interior, 102: Driver's seat, 103: Aisle
104: Exit door, 105: Exit, 106: Boarding door, 107: Boarding entrance
108: Rear seat, 109: Front seat
A11, A12, A13, A14, A15, A16, A17, A18, A19: Areas
A21, A22, A23, A24, A25, A26, A27, A28, A29: Areas
C0: Count for the target region, C15, C25, C28: Counts for the areas
CA: Total number of detections
DA: Detection area
G1, G2: Images
P: Person

Claims (6)

1. A vehicle congestion degree determination method comprising:
obtaining an image of the interior of a vehicle that transports passengers;
detecting people from the image and generating, for each detected person, a rectangular detection area surrounding that person; and
determining a congestion degree of the vehicle based on a target region of the image and the detection areas,
wherein the target region is a partial region of the image in which a passenger standing in an aisle of the vehicle is imaged, and
in the determining step, the congestion degree of the vehicle is determined based on the number of the detection areas overlapping the target region.

2. The vehicle congestion degree determination method according to claim 1, wherein the target region is a region in which a passenger standing near an entrance/exit of the vehicle is imaged.

3. The vehicle congestion degree determination method according to claim 1 or 2, wherein the image has a first region in which a seated passenger is imaged, and a detection area having its center coordinates in the first region is excluded from the calculation of the number of the detection areas overlapping the target region.

4. The vehicle congestion degree determination method according to any one of claims 1 to 3, wherein, in the determining step, the congestion degree of the vehicle is determined based on the number of the detection areas overlapping the target region and the total number of the people detected from the image.

5. The vehicle congestion degree determination method according to any one of claims 1 to 4, wherein, in the obtaining step, a first image of a passenger standing near an exit in the aisle and a second image of a passenger standing near a boarding entrance in the aisle are obtained, and, in the determining step, the congestion degree of the vehicle is determined based on the first image and the second image.

6. A vehicle congestion degree determination system comprising:
a camera that images the interior of a vehicle that transports passengers and outputs an image;
a detection unit that detects people from the image and generates, for each detected person, a rectangular detection area surrounding that person; and
a determination unit that determines a congestion degree of the vehicle based on a target region of the image and the detection areas,
wherein the target region is a partial region of the image in which a passenger standing in an aisle of the vehicle is imaged, and
the determination unit determines the congestion degree of the vehicle based on the number of the detection areas overlapping the target region.
WO2022215394A1 (en): PCT/JP2022/008921, priority date 2021-04-06, filing date 2022-03-02. Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle.

Priority Applications (1)
MX2023010408A: Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle

Applications Claiming Priority (2)
JP2021064498A (JP7305698B2): Vehicle congestion determination method and vehicle congestion determination system
JP2021-064498: 2021-04-06

Publications (1)
WO2022215394A1 (en)

Family ID: 83545842

Family Applications (1)
PCT/JP2022/008921 (WO2022215394A1): Method for determining degree of crowding in vehicle, and system for determining degree of crowding in vehicle

Country Status (4)
JP (1) JP7305698B2 (en)
MX (1) MX2023010408A (en)
TW (1) TWI781069B (en)
WO (1) WO2022215394A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015220507A (en) * 2014-05-14 2015-12-07 富士通株式会社 Monitoring device, monitoring method and monitoring program
CN111079696A (en) * 2019-12-30 2020-04-28 深圳市昊岳电子有限公司 Detection method based on vehicle monitoring personnel crowding degree
JP2021003972A (en) * 2019-06-26 2021-01-14 株式会社東芝 Information processor, station management system, station management equipment and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010003110A (en) * 2008-06-20 2010-01-07 Panasonic Corp On-vehicle moving image data recording device
DE102009039162A1 (en) * 2009-08-27 2011-03-17 Knorr-Bremse Gmbh Monitoring device and method for monitoring an entry or exit area from an access opening of a vehicle to a building part
CN111079474A (en) * 2018-10-19 2020-04-28 上海商汤智能科技有限公司 Passenger state analysis method and device, vehicle, electronic device, and storage medium
US20200273345A1 (en) * 2019-02-26 2020-08-27 Aptiv Technologies Limited Transportation system and method

Also Published As

Publication number Publication date
JP2022160020A (en) 2022-10-19
TW202241114A (en) 2022-10-16
MX2023010408A (en) 2023-09-18
JP7305698B2 (en) 2023-07-10
TWI781069B (en) 2022-10-11


Legal Events

121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number 22784379; Country of ref document: EP; Kind code of ref document: A1
WWE (Wipo information: entry into national phase): Ref document number MX/A/2023/010408; Country of ref document: MX
WWE (Wipo information: entry into national phase): Ref document number 2301006449; Country of ref document: TH
NENP (Non-entry into the national phase): Ref country code: DE
122 (Ep: pct application non-entry in european phase): Ref document number 22784379; Country of ref document: EP; Kind code of ref document: A1