WO2022172400A1 - Vehicular monitoring device and vehicular monitoring system - Google Patents

Vehicular monitoring device and vehicular monitoring system

Info

Publication number
WO2022172400A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
occupant
detection
rear seat
detection result
Prior art date
Application number
PCT/JP2021/005263
Other languages
French (fr)
Japanese (ja)
Inventor
Taro Kumagai (熊谷 太郎)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to DE112021006342.1T priority Critical patent/DE112021006342T5/en
Priority to PCT/JP2021/005263 priority patent/WO2022172400A1/en
Priority to JP2022581114A priority patent/JP7446492B2/en
Publication of WO2022172400A1 publication Critical patent/WO2022172400A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30 Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/31 Detection related to theft or to other events relevant to anti-theft systems of human presence inside or outside the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/593 Recognising seat occupancy
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing

Definitions

  • the present disclosure relates to a vehicle monitoring device and a vehicle monitoring system that detect an occupant inside a vehicle or a person outside the vehicle.
  • in conventional systems, control processing such as airbag deployment and seat belt reminders is performed according to detection results such as the position of the occupants, and notification of the presence of a suspicious person is performed according to detection results for people outside the vehicle.
  • in such systems, an imaging device is provided so that its imaging range includes the front and rear seats of the vehicle, and it is detected in which seat an occupant is located (see, for example, Patent Document 1).
  • the present disclosure has been made to solve the above-described problems, and its purpose is to improve detection accuracy by using both a detection result obtained from an imaging device and a detection result obtained from a sensor different from the imaging device.
  • a first vehicle monitoring device includes: a first information acquisition unit that acquires sensor information from a sensor whose detection range includes at least the rear seats of a vehicle; a second information acquisition unit that acquires a captured image from an imaging device whose imaging range includes the front and rear seats of the vehicle; and an occupant detection unit that detects an occupant in a rear seat from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. When the detection result regarding the presence or absence of a rear-seat occupant obtained from the sensor information differs from the detection result regarding the presence or absence of a rear-seat occupant obtained from the captured image, the occupant detection unit specifies the presence or absence of the rear-seat occupant using the detection result obtained from the captured image.
  • a first vehicle monitoring system includes: a sensor mounted in a vehicle whose detection range includes at least the rear seats of the vehicle; an imaging device mounted in the vehicle whose imaging range includes the front and rear seats of the vehicle; a first information acquisition unit that acquires sensor information from the sensor; a second information acquisition unit that acquires a captured image from the imaging device; and an occupant detection unit that detects an occupant in the rear seat from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. If the detection result regarding the presence or absence of a rear-seat occupant obtained from the sensor information differs from the detection result obtained from the captured image, the presence or absence of the rear-seat occupant is specified using the detection result obtained from the captured image.
  • a second vehicle monitoring device includes: a first information acquisition unit that acquires sensor information from a sensor whose detection range includes the outside of the vehicle; a second information acquisition unit that acquires a captured image from an imaging device whose imaging range includes the outside of the vehicle; and a vehicle exterior detection unit that detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. If the detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from the detection result obtained from the captured image, the vehicle exterior detection unit specifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image.
  • a second vehicle monitoring system includes: a sensor mounted on a vehicle whose detection range includes the exterior of the vehicle; an imaging device mounted on the vehicle whose imaging range includes the exterior of the vehicle; a first information acquisition unit that acquires sensor information from the sensor; a second information acquisition unit that acquires a captured image from the imaging device; and a vehicle exterior detection unit that detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. If the detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from the detection result obtained from the captured image, the presence or absence of the person outside the vehicle is specified using the detection result obtained from the captured image.
  • according to the present disclosure, a target person, that is, an occupant inside the vehicle or a person outside the vehicle, is detected using both an imaging device and a sensor different from the imaging device, so the detection accuracy for the target person can be improved.
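Purely as an illustration (not part of the claimed disclosure), the decision rule shared by the aspects above can be sketched as follows; the function name and boolean inputs are hypothetical stand-ins for the sensor-based and image-based detection results:

```python
def determine_presence(sensor_detects: bool, image_detects: bool) -> bool:
    """Presence/absence of a target person from two detection results.
    When the sensor-based and image-based results differ, the result
    obtained from the captured image is used, as in the aspects above."""
    if sensor_detects != image_detects:
        return image_detects  # image-based result takes precedence
    return sensor_detects     # both results agree
```

When the two results agree, either branch returns the same value, so this rule amounts to trusting the image-based result whenever a captured image is available.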
  • FIG. 1 is a block diagram showing a configuration example of a vehicle monitoring system according to Embodiment 1.
  • FIG. 2 is an explanatory diagram showing a detection range of a sensor and an imaging range of an imaging device according to Embodiment 1.
  • FIG. 3 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 1.
  • FIG. 4 is a flowchart showing an operation example of the vehicle monitoring device according to Embodiment 1.
  • FIG. 5 is a diagram showing a hardware configuration example of the vehicle monitoring device according to Embodiment 1.
  • FIG. 6 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 2.
  • FIG. 7 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 2.
  • FIG. 8 is a flowchart showing an operation example of the vehicle monitoring device according to Embodiment 2.
  • FIG. 9 is a block diagram showing a configuration example of a vehicle monitoring system according to Embodiment 3.
  • FIG. 10 is an explanatory diagram showing an example of detection of a person outside the vehicle by the vehicle monitoring device according to Embodiment 3.
  • FIG. 11 is a flowchart showing an operation example of the vehicle monitoring device according to Embodiment 3.
  • FIG. 12 is a diagram showing a hardware configuration example of a vehicle monitoring device according to Embodiment 3.
  • FIG. 1 is a block diagram showing a configuration example of a vehicle monitoring system 100 according to Embodiment 1.
  • the vehicle monitoring system 100 includes a vehicle monitoring device 10, a sensor 20, and an imaging device 30.
  • the vehicle monitoring device 10, the sensor 20, and the imaging device 30 are each mounted on a vehicle.
  • the vehicle monitoring device 10 is connected to a vehicle-side control device 200 that controls in-vehicle devices, such as air conditioners, audio devices, navigation devices, and notification units, as well as the engine and the like, in the vehicle on which the vehicle monitoring device 10 is mounted.
  • in the present embodiment, the target person to be detected by the vehicle monitoring system 100 is an occupant in the rear seat.
  • the vehicle monitoring system 100 may detect not only rear seat passengers, but also front seat passengers as target persons.
  • FIG. 2 is an explanatory diagram showing the detection range of the sensor 20 and the imaging range of the imaging device 30 according to the first embodiment.
  • FIG. 2A is a side view of the interior of a vehicle 50 equipped with the vehicle monitoring device 10, and FIG. 2B is a top view of the interior of the vehicle 50 equipped with the vehicle monitoring device 10.
  • the sensor 20 is mounted in the vehicle and provided so as to be able to detect at least rear-seat occupants among the occupants in the vehicle.
  • the sensor 20 is a sensor different from the imaging device 30 that can detect an occupant, such as a radio wave sensor 21, a weight sensor (seating sensor), a voice sensor, or an ultrasonic sensor.
  • the sensor 20 is any one of these or a combination of a plurality of them.
  • FIG. 2A shows an example in which the sensor 20 is a radio wave sensor 21, and the detection range of the radio wave sensor 21 is indicated by area A.
  • the radio wave sensor 21 is, for example, a sensor that transmits millimeter waves and receives reflected waves reflected by a moving object.
  • the sensor 20 is provided on the ceiling of the compartment of the vehicle 50 or the like so that at least the rear seats are included in the detection range.
  • FIG. 2B shows an example in which an occupant 58 is seated in the seat 53 on the left side of the rear seat.
  • the seat on the left side of the rear seat refers to the seat 53 on the left side with respect to the front of the vehicle among the rear seats
  • the seat on the right side of the rear seat refers to the seat 54 on the right side with respect to the front of the vehicle among the rear seats
  • the seat in the middle of the rear seat refers to the seat 55 provided between the left seat and the right seat of the rear seats.
  • hereinafter, when the term “rear seat” is used without further qualification, it refers to the seat 53 on the left side of the rear seat, the seat 54 on the right side of the rear seat, and the seat 55 in the center of the rear seat.
  • the number of rear seats does not have to be three as in the example of FIG. 2B, and the number of rear seats is arbitrary.
  • the imaging device 30 is composed of, for example, a wide-angle camera, an infrared camera, or the like, and images the inside of the vehicle 50. The imaging device 30 may also be a TOF (time-of-flight) camera capable of capturing an image that reflects the distance between the imaging device 30 and the subject. The imaging device 30 captures an image of the interior of the vehicle at, for example, 30 to 60 fps (frames per second), and outputs the captured image to the second information acquisition unit 2 of the information acquisition unit 11 of the vehicle monitoring device 10. An image captured by the imaging device 30 is hereinafter referred to as a captured image.
  • in FIG. 2, the area B indicates the imaging range of the imaging device 30.
  • one or more imaging devices 30 are installed in an overhead console, an instrument panel, a steering column, a room mirror, or the like, so that the imaging range includes the front seats, that is, the driver's seat 51 and the front passenger seat 52, and at least one of the rear seats.
  • hereinafter, persons in the vehicle to be detected by the vehicle monitoring device 10, such as the occupants 56, 57, and 58, are collectively referred to as “occupants”. That is, the occupants include the driver.
  • the vehicle monitoring device 10 includes an information acquisition unit 11 that acquires sensor information and captured images from the sensor 20 and the imaging device 30, respectively, and an occupant detection unit 12 that detects occupants in the rear seats from the sensor information and the captured images.
  • the information acquisition section 11 has a first information acquisition section 1 and a second information acquisition section 2.
  • the first information acquisition section 1 of the information acquisition section 11 is connected to the sensor 20 and acquires sensor information from the sensor 20.
  • the sensor information is information regarding the presence or absence of passengers in the vehicle.
  • when the sensor 20 is the radio wave sensor 21, the sensor information includes distance data indicating the distance between the radio wave sensor 21 and the detection target, angle data indicating the angle of the detection target with respect to the radio wave sensor 21, and the like.
  • when the sensor 20 is a seating sensor, the sensor information includes seat data indicating information on the seat provided with the seating sensor, weight data detected by the seating sensor, and the like.
  • when the sensor 20 is a voice sensor, the sensor information is voice data or the like indicating the volume, direction of arrival, and the like of the voice uttered by an occupant.
  • the sensor information is any one of these or a combination of a plurality of them.
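Purely as an illustration of the kinds of sensor information listed above, the alternatives could be modeled as a tagged union; all class and field names here are hypothetical and are not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class RadioWaveInfo:
    distance_m: float   # distance between the radio wave sensor 21 and the target
    angle_deg: float    # angle of the detection target with respect to the sensor

@dataclass
class SeatingInfo:
    seat: str           # which seat the seating sensor is installed in
    weight_kg: float    # weight detected by the seating sensor

@dataclass
class VoiceInfo:
    volume_db: float    # volume of the voice uttered by an occupant
    arrival_deg: float  # direction of arrival of the voice

# the sensor information may be any one of these, or a combination of several
SensorInfo = Union[RadioWaveInfo, SeatingInfo, VoiceInfo]
```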
  • the second information acquisition section 2 of the information acquisition section 11 is connected to the imaging device 30 and acquires a captured image from the imaging device 30.
  • although the first information acquisition unit 1 and the second information acquisition unit 2 are shown separately in the example of FIG. 1, they may be implemented as a single unit.
  • the information acquisition unit 11 then outputs the acquired sensor information and captured image to the occupant detection unit 12, which will be described later.
  • the information acquisition section 11 also has a vehicle information acquisition section 3 connected to the vehicle-side control device 200.
  • the vehicle information acquisition unit 3 acquires vehicle information, such as signals related to starting and stopping of the vehicle, from the vehicle-side control device 200. Using the acquired vehicle information, the vehicle information acquisition unit 3 outputs to the first information acquisition unit 1 and the second information acquisition unit 2 either a signal instructing them to start acquiring the sensor information and the captured image or a signal instructing them to end the acquisition.
  • for example, when the vehicle information acquisition unit 3 acquires from the vehicle-side control device 200 information indicating that the doors have been unlocked, a door has been opened, the ignition has been turned on, a human detection sensor has been turned on, the shift lever has been moved to the drive position, the vehicle speed has exceeded 0 km/h, the navigation device has started route guidance, or the vehicle has left home, it outputs to the first information acquisition unit 1 and the second information acquisition unit 2 a signal to start acquiring the sensor information and the captured image.
  • likewise, when the vehicle information acquisition unit 3 acquires from the vehicle-side control device 200 information indicating that the ignition has been turned off, the human detection sensor has been turned off, the shift lever has been moved to the parking position, the navigation device has finished route guidance, or the vehicle has returned home, it outputs to the first information acquisition unit 1 and the second information acquisition unit 2 a signal to end the acquisition of the sensor information and the captured image.
  • the occupant detection unit 12 of the vehicle monitoring device 10 will be explained.
  • the occupant detection unit 12 includes a first detection unit 4 that detects an occupant using the sensor information, a second detection unit 5 that detects an occupant using the captured image, and a presence/absence determination unit 6 that determines whether or not an occupant is present in the rear seat using the detection results of the first detection unit 4 and the second detection unit 5.
  • the first detection unit 4 and the second detection unit 5 of the occupant detection unit 12 are connected to the first information acquisition unit 1 and the second information acquisition unit 2 of the information acquisition unit 11, respectively.
  • the process of detecting passengers in the rear seats by the passenger detection unit 12 will be explained.
  • first, a case will be described in which the first detection unit 4 acquires, as the sensor information, the distance data and angle data obtained from the radio wave sensor 21 and detects an occupant in the rear seat.
  • the first detection unit 4 acquires the distance data and angle data as sensor information from the first information acquisition unit 1 of the information acquisition unit 11, and calculates the size of the detection target, the range in which a detection target such as a person or an animal exists, and the like. For example, if the calculated size and range indicate that the detection target exists on the left side of the rear seat, the first detection unit 4 derives a detection result indicating that an occupant has been detected on the left side of the rear seat, and outputs the detection result to a storage unit (not shown) of the vehicle monitoring device 10.
  • likewise, if the calculated size and range indicate that the detection target exists on the right side of the rear seat, the first detection unit 4 outputs to the storage unit a detection result indicating that an occupant has been detected on the right side of the rear seat, and if they indicate that the detection target exists in the center of the rear seat, it outputs a detection result indicating that an occupant has been detected in the center of the rear seat.
  • the occupant detection processing by the first detection unit 4 is not limited to the example described above, and various known algorithms can be used. In the above example, the detection result of the first detection unit 4 is stored in the storage unit, but the first detection unit 4 may instead output the detection result directly to the presence/absence determination unit 6.
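As one possible sketch of the mapping from distance and angle data to a rear seat described above: the sensor position, the assumed extent of the rear-seat row, and the seat boundaries below are invented values for illustration only, not taken from the disclosure:

```python
import math
from typing import Optional

# Assumed geometry: the radio wave sensor 21 is on the cabin ceiling and the
# rear-seat row lies roughly 1.0 to 1.6 m away; the lateral offset computed
# from distance and azimuth angle decides left / center / right.
REAR_ROW_RANGE_M = (1.0, 1.6)   # assumed longitudinal extent of the rear row
LEFT_BOUNDARY_M = -0.3          # assumed boundary between left and center seats
RIGHT_BOUNDARY_M = 0.3          # assumed boundary between center and right seats

def seat_from_radar(distance_m: float, azimuth_deg: float) -> Optional[str]:
    """Derive which rear seat a detection target occupies from the distance
    and angle data of the radio wave sensor; None if outside the rear row."""
    longitudinal = distance_m * math.cos(math.radians(azimuth_deg))
    lateral = distance_m * math.sin(math.radians(azimuth_deg))
    if not (REAR_ROW_RANGE_M[0] <= longitudinal <= REAR_ROW_RANGE_M[1]):
        return None  # detection target is not in the rear-seat row
    if lateral < LEFT_BOUNDARY_M:
        return "rear_left"
    if lateral > RIGHT_BOUNDARY_M:
        return "rear_right"
    return "rear_center"
```

A first detection result such as "an occupant has been detected on the left side of the rear seat" would then correspond to a "rear_left" return value.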
  • here, the presence of an occupant includes not only the state in which the occupant is actually sitting on the seat but also the state in which the occupant is present at the foot of the seat.
  • for example, a small occupant such as an infant may be at the foot of the seat.
  • because the sensor information obtained from the radio wave sensor 21 is used to detect the occupant, an occupant at the foot of the seat can also be detected.
  • the occupant detection process using the sensor information by the first detection unit 4 will be referred to as the first occupant detection process for the sake of explanation.
  • a detection result of the first occupant detection process by the first detection unit 4 is called a first detection result.
  • however, depending on the type of the sensor 20, the reliability of the detection results may become an issue.
  • for example, when the sensor 20 is the radio wave sensor 21, the resolution of the distance data and angle data obtained from the radio wave sensor 21 cannot always be guaranteed, and when multiple occupants sitting in different seats are detected close together, it may not be possible to accurately determine which seat each occupant is in.
  • when the sensor 20 is a seating sensor, depending on the posture of the occupant sitting on the seat, the weight of the occupant cannot be measured correctly, and it may not be possible to determine whether an object placed on the seat is a person.
  • in contrast, when the imaging device 30 is used to detect the occupant, a more reliable detection result can generally be obtained than when a sensor 20 different from the imaging device 30 is used. Therefore, regarding the detection of an occupant in the rear seat, when an occupant is detected from the captured image, occupant detection accuracy can be improved by using the detection result of the occupant detection process based on the captured image.
  • FIG. 3 is an explanatory diagram showing an example of occupant detection using captured images according to the first embodiment.
  • FIG. 3 shows a captured image acquired from the imaging device 30, in which an occupant 56 sitting in the driver's seat 51, an occupant 57 sitting in the passenger's seat 52, an occupant 58 seated on the seat 53 on the left side of the rear seat, and an occupant 60 seated on the seat 55 in the center of the rear seat are each detected by the occupant detection process of the second detection unit 5, which will be described later.
  • the second detection unit 5 analyzes the captured image and detects the face of the passenger in the captured image. Then, the second detection unit 5 acquires an area where the occupant's face is detected (the area indicated by the dashed line in FIG. 3; hereinafter referred to as a face area) and occupant feature information in the face area.
  • the characteristic information of the occupant is, for example, the contrast ratio of the eyes, nose, mouth, and cheeks after normalizing the size of the face.
  • the second detection unit 5 may determine whether or not the face detection is successful using the occupant's characteristic information.
  • for example, if the luminance distribution of the detected region is face-like, the second detection unit 5 determines that the face detection has succeeded, and if the luminance distribution is not face-like, it determines that the face detection has failed.
  • after detecting the occupant's face, the second detection unit 5 acquires from the captured image the coordinates of a face area surrounding the occupant's face, such as a rectangle that contacts the contour of the occupant's face.
  • the coordinates relating to the face area are, for example, the coordinates of each vertex, center, etc. of the rectangle when the face area is rectangular. Further, the second detection unit 5 calculates dimensions such as the width, height, and area of the face area from the coordinates of the face area.
  • the second detection unit 5 identifies the seat where the occupant is present from the acquired coordinates, width, height, area, and the like of the face area, and outputs information regarding the presence or absence of the occupant and the seat where the occupant is present to the storage unit of the vehicle monitoring device 10 as a detection result.
  • the occupant detection process by the second detection unit 5 is not limited to the above example, and various known algorithms can be used.
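One way the seat identification from face-area coordinates and dimensions might look is sketched below; the image size, the height threshold separating the front and rear rows, and the left-to-right seat ordering are assumptions for illustration (the actual mapping depends on camera placement and lens):

```python
from typing import Tuple

# Assumed 640 x 480 captured image; faces of rear-seat occupants appear
# smaller because they are farther from the imaging device 30.
IMAGE_WIDTH_PX = 640
REAR_FACE_MAX_HEIGHT_PX = 80  # assumed: a face this small is a rear-seat face

def face_center(x1: float, y1: float, x2: float, y2: float) -> Tuple[float, float]:
    """Centre of a rectangular face area given two opposite vertices."""
    return (x1 + x2) / 2, (y1 + y2) / 2

def seat_from_face(center_x: float, face_height: float) -> str:
    """Identify the seat from the face area's horizontal centre and height."""
    if face_height > REAR_FACE_MAX_HEIGHT_PX:
        # large face: front row; left half driver, right half passenger
        return "driver" if center_x < IMAGE_WIDTH_PX / 2 else "passenger"
    # small face: rear row, split into thirds from left to right
    third = IMAGE_WIDTH_PX / 3
    if center_x < third:
        return "rear_left"
    if center_x < 2 * third:
        return "rear_center"
    return "rear_right"
```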
  • in the above example, the detection result of the second detection unit 5 is stored in the storage unit, but the second detection unit 5 may instead output the detection result directly to the presence/absence determination unit 6.
  • the occupant detection process using the captured image by the second detection unit 5 will be referred to as the second occupant detection process.
  • a detection result of the second occupant detection process by the second detection unit 5 is called a second detection result.
  • the presence/absence determination unit 6 of the present embodiment determines that an occupant is present in the rear seat if an occupant in the rear seat can be detected from the captured image, even if no occupant in the rear seat is detected from the sensor information. In this way, if the detection result regarding the presence or absence of a rear-seat occupant obtained from the sensor information differs from the detection result obtained from the captured image, the occupant detection unit 12 specifies the presence or absence of the rear-seat occupant using the detection result obtained from the captured image.
  • FIG. 4 is a flowchart showing an operation example of the vehicle monitoring device 10 according to Embodiment 1. Although the flowchart of FIG. 4 does not show a process for terminating the operation of the vehicle monitoring device 10, the vehicle monitoring device 10 terminates the first occupant detection process and the second occupant detection process when, for example, the vehicle information acquisition unit 3 acquires from the vehicle-side control device 200 a signal indicating that the vehicle 50 has stopped. In the following description, even if multiple rear seats are provided, they are collectively referred to as the rear seat; in practice, the processing described below is performed for each rear seat.
  • the first information acquiring section 1 of the vehicle monitoring device 10 acquires sensor information from the sensor 20 (ST101).
  • the first detection section 4 performs a first occupant detection process using sensor information (ST102).
  • the first detection unit 4 performs the first occupant detection process for each of the rear seats, that is, each of the rear seats provided in the vehicle.
  • the determination by the first detection unit 4 of whether or not an occupant has been detected in each of the rear seats may be made, for example, by determining in which of the rear seats the detection target is located based on the size and range of the detection target calculated from the distance data and angle data acquired as sensor information.
  • the first detection unit 4 outputs to the storage unit of the vehicle monitoring device 10 the detection result indicating whether or not the passenger in the rear seat has been detected.
  • if an occupant in the rear seat has been detected, the first detection unit 4 outputs a detection result indicating that an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and the rear seat first flag is turned ON (ST103).
  • if no occupant in the rear seat has been detected, the first detection unit 4 outputs a detection result indicating that no occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and the rear seat first flag is turned OFF (ST104).
  • the rear seat first flag stored in the storage unit indicates whether or not an occupant is detected in the rear seat by the first occupant detection process. That is, when the first detection unit 4 outputs the first detection result indicating that the passenger in the rear seat has been detected, the rear seat first flag is turned ON, and the first detection unit 4 detects the passenger in the rear seat. If the first detection result indicating that there is no seat is output, the rear seat first flag is turned OFF.
  • a plurality of rear seat first flags, or a single one, are provided according to the number of rear seats. For example, if the number of rear seats is three and the vehicle is provided with a seat on the left side of the rear seat, a seat on the right side of the rear seat, and a seat in the center of the rear seat, the rear seat first flags consist of a rear seat left side first flag, a rear seat right side first flag, and a rear seat center first flag.
  • the second information acquisition section 2 of the vehicle monitoring device 10 acquires the captured image from the imaging device 30 (ST105).
  • the second detection unit 5 uses the captured image to perform a second occupant detection process (ST106).
  • the second detection unit 5 performs the second occupant detection process for each of the rear seats, that is, each of the rear seats provided in the vehicle.
  • the determination by the second detection unit 5 of whether or not an occupant has been detected in each of the rear seats may be made, for example, by determining in which of the rear seats the detection target is located based on the characteristic information detected from the captured image and the coordinates, width, height, area, and the like of the face area.
  • the second detection unit 5 outputs the detection result as to whether or not an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10.
  • if an occupant in the rear seat has been detected, the second detection unit 5 outputs a detection result indicating that an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and the rear seat second flag is turned ON (ST107).
  • if no occupant in the rear seat has been detected, the second detection unit 5 outputs a detection result indicating that no occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and the rear seat second flag is turned OFF (ST108).
  • the rear seat second flag stored in the storage unit indicates whether or not a passenger is detected in the rear seat by the second passenger detection process. That is, when the second detection unit 5 outputs a second detection result indicating that a rear seat passenger has been detected, the rear seat second flag is turned ON, and the second detection unit 5 detects a rear seat passenger. If the second detection result indicating that there is no seat is output, the rear seat second flag is turned OFF.
  • a plurality of rear seat second flags, or a single one, are provided according to the number of rear seats. For example, if the number of rear seats is three and the vehicle is provided with a seat on the left side of the rear seat, a seat on the right side of the rear seat, and a seat in the center of the rear seat, the rear seat second flags consist of a rear seat left side second flag, a rear seat right side second flag, and a rear seat center second flag.
  • the presence/absence determination unit 6 of the vehicle monitoring device 10 acquires the first detection result from the first detection unit 4 and the second detection result from the second detection unit 5. Then, the presence/absence determination unit 6 uses the first detection result and the second detection result to determine whether or not an occupant is present in the rear seat.
  • The presence/absence determination unit 6 refers to the rear seat first flag as the first detection result and determines whether or not the rear seat first flag is ON (ST109), that is, checks whether or not a rear seat occupant has been detected from the sensor information. If the rear seat first flag is ON (ST109; YES), that is, if it is confirmed from the sensor information that a rear seat occupant has been detected, the presence/absence determination unit 6 determines that an occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST110).
  • The determination result of the presence/absence determination unit 6 is output to, for example, a notification control unit (not shown) of the vehicle-side control device 200 that controls a seat belt reminder. When the notification control unit receives the determination result indicating that an occupant is present in the rear seat, the seat belt reminder function for the rear seat is turned ON.
  • On the other hand, if the rear seat first flag is OFF (ST109; NO), the presence/absence determination unit 6 proceeds to the processing of ST111 described next.
  • The presence/absence determination unit 6 refers to the rear seat second flag as the second detection result and determines whether or not the rear seat second flag is ON (ST111), that is, checks whether or not a rear seat occupant has been detected from the captured image. If the rear seat second flag is ON (ST111; YES), that is, if it is confirmed that a rear seat occupant has been detected from the captured image, the presence/absence determination unit 6 determines that an occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST110).
  • If the rear seat second flag is OFF (ST111; NO), that is, if it is confirmed that a rear seat occupant has not been detected from the captured image, the presence/absence determination unit 6 determines that no occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST112).
  • In this way, when the occupant detection unit 12 obtains a detection result indicating that no rear seat occupant was detected from the sensor information and a detection result indicating that a rear seat occupant was detected from the captured image, it specifies that an occupant is present in the rear seat. As a result, even if a rear seat occupant who is actually present is not detected from the sensor information, the presence of the occupant is specified using the detection result obtained from the captured image when the occupant is detected from that image. It is therefore possible to prevent failure to detect the rear seat occupant and to improve the detection accuracy of the rear seat occupant, that is, the target person.
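Purely as a non-limiting illustration, the determination flow of ST109 to ST112 described above amounts to a logical OR of the two per-seat flags; the Python function below is an assumption introduced only for explanation, not the claimed implementation.

```python
def occupant_present_embodiment1(first_flag_on, second_flag_on):
    """Illustrative sketch of the presence determination of ST109-ST112:
    an occupant is judged present if either the sensor-based first flag
    or the image-based second flag is ON for the seat."""
    if first_flag_on:        # ST109; YES -> occupant present (ST110)
        return True
    if second_flag_on:       # ST111; YES -> occupant present (ST110)
        return True
    return False             # ST111; NO -> no occupant (ST112)
```

For example, when the sensor misses an occupant (`first_flag_on=False`) but the captured image detects one (`second_flag_on=True`), the image-based result prevents the detection omission.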
  • Note that the second detection unit 5 may perform the second occupant detection process only on the rear seats for which the first detection result indicates that no occupant has been detected. In this way, detection omissions can be suppressed, and the processing load can be reduced because the second occupant detection process is performed only for the seats where a detection omission may have occurred.
  • In the flowchart of FIG. 4, the vehicle monitoring device 10 performs the processing of ST101 to ST104 and then performs the processing of ST105 to ST108; however, the flowchart of FIG. 4 is an example, and the processing of ST101 to ST104 and the processing of ST105 to ST108 may be performed in parallel.
  • FIG. 5 is a diagram showing a hardware configuration example of the vehicle monitoring device 10 according to the first embodiment.
  • The functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit in the vehicle monitoring device 10 are realized by a processing circuit. That is, these units of the vehicle monitoring device 10 may be realized by a processing circuit 10a that is dedicated hardware as shown in FIG. 5A, or by a processor 10b that executes a program stored in a memory 10c as shown in FIG. 5B.
  • the processing circuit 10a is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-programmable Gate Array), or a combination of these.
  • The function of each of these units may be realized by an individual processing circuit, or the functions of the units may be collectively realized by a single processing circuit.
  • When the processing circuit is the processor 10b, the functions of each unit are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in the memory 10c.
  • The processor 10b reads and executes the programs stored in the memory 10c, thereby realizing the functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit.
  • That is, the storage unit comprises the memory 10c for storing programs that, when executed by the processor 10b, result in the execution of the steps shown in FIG. 4. It can also be said that these programs cause a computer to execute the procedures or methods of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit.
  • the processor 10b is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • The memory 10c may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically EPROM); a magnetic disk such as a hard disk or a flexible disk; or an optical disc such as a mini disc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
  • Note that some of the functions of the units described above may be realized by dedicated hardware and the rest by software or firmware.
  • the processing circuit 10a in the vehicle monitoring device 10 can realize each function described above by hardware, software, firmware, or a combination thereof.
  • At least part of the functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit may be executed by an external server.
  • As described above, the vehicle monitoring device 10 according to Embodiment 1 includes the first information acquisition unit 1 that acquires sensor information from the sensor 20 whose detection range includes the rear seats of the vehicle, the second information acquisition unit 2 that acquires a captured image from the imaging device 30 whose imaging range includes the front seats and rear seats of the vehicle, and the occupant detection unit 12 that detects a rear seat occupant from the sensor information and the captured image acquired by the first information acquisition unit 1 and the second information acquisition unit 2, respectively.
  • When the detection result regarding the presence or absence of a rear seat occupant obtained from the sensor information and the detection result regarding the presence or absence of a rear seat occupant obtained from the captured image differ, the presence or absence of the rear seat occupant is specified using the detection result obtained from the captured image. It is therefore possible to prevent failure to detect the rear seat occupant and to improve the detection accuracy of the rear seat occupant, that is, the target person.
  • A vehicle monitoring device 10 according to Embodiment 2 includes, as in Embodiment 1, the first information acquisition unit 1 that acquires sensor information, the second information acquisition unit 2 that acquires a captured image, and the occupant detection unit 12 that detects a rear seat occupant from the sensor information and the captured image acquired by the first information acquisition unit 1 and the second information acquisition unit 2, respectively.
  • The present embodiment differs from Embodiment 1 in the processing operation of the vehicle monitoring device 10, that is, in how the captured image is used to determine whether or not an occupant is present in the rear seat.
  • the target person in the detection of the vehicle monitoring system 100 is the passenger in the rear seat. It should be noted that the vehicle monitoring system 100 may detect not only rear-seat passengers, but also front-seat passengers as target persons.
  • For example, suppose that the sensor 20 is the radio wave sensor 21 and that the radio wave sensor 21 determines that a detection target is present on a seat when it receives a reflected wave from a moving object present on the seat. In such a case, even though the moving object present on the seat is an object other than an occupant, for example, water in a PET bottle, the presence of an occupant on the seat may be erroneously detected based on the reflected wave from the PET bottle.
  • Similarly, depending on the weight of an object on the seat, the presence of an occupant in the seat may be erroneously detected even though the object on the seat is an object other than an occupant.
  • As described above, erroneous detection may occur when the sensor 20, which is different from the imaging device 30, detects an occupant.
  • Therefore, in the present embodiment, when the second occupant detection process by the second detection unit 5 using the captured image reveals that no occupant is present in the rear seat, priority is given to the detection result using the captured image, and it is specified that no occupant is present in the rear seat.
  • the occupant detection processing by the second detection unit 5 of the present embodiment will be described below.
  • The second detection unit 5 of the present embodiment determines from the captured image whether or not an occupant is present in the rear seat.
  • FIG. 6 is an explanatory diagram showing an example of occupant detection using captured images according to the second embodiment.
  • FIG. 6 shows an example in which occupants 56 and 57 are seated in a driver's seat 51 and a front passenger seat 52 of the vehicle, respectively, and, among the rear seats, occupants 58 and 59 are seated in a seat 53 on the left side of the rear seat and a seat 54 on the right side of the rear seat, respectively.
  • In the seat 55 in the center of the rear seat, a child seat 61 is placed and no occupant is seated.
  • In this case, as described above, the detection result of the first occupant detection process using the sensor information may indicate that an occupant has been detected in the seat 55 in the center of the rear seat because of the child seat 61 placed there.
  • the second detection unit 5 acquires the captured image from the second information acquisition unit 2, and extracts the image in the set area from the captured image.
  • the set area is, for example, an area set so as to include a specific seat among the rear seats, such as the seat 55 in the center of the rear seat, as shown in area C in FIG.
  • Then, the second detection unit 5 identifies the object shown in the extracted image by comparing the extracted image with images for similarity determination stored in advance in the storage unit or the like, such as an image of a child seat 61 or an image of a PET bottle.
  • For the identification of the object in the image extracted by the second detection unit 5, a known object recognition method such as instance segmentation or template matching can be used.
  • When the second detection unit 5 identifies that the object in the image extracted for a specific seat among the rear seats is, for example, a child seat 61 or the like, that is, an object other than an occupant, it outputs a detection result indicating that an object other than an occupant is present, in other words, that no occupant is present in that rear seat, to the storage unit of the vehicle monitoring device 10. On the other hand, there are cases where the object on the specific seat cannot be specified, for example, because it is blocked by occupants in seats other than the specific seat. When the object shown in the extracted image cannot be specified, the second detection unit 5 outputs a detection result indicating that it is unclear whether or not an occupant is present in the rear seat to the storage unit of the vehicle monitoring device 10. Note that the detection process described above is an example, and the object in the captured image may also be specified using various known algorithms.
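Purely as a non-limiting illustration of the region extraction and similarity comparison described above, the following Python sketch crops a set area from a grayscale image (given as a list of pixel rows) and compares it against stored reference images by mean absolute difference. The function names, the 0-to-1 similarity score, and the 0.9 threshold are all assumptions introduced for explanation; a practical implementation would instead use a known method such as template matching or instance segmentation.

```python
def extract_region(image, top, left, height, width):
    """Crop the set area (e.g. the region covering the rear-center seat)
    from a 2D grayscale image given as a list of rows."""
    return [row[left:left + width] for row in image[top:top + height]]

def similarity(patch, template):
    """Mean absolute pixel difference mapped to a 0..1 score;
    1.0 means the patch matches the stored reference exactly."""
    total, count = 0, 0
    for prow, trow in zip(patch, template):
        for p, t in zip(prow, trow):
            total += abs(p - t)
            count += 1
    return 1.0 - total / (255.0 * count)

def classify_object(patch, references, threshold=0.9):
    """Compare the extracted patch against stored reference images
    (child seat, PET bottle, ...); return the best-matching label at
    or above the threshold, or None when the object cannot be
    specified (e.g. it is occluded or unlike any reference)."""
    best_label, best_score = None, threshold
    for label, template in references.items():
        score = similarity(patch, template)
        if score >= best_score:
            best_label, best_score = label, score
    return best_label
```

Returning `None` here corresponds to the "unclear whether or not an occupant is present" result that the second detection unit 5 outputs when the object cannot be specified.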
  • The process by which the second detection unit 5 determines whether or not an occupant is present in the rear seat is not limited to the above example.
  • In Embodiment 1, an example in which the second detection unit 5 performs face detection processing on the captured image has been described; a detection area may also be set for each rear seat, and the face detection processing may be performed within the detection area.
  • In this case, if a rear seat occupant is not detected within the detection area provided for a rear seat while the occupant in the driver's seat 51 or the front passenger seat 52, that is, the front seat occupant, does not intrude into that detection area, it becomes clear that no occupant is seated in that rear seat. This is because, when a front seat occupant does not enter the detection area provided for a rear seat, the rear seat occupant is unlikely to be blocked from view by the front seat occupant.
  • FIG. 7 is an explanatory diagram showing an example of occupant detection using captured images according to the second embodiment.
  • FIG. 7 shows an example in which occupants 56 and 57 are seated in a driver's seat 51 and a front passenger seat 52 of the vehicle, respectively, and an occupant 59 is seated in a seat 54 on the right side of the rear seat.
  • In the seat 55 in the center of the rear seat, neither an occupant nor an object is present.
  • In FIG. 7, the occupants 56 and 57 in the driver's seat 51 and the front passenger seat 52 do not intrude into the detection area provided for the center of the rear seat (area E shown in FIG. 7), and no occupant's face is detected within that detection area. In such a case, since an occupant is not detected in the center of the rear seat by the detection process using the captured image, it is clear that no occupant is present in the center of the rear seat.
  • On the other hand, FIG. 7 shows an example in which the occupant 59 in the seat 54 on the right side of the rear seat is shielded by the occupant 56 in the driver's seat 51. That is, in the detection area provided for the right side of the rear seat (area D shown in FIG. 7), the occupant 59 on the right side of the rear seat is not detected, while the occupant 56 in the driver's seat 51 intrudes significantly into that detection area. In this case, even though an occupant is not detected on the right side of the rear seat by the detection process using the captured image, it is not clear whether or not an occupant is present on the right side of the rear seat.
  • Therefore, if the area of the face region of the occupant 56 or 57 in the driver's seat 51 or the front passenger seat 52 detected within the detection area provided for a rear seat in the captured image is less than a set threshold value, the second detection unit 5 determines that the rear seat occupant is not blocked by a front seat occupant and that no occupant is present in that rear seat. On the other hand, if the area of the face region of the occupant in the driver's seat or the front passenger seat detected within the detection area provided for the rear seat is greater than or equal to the set threshold value, the second detection unit 5 determines that it is unclear whether or not an occupant is present in the rear seat.
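The three-valued, per-seat decision just described (occupant detected, clearly absent, or unclear due to possible occlusion) can be sketched as follows. This is an illustrative sketch only, not the claimed implementation; the function name and string labels are assumptions introduced for explanation.

```python
def rear_seat_image_result(rear_face_found, front_face_area_in_region,
                           occlusion_threshold):
    """Three-valued image-based result for one rear seat:
    'present' - a rear occupant's face was detected in the detection area,
    'absent'  - no face, and no significant front-seat intrusion,
    'unclear' - no face, but a front occupant's face occupies the area
                at or above the threshold (possible occlusion)."""
    if rear_face_found:
        return "present"
    if front_face_area_in_region < occlusion_threshold:
        return "absent"
    return "unclear"
```

In FIG. 7 terms, area E (no face, no intrusion) maps to `'absent'`, while area D (no rear face, but large intrusion by occupant 56) maps to `'unclear'`.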
  • Then, the occupant detection unit 12 of the present embodiment specifies the presence or absence of a rear seat occupant using the detection result regarding the presence or absence of the rear seat occupant obtained from the captured image. That is, even if the presence/absence determination unit 6 obtains from the sensor information a detection result indicating that an occupant is present in the rear seat, when a detection result indicating that no occupant is present in the rear seat is obtained from the captured image by the second detection unit 5, the presence/absence determination unit 6 determines that no occupant is present in the rear seat using the detection result obtained from the captured image.
  • In this way, even when the detection result obtained from the sensor information indicating that an occupant is present in the rear seat is due to an erroneous detection, it is specified that no occupant is present in the rear seat if it is clear from the captured image that no occupant is present there. Therefore, erroneous detection in the occupant detection process can be suppressed, and the accuracy of occupant detection can be improved.
  • Note that, when the presence/absence determination unit 6 obtains from the sensor information a detection result indicating that an occupant is present in the rear seat and obtains from the captured image a detection result indicating that it is unclear whether or not an occupant is present in the rear seat, it determines that an occupant is present in the rear seat.
  • FIG. 8 is a flow chart showing an operation example of the vehicle monitoring device 10 according to the second embodiment.
  • In the following, steps that are the same as the processing of the vehicle monitoring device 10 according to Embodiment 1 are denoted by the same reference numerals as those shown in FIG. 4, and description thereof will be omitted or simplified.
  • Although the flowchart of FIG. 8 does not show a process for terminating the operation of the vehicle monitoring device 10, the vehicle monitoring device 10 terminates the operations of the first occupant detection process and the second occupant detection process when, for example, the vehicle information acquisition unit 3 acquires predetermined information on the vehicle 50 from the vehicle-side control device 200.
  • In the following description, the rear seats are collectively referred to as the rear seat; when there are a plurality of rear seats, the processing is performed for each seat.
  • the first information acquiring section 1 of the vehicle monitoring device 10 acquires sensor information from the sensor 20 (ST101).
  • the first detection section 4 performs a first occupant detection process using sensor information (ST102).
  • the first detection unit 4 outputs the detection result as to whether or not an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 .
  • When a rear seat occupant is detected, the first detection unit 4 outputs a detection result indicating that a rear seat occupant has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and the rear seat first flag is turned ON (ST103).
  • On the other hand, when a rear seat occupant is not detected, the first detection unit 4 outputs a detection result indicating that a rear seat occupant has not been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and the rear seat first flag is turned OFF (ST104).
  • the second information acquisition section 2 of the vehicle monitoring device 10 acquires the captured image from the imaging device 30 (ST105).
  • the second detection unit 5 uses the captured image to perform a second occupant detection process (ST106).
  • the second detection unit 5 outputs the detection result as to whether or not an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 .
  • When a rear seat occupant is detected, the second detection unit 5 outputs a detection result indicating that a rear seat occupant has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and the rear seat second flag is turned ON (ST107).
  • On the other hand, when a rear seat occupant is not detected, the second detection unit 5 outputs a detection result indicating that a rear seat occupant has not been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and the rear seat second flag is turned OFF (ST108).
  • Furthermore, the second detection unit 5 determines whether or not it is found that no occupant is present in the rear seat (ST201).
  • The second detection unit 5 determines whether or not an occupant is present in the rear seat by, for example, extracting an image in a region set so as to include a specific seat among the rear seats in the captured image and identifying whether or not the object shown in the extracted image is an occupant.
  • When the second detection unit 5 determines that no occupant is present in the rear seat (ST201; YES), it outputs a detection result indicating that no occupant is present in the rear seat to the storage unit, and the rear seat third flag is turned ON (ST202).
  • On the other hand, when the second detection unit 5 determines that it is unclear whether or not an occupant is present in the rear seat (ST201; NO), it outputs a detection result indicating that it is unclear whether or not an occupant is present in the rear seat to the storage unit, and the rear seat third flag is turned OFF (ST203).
  • Here, the rear seat third flag stored in the storage unit indicates whether or not it is found that no occupant is present in the rear seat. That is, when the second detection unit 5 outputs a detection result indicating that no occupant is present in the rear seat, the rear seat third flag is turned ON, and when the second detection unit 5 outputs a detection result indicating that it is unclear whether or not an occupant is present in the rear seat, the rear seat third flag is turned OFF.
  • A single rear seat third flag or a plurality of rear seat third flags are provided according to the number of rear seats. For example, if the number of rear seats is three and the vehicle is provided with a seat on the left side of the rear seat, a seat on the right side of the rear seat, and a seat in the center of the rear seat, the rear seat third flags consist of a rear seat left third flag, a rear seat right third flag, and a rear seat center third flag.
  • Next, the presence/absence determination unit 6 of the vehicle monitoring device 10 acquires the second detection result from the second detection unit 5 and uses the second detection result to determine whether or not an occupant is present in the rear seat.
  • The presence/absence determination unit 6 refers to the rear seat second flag as the second detection result and determines whether or not the rear seat second flag is ON (ST204), that is, checks whether or not a rear seat occupant has been detected from the captured image.
  • If the rear seat second flag is ON (ST204; YES), that is, if it is confirmed that a rear seat occupant has been detected from the captured image, the presence/absence determination unit 6 determines that an occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST205).
  • The determination result of the presence/absence determination unit 6 is output to, for example, the notification control unit of the vehicle-side control device 200 that controls the seat belt reminder. When the notification control unit receives the determination result indicating that an occupant is present in the rear seat, the seat belt reminder function for the rear seat is turned ON.
  • On the other hand, if the rear seat second flag is OFF (ST204; NO), the presence/absence determination unit 6 proceeds to the processing described below. First, the presence/absence determination unit 6 acquires the first detection result from the first detection unit 4. Then, the presence/absence determination unit 6 refers to the rear seat first flag as the first detection result and determines whether or not the rear seat first flag is ON (ST206), that is, checks whether or not a rear seat occupant has been detected from the sensor information.
  • If the rear seat first flag is OFF (ST206; NO), that is, if it is confirmed that a rear seat occupant has not been detected from the sensor information, the presence/absence determination unit 6 determines that no occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST207).
  • On the other hand, if the rear seat first flag is ON (ST206; YES), the presence/absence determination unit 6 proceeds to the processing of ST208 described below.
  • The presence/absence determination unit 6 refers to the rear seat third flag and determines whether or not the rear seat third flag is ON (ST208), that is, checks whether or not it is found from the captured image that no occupant is present in the rear seat. If the rear seat third flag is ON (ST208; YES), that is, if it is found from the captured image that no occupant is present in the rear seat, the presence/absence determination unit 6 determines that no occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST207).
  • On the other hand, if the rear seat third flag is OFF (ST208; NO), that is, if the presence/absence determination unit 6 has obtained from the sensor information a detection result indicating that an occupant is present in the rear seat and has confirmed from the captured image that it is unclear whether or not an occupant is present, it determines that an occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST205).
  • In this way, even when the vehicle monitoring device 10 obtains from the sensor information a detection result indicating that an occupant is present in the rear seat although no occupant is actually present, it specifies that no occupant is present in the rear seat if it is clear from the captured image that no occupant is present there. Therefore, erroneous detection can be suppressed, and the detection accuracy of the rear seat occupant, that is, the target person, can be improved.
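As with Embodiment 1, the determination flow of ST204 to ST208 described above can be condensed into a small truth function over the three per-seat flags. The Python sketch below is purely illustrative and not the claimed implementation; the function and parameter names are assumptions introduced only for explanation.

```python
def occupant_present_embodiment2(second_flag_on, first_flag_on, third_flag_on):
    """Illustrative sketch of the presence determination of ST204-ST208.
    second flag: the captured image detected an occupant
    first flag:  the sensor information detected an occupant
    third flag:  the captured image showed the seat is clearly unoccupied"""
    if second_flag_on:     # ST204; YES -> occupant present (ST205)
        return True
    if not first_flag_on:  # ST206; NO -> no occupant (ST207)
        return False
    if third_flag_on:      # ST208; YES -> image overrides sensor (ST207)
        return False
    return True            # sensor says present, image unclear (ST205)
```

The third branch is where this embodiment differs from Embodiment 1: a sensor-based detection is overridden when the image makes it clear the seat is empty, suppressing false positives such as the child seat example above.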
  • Note that, as in Embodiment 1, the second occupant detection process may be performed only for the rear seats for which the first detection result indicates that no occupant has been detected.
  • In the flowchart of FIG. 8, the vehicle monitoring device 10 performs the processing of ST101 to ST104 and then performs the processing of ST105 to ST108 and ST201 to ST203; however, the flowchart of FIG. 8 is an example, and, for example, the processing of ST101 to ST104 and the processing of ST105 to ST108 and ST201 to ST203 may be performed in parallel.
  • Although an example in which the occupant detection unit 12 detects a rear seat occupant and specifies the presence or absence of the rear seat occupant has been described, the occupant detection unit 12 may detect a front seat occupant and specify the presence or absence of the front seat occupant.
  • In this case, the sensor 20 may be provided so as to include the front seats and rear seats of the vehicle in its detection range, and the first detection unit 4 may detect the front seat occupant.
  • When the occupant detection unit 12 specifies the presence or absence of a front seat occupant by detecting the front seat occupant from the sensor information and the captured image in the same manner as in the operation example described with reference to the flowchart of FIG. 4, the presence of the front seat occupant is specified if a detection result indicating that no front seat occupant was detected is obtained from the sensor information and a detection result indicating that a front seat occupant was detected is obtained from the captured image. It is therefore possible to improve the detection accuracy of the front seat occupant, that is, the target person.
  • Likewise, when the occupant detection unit 12 specifies the presence or absence of a front seat occupant by detecting the front seat occupant from the sensor information and the captured image in the same manner as in the operation example described with reference to the flowchart of FIG. 8, it is determined that no occupant is present in the front seat if a detection result indicating that an occupant is present in the front seat is obtained from the sensor information and a detection result indicating that no occupant is present in the front seat is obtained from the captured image. It is therefore possible to suppress erroneous detection and to improve the detection accuracy of the front seat occupant, that is, the target person.
  • In addition, the first detection unit 4 may use the sensor information to perform at least one of an attribute estimation process for estimating attributes such as the physique, age, or sex of the detected occupant and a state estimation process for estimating states such as the posture.
  • For example, the sensor 20 may be composed of a radio wave sensor 21, an ultrasonic sensor, or the like, and the first detection unit 4 may estimate the physique, posture, and the like of the occupant using distance data, angle data, and the like obtained from the sensor 20.
  • Various well-known algorithms can be used for the attribute estimation processing and the state estimation processing described above, and detailed description thereof will be omitted.
  • the second detection unit 5 detects an occupant using feature information related to the occupant's face, such as the occupant's face area and the positions of facial parts in the captured image.
  • the second detection unit 5 may detect the occupant using feature information relating to the physique, such as the skeletal points of the occupant in the captured image.
  • Furthermore, using the feature information on the occupant's face and physique, the second detection unit 5 may perform at least one of an attribute estimation process for estimating attributes such as the physique, age, and sex of the detected occupant and a state estimation process for estimating states such as the posture, drowsiness, and physical condition of the occupant.
  • In addition, occupant authentication that associates the detected occupant with a specific individual may be performed, and the authentication result may be included in the attributes.
  • Various well-known algorithms can be used for the attribute estimation processing and the state estimation processing described above, and detailed description thereof will be omitted.
  • The first detection unit 4 may use the sensor information to perform at least one of the occupant attribute estimation process and the state estimation process, and the second detection unit 5 may use the captured image to perform at least one of the occupant attribute estimation process and the state estimation process.
  • For example, when the resolution of the sensor information acquired from the radio wave sensor 21 cannot be guaranteed and the physique of each of a plurality of occupants seated in a row cannot be estimated accurately, the reliability of the attribute estimation results and the state estimation results can be improved by, for example, having the second detection unit 5 estimate the physique of each occupant using the captured image.
  • a vehicle monitoring apparatus 80 according to Embodiment 3 includes a first information acquisition section 1 that acquires sensor information and a second information acquisition section 2 that acquires a captured image, as in Embodiment 1.
  • the vehicle monitoring device 80 differs from Embodiment 1 in that it includes a vehicle exterior detection unit 81 that detects a person outside the vehicle 50 (hereinafter referred to as a person outside the vehicle).
  • the same reference numerals are given to the same components as in the first embodiment, and the description thereof is omitted.
  • FIG. 9 is a block diagram showing a configuration example of the vehicle monitoring system 101 according to the third embodiment.
  • the vehicle monitoring system 101 includes a vehicle monitoring device 80, a sensor 20, and an imaging device 30.
  • the vehicle monitoring device 80, the sensor 20, and the imaging device 30 are mounted on the vehicle 50, respectively.
  • the vehicle monitoring device 80 also includes a vehicle exterior detection unit 81 that detects a person outside the vehicle.
  • the vehicle exterior detection unit 81 has a first vehicle exterior detection unit 7, a second vehicle exterior detection unit 8, and a vehicle exterior presence/absence determination unit 9, and detects a person outside the vehicle. Details of each configuration of the vehicle exterior detection unit 81 will be described later. That is, the target person in detection by the vehicle monitoring system 101 according to this embodiment is a person outside the vehicle.
  • the vehicle monitoring system 101 may detect not only persons outside the vehicle but also passengers in the front seats or passengers in the rear seats as target persons for detection.
  • the sensor 20 in the present embodiment is, for example, a sensor different from the imaging device 30, such as the radio wave sensor 21 or an ultrasonic sensor, and is provided on the ceiling of the passenger compartment, on the exterior of the vehicle 50, or the like. If the sensor 20 is the radio wave sensor 21 or an ultrasonic sensor, one or more sensors are provided on the ceiling of the passenger compartment or the like so that the detection range includes the area within a set distance from the exterior of the vehicle 50. Note that the set distance is, for example, about 15 cm, and can be set as appropriate as long as a person present near the vehicle 50 outside the vehicle can be detected.
  • FIG. 10 is an explanatory diagram showing an example of detection of a person outside the vehicle by the vehicle monitoring device 80 according to the third embodiment.
  • FIG. 10 is a top view of the inside and outside of a vehicle 50 equipped with a vehicle monitoring device 80, and shows an example in which a person 70 outside the vehicle is present near the right side of the rear seat.
  • a region F in FIG. 10 indicates the imaging range of the imaging device 30, and as shown in FIG. 10, the imaging device 30 includes the outside of the vehicle in the imaging range.
  • the information acquisition unit 11 of the vehicle monitoring device 80 has the first information acquisition unit 1 and the second information acquisition unit 2 as in the first embodiment.
  • the first information acquisition section 1 of the information acquisition section 11 is connected to the sensor 20 and acquires sensor information from the sensor 20 .
  • the sensor information is information regarding the presence or absence of a person outside the vehicle.
  • the sensor information includes distance data indicating the distance between the radio wave sensor 21 and the detection target, angle data indicating the angle of the detection target with respect to the radio wave sensor 21, and the like.
  • the sensor information may also be voice data or the like indicating the volume, direction of arrival, and so on of a voice uttered by a person outside the vehicle.
  • the sensor information may be any one of these, or a combination of a plurality of them.
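As an illustration only (the class and field names are hypothetical, not from the patent), the sensor information variants listed above could be represented as one record in which unused fields stay empty:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorInfo:
    """Hypothetical container for the sensor information variants:
    radar distance/angle data and/or voice data attributes."""
    distance_m: Optional[float] = None          # distance to the detection target
    angle_deg: Optional[float] = None           # angle of the target w.r.t. the sensor
    voice_volume: Optional[float] = None        # volume of a voice heard outside
    voice_direction_deg: Optional[float] = None # direction of arrival of the voice

    def has_radar_data(self) -> bool:
        # Both distance and angle are needed to locate a target by radar.
        return self.distance_m is not None and self.angle_deg is not None

info = SensorInfo(distance_m=0.12, angle_deg=30.0)
print(info.has_radar_data())  # True
```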
  • the second information acquisition section 2 of the information acquisition section 11 is connected to the imaging device 30 and acquires a captured image from the imaging device 30 .
  • although the first information acquisition unit 1 and the second information acquisition unit 2 are shown separately in the example of FIG. 9, the information acquisition unit 11 may acquire the sensor information and the captured image with a single configuration.
  • the information acquisition unit 11 then outputs the acquired sensor information and captured image to the vehicle exterior detection unit 81 .
  • the first vehicle exterior detection unit 7 of the vehicle exterior detection unit 81 detects a person outside the vehicle using sensor information.
  • the second vehicle exterior detection unit 8 of the vehicle exterior detection unit 81 detects a person outside the vehicle using the captured image.
  • the vehicle exterior detection unit 81 includes the vehicle exterior presence/absence determination unit 9, which uses the detection results of the first vehicle exterior detection unit 7 and the second vehicle exterior detection unit 8 to determine whether or not a person exists outside the vehicle.
  • the process of detecting a person outside the vehicle by the vehicle exterior detection unit 81 will be described.
  • the first vehicle exterior detection unit 7 detects a person outside the vehicle using distance data and angle data acquired from the radio wave sensor 21 as sensor information.
  • the first vehicle exterior detection unit 7 acquires the distance data and the angle data as sensor information from the first information acquisition unit 1 of the information acquisition unit 11, and calculates the size of a detection target such as a person or an animal and the range in which the detection target exists. For example, if the first vehicle exterior detection unit 7 finds from the calculated size and range of the detection target that the detection target exists near the vehicle 50, such as near a window on the outside of the vehicle 50, it derives a detection result that a person exists outside the vehicle and outputs the detection result to the storage unit of the vehicle monitoring device 80.
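The distance/angle processing described above can be sketched with a simplified 2-D model. This is an illustrative assumption, not the patent's method: the function names and the coordinate convention are hypothetical, and the "set distance" constant reuses the approximately 15 cm value mentioned earlier for the detection range near the vehicle body.

```python
import math

SET_DISTANCE_M = 0.15  # the "set distance" of about 15 cm mentioned in the text

def target_position(distance_m, angle_deg):
    """Convert a radar distance/angle reading into x/y coordinates
    relative to the sensor (simplified 2-D polar-to-Cartesian model)."""
    rad = math.radians(angle_deg)
    return distance_m * math.cos(rad), distance_m * math.sin(rad)

def near_vehicle_exterior(gap_to_body_m):
    """Judge that the detection target is near the vehicle when its gap
    to the vehicle body is within the set distance."""
    return gap_to_body_m <= SET_DISTANCE_M

x, y = target_position(1.0, 0.0)
print(round(x, 3), round(y, 3))   # 1.0 0.0
print(near_vehicle_exterior(0.10))  # True: within 15 cm of the body
```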
  • hereinafter, the detection process using the sensor information by the first vehicle exterior detection unit 7 is referred to as the first vehicle exterior detection process, and the detection result of the first vehicle exterior detection process is referred to as the first vehicle exterior detection result.
  • however, the reliability of the detection result obtained from the sensor information alone may become an issue.
  • if the sensor 20 is the radio wave sensor 21, depending on the positional relationship between the radio wave sensor 21 and the person outside the vehicle, the person outside the vehicle may be detected as larger than the actual size, or part of the body of the person outside the vehicle may not be detected; therefore, it may not be possible to accurately determine whether the detection target is a person.
  • when the imaging device 30 is used to detect a person outside the vehicle, a more reliable detection result can be obtained than when the sensor 20, which is different from the imaging device 30, is used. Therefore, if a person outside the vehicle is detected from the captured image, the detection accuracy of the person outside the vehicle, that is, the target person, can be improved by using the detection result of the person detection process using the captured image.
  • the second vehicle exterior detection unit 8 analyzes the captured image and detects the face of a person outside the vehicle in the captured image. Then, the second vehicle exterior detection unit 8 acquires a face area, which is an area in which the face of the person outside the vehicle is detected, and the facial feature information of the person outside the vehicle in the face area.
  • the feature information of the face of the person outside the vehicle is, for example, the contrast ratio of the eyes, nose, mouth, and cheeks after normalizing the size of the face.
  • the second vehicle exterior detection unit 8 may determine whether or not face detection has succeeded using facial feature information of a person outside the vehicle.
  • for example, the second vehicle exterior detection unit 8 may determine that face detection has succeeded if the luminance distribution of the face region is found to be face-like from the contrast ratio in the face region, and may determine that face detection has failed if the luminance distribution is found not to be face-like.
  • after detecting the face of the person outside the vehicle, the second vehicle exterior detection unit 8 acquires, from the captured image, the coordinates related to the face area surrounding the face of the person outside the vehicle, such as a rectangle touching the contour of the face of the person outside the vehicle.
  • the coordinates relating to the face area are, for example, the coordinates of each vertex, center, etc. of the rectangle when the face area is rectangular.
  • the second vehicle exterior detection unit 8 calculates dimensions such as the width, height, and area of the face region from the coordinates of the face region.
  • the second vehicle exterior detection unit 8 identifies whether or not there is a person outside the vehicle from the acquired coordinates, width, height, area, and the like of the face area, and outputs information regarding the presence or absence of the person outside the vehicle as the detection result to the storage unit of the vehicle monitoring device 80.
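The dimension calculation described above can be sketched as follows (an illustrative example only; the function name, corner-coordinate convention, and pixel values are assumptions, not from the patent):

```python
def face_region_dimensions(top_left, bottom_right):
    """Derive the width, height, and area of a rectangular face region
    from two corner coordinates in pixel units."""
    x1, y1 = top_left
    x2, y2 = bottom_right
    width = x2 - x1
    height = y2 - y1
    return width, height, width * height

# A 60 x 80 pixel face region at an arbitrary position in the image.
w, h, area = face_region_dimensions((100, 80), (160, 160))
print(w, h, area)  # 60 80 4800
```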
  • the detection processing by the second vehicle exterior detection unit 8 is not limited to the above example, and various known algorithms can be used. Further, the second vehicle exterior detection unit 8 may detect a person outside the vehicle using feature information related to the physique of the person outside the vehicle, such as the skeletal points of the person outside the vehicle in the captured image.
  • the detection result of the second vehicle exterior detection unit 8 may be stored in the storage unit of the vehicle monitoring device 80.
  • the detection processing using the captured image by the second vehicle exterior detection unit 8 will be referred to as the second vehicle exterior detection processing.
  • the detection result of the second vehicle-exterior detection process by the second vehicle-exterior detection unit 8 is referred to as a second vehicle-exterior detection result.
  • the vehicle exterior presence/absence determination unit 9 determines that a person exists outside the vehicle if the person can be detected from the captured image, even if no person outside the vehicle is detected from the sensor information. In this manner, when the detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from the detection result regarding the presence or absence of a person outside the vehicle obtained from the captured image, the vehicle exterior detection unit 81 specifies the presence or absence of a person outside the vehicle using the detection result obtained from the captured image.
  • FIG. 11 is a flow chart showing an operation example of the vehicle monitoring device 80 according to the third embodiment.
  • when the vehicle information acquisition unit 3 acquires, from the vehicle-side control device 200, a signal indicating that the engine of the vehicle 50 has started, the vehicle monitoring device 80 starts the first vehicle exterior detection process and the second vehicle exterior detection process.
  • the first information acquiring section 1 of the vehicle monitoring device 80 acquires sensor information from the sensor 20 (ST301).
  • the first vehicle exterior detection unit 7 uses the sensor information to perform the first vehicle exterior detection process (ST302).
  • the determination of whether or not a person outside the vehicle has been detected by the first vehicle exterior detection unit 7 may be performed, for example, by determining, from the size and existing range of the detection target calculated from the distance data and angle data acquired as sensor information, whether or not the detected detection target exists outside the vehicle.
  • the first vehicle exterior detection unit 7 outputs a detection result indicating whether or not a person outside the vehicle has been detected to the storage unit of the vehicle monitoring device 80 .
  • if a person outside the vehicle is detected, the first vehicle exterior detection unit 7 outputs the detection result indicating that a person outside the vehicle has been detected as the first vehicle exterior detection result to the storage unit of the vehicle monitoring device 80, and turns ON the first vehicle exterior flag (ST303).
  • if a person outside the vehicle is not detected, the first vehicle exterior detection unit 7 outputs the detection result indicating that a person outside the vehicle has not been detected as the first vehicle exterior detection result to the storage unit of the vehicle monitoring device 80, and turns OFF the first vehicle exterior flag (ST304).
  • the first vehicle exterior flag stored in the storage unit indicates whether or not a person has been detected outside the vehicle by the first vehicle exterior detection process. That is, when the first vehicle exterior detection unit 7 outputs a first vehicle exterior detection result indicating that a person outside the vehicle has been detected, the first vehicle exterior flag is turned ON, and when it outputs a first vehicle exterior detection result indicating that a person outside the vehicle has not been detected, the first vehicle exterior flag is turned OFF.
  • the second information acquisition section 2 of the vehicle monitoring device 80 acquires the captured image from the imaging device 30 (ST305).
  • the second vehicle exterior detection unit 8 performs a second vehicle exterior detection process using the captured image (ST306).
  • the determination of whether or not a person outside the vehicle has been detected by the second vehicle exterior detection unit 8 may be performed, for example, by determining, from the coordinates, width, height, area, and the like of the face area acquired from the captured image, whether or not the detected face exists outside the vehicle.
  • the second vehicle exterior detection unit 8 outputs the detection result as to whether or not a person outside the vehicle has been detected to the storage unit of the vehicle monitoring device 80 .
  • if a person outside the vehicle is detected, the second vehicle exterior detection unit 8 outputs the detection result indicating that a person outside the vehicle has been detected as the second vehicle exterior detection result to the storage unit of the vehicle monitoring device 80, and turns ON the second vehicle exterior flag (ST307).
  • if a person outside the vehicle is not detected, the second vehicle exterior detection unit 8 outputs the detection result indicating that a person outside the vehicle has not been detected as the second vehicle exterior detection result to the storage unit of the vehicle monitoring device 80, and turns OFF the second vehicle exterior flag (ST308).
  • the second vehicle exterior flag stored in the storage unit indicates whether or not a person has been detected outside the vehicle by the second vehicle exterior detection process. That is, when the second vehicle exterior detection unit 8 outputs a second vehicle exterior detection result indicating that a person outside the vehicle has been detected, the second vehicle exterior flag is turned ON, and when it outputs a second vehicle exterior detection result indicating that a person outside the vehicle has not been detected, the second vehicle exterior flag is turned OFF.
  • the vehicle-exterior presence/absence determination unit 9 of the vehicle monitoring device 80 acquires the first vehicle-exterior detection result from the first vehicle-exterior detection unit 7 and acquires the second vehicle-exterior detection result from the second vehicle-exterior detection unit 8 . Then, the vehicle exterior presence/absence determination unit 9 determines whether or not a person is present outside the vehicle using the first vehicle exterior detection result and the second vehicle exterior detection result.
  • the vehicle exterior presence/absence determination unit 9 refers to the first vehicle exterior flag as the first vehicle exterior detection result and confirms whether or not the first vehicle exterior flag is ON (ST309). If the first vehicle exterior flag is ON (ST309; YES), that is, if it is confirmed from the sensor information that a person outside the vehicle has been detected, the vehicle exterior presence/absence determination unit 9 determines that a person exists outside the vehicle and outputs the determination result to the vehicle-side control device 200 (ST310).
  • the determination result from the vehicle exterior presence/absence determination unit 9 is output to, for example, a notification control unit (not shown) of the vehicle-side control device 200 that notifies the presence of a person outside the vehicle. When the notification control unit receives the determination result that a person exists outside the vehicle, it notifies the driver or the like of the existence of the person outside the vehicle via a notification unit of the vehicle or a terminal (not shown) possessed by the driver. In this way, the driver of the vehicle 50 can recognize that a person such as a suspicious person is present in the vicinity of the vehicle.
  • if the first vehicle exterior flag is OFF (ST309; NO), the vehicle exterior presence/absence determination unit 9 proceeds to the processing of ST311 described below.
  • the vehicle exterior presence/absence determination unit 9 refers to the second vehicle exterior flag as the second vehicle exterior detection result and confirms whether or not the second vehicle exterior flag is ON (ST311). If the second vehicle exterior flag is ON (ST311; YES), that is, if it is confirmed from the captured image that a person outside the vehicle has been detected, the vehicle exterior presence/absence determination unit 9 determines that a person exists outside the vehicle and outputs the determination result to the vehicle-side control device 200 (ST310).
  • if the second vehicle exterior flag is OFF (ST311; NO), the vehicle exterior presence/absence determination unit 9 determines that there is no person outside the vehicle and outputs the determination result to the vehicle-side control device 200 (ST312).
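The branch logic of ST309 to ST312 can be sketched as follows (a minimal illustrative example; the function and parameter names are hypothetical, not from the patent). A person is judged to be present if either flag is ON, and absent only when both are OFF:

```python
def person_outside_vehicle(first_flag_on, second_flag_on):
    """Determination corresponding to FIG. 11, ST309-ST312."""
    if first_flag_on:    # ST309: the sensor information detected a person
        return True      # ST310: a person exists outside the vehicle
    if second_flag_on:   # ST311: the captured image detected a person
        return True      # ST310
    return False         # ST312: neither process detected a person

# The captured-image result fills in cases the sensor missed:
print(person_outside_vehicle(False, True))  # True
```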
  • when the vehicle exterior detection unit 81 obtains a detection result indicating that a person outside the vehicle has not been detected from the sensor information but obtains a detection result indicating that a person outside the vehicle has been detected from the captured image, it identifies that a person is present outside the vehicle.
  • thus, even if a person who is actually present outside the vehicle is not detected from the sensor information, the presence of the person outside the vehicle can be detected from the captured image.
  • since it is determined that there is a person outside the vehicle using the detection result obtained from the captured image, it is possible to prevent omissions in detecting the person outside the vehicle and to improve the detection accuracy of the person outside the vehicle, that is, the target person.
  • the vehicle monitoring device 80 performs the processing of ST301 to ST304 and then performs the processing of ST305 to ST308, but the flowchart of FIG. 11 is an example.
  • the processing of ST301 to ST304 and the processing of ST305 to ST308 may be performed in parallel.
  • even when the vehicle exterior presence/absence determination unit 9 obtains from the first vehicle exterior detection unit 7 a detection result indicating that a person is present outside the vehicle based on the sensor information, if a detection result indicating that no person is present outside the vehicle is obtained from the captured image by the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9 may determine that there is no person outside the vehicle using the detection result obtained from the captured image.
  • in this way, when the detection result obtained from the sensor information that there is a person outside the vehicle is due to erroneous detection, if it is clear from the captured image that there is no person outside the vehicle, false detection of a person outside the vehicle can be suppressed and the detection accuracy of the person outside the vehicle, that is, the target person, can be improved.
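The variant described above can be sketched as follows (illustrative only; names are hypothetical, not from the patent). Here the captured-image result takes precedence, so a sensor-only positive is treated as a false detection:

```python
def person_outside_vehicle_strict(sensor_detected, image_detected):
    """Variant determination: when the sensor reports a person but the
    captured image clearly shows none, treat the sensor result as a
    false detection and judge that no person is present."""
    if sensor_detected and not image_detected:
        return False  # suppress the sensor's false positive
    return sensor_detected or image_detected
```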
  • the vehicle monitoring device 80 may perform the first vehicle exterior detection process and the second vehicle exterior detection process even when the engine of the vehicle is stopped.
  • in this case, the notification control unit of the vehicle-side control device 200 can notify the driver or the like of the presence of a person outside the vehicle via the notification unit of the vehicle or the terminal held by the driver, so that the driver can be notified that a suspicious person exists in the vicinity of the vehicle before a criminal act is committed against the vehicle.
  • FIG. 12 is a diagram showing a hardware configuration example of the vehicle monitoring device 80 according to Embodiment 3. The functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit in the vehicle monitoring device 80 are implemented by a processing circuit.
  • that is, the processing circuit that implements these functions may be the processing circuit 80a, which is dedicated hardware as shown in FIG. 12, or may be the processor 80b that executes programs stored in the memory 80c.
  • the processing circuit 80a may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • the function of each of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit may be realized by an individual processing circuit, or the functions of the units may be collectively realized by a single processing circuit.
  • when the functions of the above units are implemented by the processor 80b, the function of each unit is realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in the memory 80c.
  • the processor 80b reads out and executes the programs stored in the memory 80c, thereby realizing the functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit.
  • that is, the vehicle monitoring device 80 includes the memory 80c for storing programs that, when executed by the processor 80b, result in the execution of the steps shown in FIG. 11. It can also be said that these programs cause a computer to execute the procedures or methods of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit.
  • the processor 80b is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • the memory 80c may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or may be a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a mini disc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
  • First information acquisition unit 1, second information acquisition unit 2, vehicle information acquisition unit 3, information acquisition unit 11, first vehicle exterior detection unit 7, second vehicle exterior detection unit 8, vehicle exterior presence/absence determination unit 9, vehicle exterior detection unit 81 and each function of the storage unit may be partly realized by dedicated hardware and partly realized by software or firmware.
  • the processing circuit 80a in the vehicle monitoring device 80 can realize each of the functions described above by hardware, software, firmware, or a combination thereof.
  • a first information acquisition unit 1, a second information acquisition unit 2, a vehicle information acquisition unit 3, an information acquisition unit 11, a first vehicle exterior detection unit 7, a second vehicle exterior detection unit 8, a vehicle exterior presence/absence determination unit 9, and a vehicle exterior detection unit 81 and at least part of the functions of the storage unit may be executed by an external server.
  • as described above, the vehicle monitoring device 80 includes the first information acquisition unit 1 that acquires sensor information from the sensor 20 whose detection range includes the outside of the vehicle, the second information acquisition unit 2 that acquires the captured image from the imaging device 30 whose imaging range includes the outside of the vehicle, and the vehicle exterior detection unit 81 that detects a person outside the vehicle from the sensor information and the captured image respectively acquired by the first information acquisition unit 1 and the second information acquisition unit 2. When the detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from the detection result obtained from the captured image, the vehicle exterior detection unit 81 specifies the presence or absence of a person outside the vehicle using the detection result obtained from the captured image, so that detection failure of the person outside the vehicle can be prevented and the detection accuracy of the person outside the vehicle, that is, the target person, can be improved.
  • the vehicle monitoring device may include both the occupant detection unit 12 and the vehicle exterior detection unit 81 to detect both the occupant and the person outside the vehicle.


Abstract

The present invention provides a vehicular monitoring device with increased accuracy in detecting a target person. The vehicular monitoring device detects a target person who is an occupant inside the vehicle or a person outside the vehicle. When a detection result regarding the presence or absence of the target person obtained from sensor information acquired by a sensor different from an imaging device differs from a detection result regarding the presence or absence of the target person obtained from a captured image acquired by the imaging device, the detection result regarding the presence or absence of the target person obtained from the captured image is used to specify whether the target person is present or absent, thereby making it possible to increase the accuracy of detecting the target person.

Description

Vehicle monitoring device and vehicle monitoring system
 The present disclosure relates to a vehicle monitoring device and a vehicle monitoring system that detect an occupant inside a vehicle or a person outside the vehicle.
 Technologies have been developed that detect an occupant inside a vehicle or a person outside the vehicle and, for example, perform control processing for airbags, seat belt reminders, and the like according to detection results such as the position of the occupant, or notify of the presence of a suspicious person according to the detection result of a person outside the vehicle. Conventionally, regarding the detection of occupants in a vehicle, an imaging device has been provided so that its imaging range includes the front and rear seats of the vehicle, and the seat in which an occupant is present has been detected based on the position, size, and the like of the occupant's head in the captured image acquired from the imaging device (see, for example, Patent Document 1).
Japanese Patent Application Laid-Open No. 2020-50090
 When trying to detect such a target person, that is, an occupant inside the vehicle or a person outside the vehicle, it is conceivable to use a sensor different from an imaging device, such as a radio wave sensor or a seating sensor. However, for example, when a seating sensor is used to detect an occupant in a rear seat, the weight of the occupant sitting on the seat may not be measured correctly depending on the occupant's sitting posture, so even such a sensor may not be able to detect the occupant sufficiently.
 The present disclosure has been made to solve the above-described problems, and an object thereof is to improve the detection accuracy of a target person by using a detection result obtained from an imaging device and a detection result obtained from a sensor different from the imaging device.
 A first vehicle monitoring device according to the present disclosure includes: a first information acquisition unit that acquires sensor information from a sensor whose detection range includes at least the rear seats of a vehicle; a second information acquisition unit that acquires a captured image from an imaging device whose imaging range includes the front and rear seats of the vehicle; and an occupant detection unit that detects an occupant in a rear seat from the sensor information and the captured image respectively acquired by the first information acquisition unit and the second information acquisition unit. When the detection result regarding the presence or absence of the rear seat occupant obtained from the sensor information differs from the detection result regarding the presence or absence of the rear seat occupant obtained from the captured image, the occupant detection unit specifies the presence or absence of the rear seat occupant using the detection result obtained from the captured image.
A first vehicle monitoring system according to the present disclosure includes: a sensor mounted on a vehicle, whose detection range includes at least the rear seats of the vehicle; an imaging device mounted on the vehicle, whose imaging range includes the front and rear seats of the vehicle; a first information acquisition unit that acquires sensor information from the sensor; a second information acquisition unit that acquires a captured image from the imaging device; and an occupant detection unit that detects a rear-seat occupant from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. When the detection result regarding the presence or absence of a rear-seat occupant obtained from the sensor information differs from the detection result regarding the presence or absence of a rear-seat occupant obtained from the captured image, the occupant detection unit specifies the presence or absence of the rear-seat occupant using the detection result obtained from the captured image.
A second vehicle monitoring device according to the present disclosure includes: a first information acquisition unit that acquires sensor information from a sensor whose detection range includes the exterior of a vehicle; a second information acquisition unit that acquires a captured image from an imaging device whose imaging range includes the exterior of the vehicle; and a vehicle-exterior detection unit that detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. When the detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from the detection result regarding the presence or absence of a person outside the vehicle obtained from the captured image, the vehicle-exterior detection unit specifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image.
A second vehicle monitoring system according to the present disclosure includes: a sensor mounted on a vehicle, whose detection range includes the exterior of the vehicle; an imaging device mounted on the vehicle, whose imaging range includes the exterior of the vehicle; a first information acquisition unit that acquires sensor information from the sensor; a second information acquisition unit that acquires a captured image from the imaging device; and a vehicle-exterior detection unit that detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively. When the detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from the detection result regarding the presence or absence of a person outside the vehicle obtained from the captured image, the vehicle-exterior detection unit specifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image.
According to the present disclosure, a target person, that is, an occupant inside the vehicle or a person outside the vehicle, is detected using both an imaging device and a sensor different from the imaging device, so the detection accuracy for the target person can be improved.
FIG. 1 is a block diagram showing a configuration example of a vehicle monitoring system according to Embodiment 1.
FIG. 2 is an explanatory diagram showing the detection range of a sensor and the imaging range of an imaging device according to Embodiment 1.
FIG. 3 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 1.
FIG. 4 is a flowchart showing an operation example of the vehicle monitoring device according to Embodiment 1.
FIG. 5 is a diagram showing a hardware configuration example of the vehicle monitoring device according to Embodiment 1.
FIG. 6 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 2.
FIG. 7 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 2.
FIG. 8 is a flowchart showing an operation example of the vehicle monitoring device according to Embodiment 2.
FIG. 9 is a block diagram showing a configuration example of a vehicle monitoring system according to Embodiment 3.
FIG. 10 is an explanatory diagram showing an example of detection of a person outside the vehicle by the vehicle monitoring device according to Embodiment 3.
FIG. 11 is a flowchart showing an operation example of the vehicle monitoring device according to Embodiment 3.
FIG. 12 is a diagram showing a hardware configuration example of the vehicle monitoring device according to Embodiment 3.
Hereinafter, embodiments will be described with reference to the drawings.
Embodiment 1.
FIG. 1 is a block diagram showing a configuration example of a vehicle monitoring system 100 according to Embodiment 1. The vehicle monitoring system 100 includes a vehicle monitoring device 10, a sensor 20, and an imaging device 30, each of which is mounted on a vehicle. The vehicle monitoring device 10 is also connected to a vehicle-side control device 200 that controls in-vehicle equipment, such as an air conditioner, audio equipment, a navigation device, and a notification unit, as well as the engine and the like of the vehicle in which the vehicle monitoring device 10 is mounted.
The target person detected by the vehicle monitoring system 100 according to the present embodiment is a rear-seat occupant. Note that the vehicle monitoring system 100 is not limited to detecting only rear-seat occupants; the target person may also include front-seat occupants. FIG. 2 is an explanatory diagram showing the detection range of the sensor 20 and the imaging range of the imaging device 30 according to Embodiment 1. FIG. 2A is a side view of the interior of a vehicle 50 equipped with the vehicle monitoring device 10, and FIG. 2B is a top view of the interior of the vehicle 50 equipped with the vehicle monitoring device 10. The sensor 20 is mounted on the vehicle and arranged so that it can detect at least the rear-seat occupants among the occupants in the vehicle. The sensor 20 is a sensor different from the imaging device 30 and capable of detecting an occupant, such as a radio wave sensor 21, a weight sensor (seating sensor), a sound sensor, or an ultrasonic sensor. The sensor 20 may be any one of these or a combination of two or more of them.
FIG. 2A shows an example in which the sensor 20 is a radio wave sensor 21, whose detection range is indicated by area A. The radio wave sensor 21 is, for example, a sensor that transmits millimeter waves and receives waves reflected by a moving object. In the example of FIG. 2A, the sensor 20 is provided on the ceiling of the cabin of the vehicle 50 or the like so that at least the rear seats are included in its detection range. FIG. 2B shows an example in which an occupant 58 is seated in the left rear seat 53. Here, the left rear seat refers to the seat 53 among the rear seats on the left side with respect to the front of the vehicle, the right rear seat refers to the seat 54 among the rear seats on the right side with respect to the front of the vehicle, and the center rear seat refers to the seat 55 provided between the left and right rear seats. Hereinafter, for the sake of explanation, unless otherwise noted, the term "rear seats" collectively refers to the left rear seat 53, the right rear seat 54, and the center rear seat 55. Note that the number of rear seats need not be three as in the example of FIG. 2B; any number of rear seats may be provided.
The imaging device 30 is composed of, for example, a wide-angle camera, an infrared camera, or the like, and captures images of the interior of the vehicle 50. The imaging device 30 may also be a TOF (time-of-flight) camera capable of capturing images that reflect the distance between the imaging device 30 and the subject. The imaging device 30 captures images of the vehicle interior at a rate of, for example, 30 to 60 fps (frames per second), and outputs the captured images to the second information acquisition unit 2 of the information acquisition unit 11 of the vehicle monitoring device 10. Hereinafter, an image captured by the imaging device 30 is referred to as a captured image.
In the example of FIG. 2A, the imaging range of the imaging device 30 is indicated by area B. One or more imaging devices 30 are arranged on an overhead console, an instrument panel, a steering column, a rearview mirror, or the like so that the imaging range includes the front seats, namely the driver's seat 51 and the front passenger seat 52, and at least one of the rear seats. Hereinafter, the occupants in the vehicle to be detected by the vehicle monitoring device 10, such as the occupants 56, 57, and 58, are collectively referred to as "occupants". That is, the occupants include the driver.
Returning to FIG. 1, the vehicle monitoring device 10 will be described. The vehicle monitoring device 10 includes an information acquisition unit 11 that acquires sensor information and captured images from the sensor 20 and the imaging device 30, respectively, and an occupant detection unit 12 that detects rear-seat occupants from the sensor information and the captured images.
As shown in FIG. 1, the information acquisition unit 11 includes a first information acquisition unit 1 and a second information acquisition unit 2. The first information acquisition unit 1 of the information acquisition unit 11 is connected to the sensor 20 and acquires sensor information from the sensor 20. Here, the sensor information is information concerning the presence or absence of occupants in the vehicle. For example, when the sensor 20 is the radio wave sensor 21, the sensor information includes distance data indicating the distance between the radio wave sensor 21 and a detection target, angle data indicating the angle of the detection target with respect to the radio wave sensor 21, and the like. When the sensor 20 is a seating sensor, the sensor information includes seat data identifying the seat in which the seating sensor is installed, weight data detected by the seating sensor, and the like. When the sensor 20 is a sound sensor, the sensor information includes voice data indicating the volume, direction of arrival, and the like of speech uttered by an occupant. The sensor information may be any one of these or a combination of two or more of them.
Meanwhile, the second information acquisition unit 2 of the information acquisition unit 11 is connected to the imaging device 30 and acquires captured images from the imaging device 30. Although the first information acquisition unit 1 and the second information acquisition unit 2 are shown separately in the example of FIG. 1, the acquisition of captured images from the imaging device 30 and the acquisition of sensor information from the sensor 20 may be performed by a single component. The information acquisition unit 11 then outputs the acquired sensor information and captured images to the occupant detection unit 12, which will be described later.
The information acquisition unit 11 also includes a vehicle information acquisition unit 3 connected to the vehicle-side control device 200. The vehicle information acquisition unit 3 acquires vehicle information, such as signals relating to starting and stopping of the vehicle, from the vehicle-side control device 200. Using the vehicle information acquired from the vehicle-side control device 200, the vehicle information acquisition unit 3 outputs to the first information acquisition unit 1 and the second information acquisition unit 2 either a signal instructing them to start acquiring the sensor information and the captured images or a signal instructing them to stop acquiring the sensor information and the captured images.
For example, when the vehicle information acquisition unit 3 receives from the vehicle-side control device 200 any signal indicating that the doors have been unlocked, a door has been opened, the ignition has been turned on, a human presence sensor has been turned on, the shift lever has been moved to the drive position, the vehicle speed has exceeded 0 km/h, the navigation device has started guidance, or the vehicle has departed from home, it outputs to the first information acquisition unit 1 and the second information acquisition unit 2 a signal instructing them to start acquiring the sensor information and the captured images. Conversely, when the vehicle information acquisition unit 3 receives from the vehicle-side control device 200 any signal indicating that the ignition has been turned off, the human presence sensor has been turned off, the shift lever has been moved to the parking position, the navigation device has ended guidance, or the vehicle has returned home, it outputs to the first information acquisition unit 1 and the second information acquisition unit 2 a signal instructing them to stop acquiring the sensor information and the captured images.
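The start/stop decision described above amounts to checking each received vehicle signal against two sets of trigger events. A minimal sketch in Python; the signal identifiers are illustrative assumptions, since the disclosure lists the events but not their names:

```python
# Hypothetical signal names; the disclosure enumerates the events but
# does not define identifiers for them.
START_EVENTS = {
    "door_unlocked", "door_opened", "ignition_on", "presence_sensor_on",
    "shift_to_drive", "speed_above_zero", "guidance_started", "departed_home",
}
STOP_EVENTS = {
    "ignition_off", "presence_sensor_off", "shift_to_park",
    "guidance_ended", "arrived_home",
}

def acquisition_command(vehicle_signal: str):
    """Return 'start' or 'stop' for information acquisition units 1 and 2,
    or None when the signal is irrelevant to acquisition."""
    if vehicle_signal in START_EVENTS:
        return "start"
    if vehicle_signal in STOP_EVENTS:
        return "stop"
    return None
```

In a real implementation these events would arrive as messages from the vehicle-side control device 200 (e.g. over an in-vehicle network), but the set-membership logic is the same.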
The occupant detection unit 12 of the vehicle monitoring device 10 will now be described. The occupant detection unit 12 includes a first detection unit 4 that detects occupants using the sensor information, a second detection unit 5 that detects occupants using the captured images, and a presence determination unit 6 that determines whether an occupant is present in a rear seat using the detection results of the first detection unit 4 and the second detection unit 5. As shown in FIG. 1, the first detection unit 4 and the second detection unit 5 of the occupant detection unit 12 are connected to the first information acquisition unit 1 and the second information acquisition unit 2 of the information acquisition unit 11, respectively.
The process by which the occupant detection unit 12 detects rear-seat occupants will be described. The following description uses an example in which the first detection unit 4 acquires, as the sensor information, distance data and angle data obtained from the radio wave sensor 21 and detects rear-seat occupants.
First, the first detection unit 4 acquires the distance data and the angle data as the sensor information from the first information acquisition unit 1, and calculates the size of a detection target, the range in which the detection target (a person, an animal, or the like) exists, and so on. For example, if the calculated size and range of the detection target indicate that the detection target exists in the left rear seat, the first detection unit 4 derives a detection result indicating that an occupant has been detected in the left rear seat, and outputs the detection result to a storage unit (not shown) of the vehicle monitoring device 10.
Similarly, if the calculated size and range of the detection target indicate that the detection target exists in the right rear seat, the first detection unit 4 outputs to the storage unit a detection result indicating that an occupant has been detected in the right rear seat; if they indicate that the detection target exists in the center rear seat, it outputs a detection result indicating that an occupant has been detected in the center rear seat. Note that the occupant detection process performed by the first detection unit 4 is not limited to the above example; various known algorithms may be used. Although the above example stores the detection result of the first detection unit 4 in the storage unit, the first detection unit 4 may instead output the detection result to the presence determination unit 6.
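The seat assignment described above can be viewed as converting each (distance, angle) radar return into a cabin position and testing that position against per-seat zones. A rough sketch, assuming a top-down 2D cabin frame with the radio wave sensor 21 at the origin; the zone coordinates are illustrative assumptions, not values from the disclosure:

```python
import math

# Illustrative rear-seat zones in meters as (x_min, x_max, y_min, y_max),
# with x lateral (left negative) and y longitudinal toward the rear,
# sensor at the origin. These bounds are assumptions for the sketch.
REAR_SEAT_ZONES = {
    "rear_left":   (-0.9, -0.3, 0.8, 1.6),
    "rear_center": (-0.3,  0.3, 0.8, 1.6),
    "rear_right":  ( 0.3,  0.9, 0.8, 1.6),
}

def seat_for_return(distance_m: float, angle_deg: float):
    """Map one radar return (range, azimuth) to the rear seat zone
    containing it, or None when it falls outside all rear-seat zones."""
    x = distance_m * math.sin(math.radians(angle_deg))
    y = distance_m * math.cos(math.radians(angle_deg))
    for seat, (x0, x1, y0, y1) in REAR_SEAT_ZONES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return seat
    return None
```

An actual first detection unit would also cluster multiple returns to estimate the size of the detection target before assigning a seat; the sketch shows only the geometric mapping step.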
Here, "an occupant is present" includes not only the state in which the occupant is actually sitting on a seat but also the state in which the occupant is at the feet of a seat. An occupant with a small build, such as an infant, may be at the feet of a seat, and if occupants are detected using, for example, the sensor information acquired from the radio wave sensor 21, occupants at the feet of a seat can also be detected. Hereinafter, for the sake of explanation, the occupant detection process performed by the first detection unit 4 using the sensor information is referred to as the first occupant detection process, and the detection result of the first occupant detection process is referred to as the first detection result.
When rear-seat occupants are detected using the sensor information, the reliability of the detection result can be a problem. For example, when the sensor 20 is the radio wave sensor 21, the resolution of the distance data and the angle data acquired from the radio wave sensor 21 cannot be guaranteed, and multiple occupants sitting in different seats may be detected as a single continuous target, making it impossible to determine accurately which seat each occupant is in. When the sensor 20 is a seating sensor, depending on the posture of the occupant sitting on the seat, the weight of the occupant may not be measured correctly, making it impossible to determine whether an object placed on the seat is a person.
On the other hand, when occupants are detected using the imaging device 30, a more reliable detection result can be obtained than when occupants are detected using a sensor 20 different from the imaging device 30. Therefore, for rear-seat occupant detection, when an occupant is detected from a captured image, occupant detection accuracy can be improved by using the detection result of the occupant detection process based on the captured image.
The occupant detection process using captured images will now be described. FIG. 3 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 1. FIG. 3 shows a captured image acquired from the imaging device 30, in which the occupant detection process of the second detection unit 5, described later, has detected an occupant 56 sitting in the driver's seat 51, an occupant 57 sitting in the front passenger seat 52, an occupant 58 sitting in the left rear seat 53, and an occupant 60 sitting in the center rear seat 55.
An example of the occupant detection process performed by the second detection unit 5 will be described. First, the second detection unit 5 analyzes the captured image and detects occupants' faces in it. The second detection unit 5 then acquires the region in which an occupant's face was detected (the region indicated by the dashed line in FIG. 3, hereinafter referred to as the face region) and feature information of the occupant in the face region. Here, the feature information of the occupant is, for example, the contrast ratios of the eye, nose, mouth, and cheek portions after normalizing the size of the face. The second detection unit 5 may use the feature information to determine whether face detection has succeeded. In this case, for example, if the contrast ratios in the face region show that the luminance distribution of the face region is face-like, the second detection unit 5 determines that face detection has succeeded; if the luminance distribution is not face-like, it determines that face detection has failed.
After detecting an occupant's face, the second detection unit 5 acquires from the captured image the coordinates of the face region surrounding the occupant's face, such as a rectangle circumscribing the contour of the face. Here, when the face region is rectangular, the coordinates of the face region are, for example, the coordinates of the vertices, the center, and so on of the rectangle. The second detection unit 5 further calculates dimensions of the face region, such as its width, height, and area, from the coordinates of the face region.
The second detection unit 5 then identifies the seat in which the occupant is present from the acquired coordinates, width, height, area, and so on of the face region, and outputs information on the presence or absence of the occupant and the seat in which the occupant is present to the storage unit of the vehicle monitoring device 10 as a detection result. Note that the occupant detection process performed by the second detection unit 5 is not limited to the above example; various known algorithms may be used.
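The step from face-region coordinates to a seat can be sketched as computing the region's dimensions from its corner coordinates and then assigning the region's center to a seat by its horizontal position and its size (a rear-seat face, being farther from the camera, appears smaller). The image bands, the area threshold, and the left/right-to-seat mapping below are all illustrative assumptions, not values from the disclosure:

```python
def region_metrics(x0: int, y0: int, x1: int, y1: int) -> dict:
    """Width, height, area, and center of a rectangular face region
    given its top-left (x0, y0) and bottom-right (x1, y1) corners."""
    w, h = x1 - x0, y1 - y0
    return {"width": w, "height": h, "area": w * h,
            "center": ((x0 + x1) / 2, (y0 + y1) / 2)}

def seat_from_face(x0, y0, x1, y1, img_width=1280, rear_area_max=6400):
    """Assign a face region to a seat. Small faces are treated as rear-row
    (assumed threshold), and the horizontal position picks the seat.
    Which image side is the driver's side depends on the market; the
    left-is-driver mapping here is an assumption."""
    m = region_metrics(x0, y0, x1, y1)
    cx = m["center"][0]
    if m["area"] > rear_area_max:                    # front row
        return "driver" if cx < img_width / 2 else "passenger"
    third = img_width / 3                            # rear row, split in thirds
    if cx < third:
        return "rear_left"
    return "rear_center" if cx < 2 * third else "rear_right"
```

A production system would calibrate these bands per camera mounting position rather than hard-coding them.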
Although the above example stores the detection result of the second detection unit 5 in the storage unit, the second detection unit 5 may instead output the detection result to the presence determination unit 6. Hereinafter, for the sake of explanation, the occupant detection process performed by the second detection unit 5 using captured images is referred to as the second occupant detection process, and the detection result of the second occupant detection process is referred to as the second detection result.
Next, the presence determination for deciding whether an occupant is present will be described. Even if no rear-seat occupant is detected from the sensor information, the presence determination unit 6 of the present embodiment determines that an occupant is present in the rear seat if a rear-seat occupant can be detected from the captured image. In this way, when the detection result regarding the presence or absence of a rear-seat occupant obtained from the sensor information differs from the detection result regarding the presence or absence of a rear-seat occupant obtained from the captured image, the occupant detection unit 12 specifies the presence or absence of the rear-seat occupant using the detection result obtained from the captured image.
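The arbitration rule above reduces to a simple per-seat function: when the two results disagree, the image-based result wins. A minimal sketch with boolean detection results for one rear seat; the function name is illustrative:

```python
def determine_presence(first_result: bool, second_result: bool) -> bool:
    """Presence decision for one rear seat.

    first_result:  occupant detected from the sensor information
                   (first occupant detection process)
    second_result: occupant detected from the captured image
                   (second occupant detection process)

    When the results differ, the image-based (second) result is used;
    when they agree, either result can be returned unchanged.
    """
    if first_result != second_result:
        return second_result
    return first_result
```

Note that with only two boolean inputs this collapses to always returning the image-based result; the explicit disagreement check mirrors the wording of the disclosure, which frames the rule as applying when the two detection results differ.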
The operation of the vehicle monitoring device 10 will now be described. FIG. 4 is a flowchart showing an operation example of the vehicle monitoring device 10 according to Embodiment 1. Although the flowchart of FIG. 4 does not show the process of ending the operation of the vehicle monitoring device 10, the vehicle monitoring device 10 ends the first occupant detection process and the second occupant detection process when, for example, the vehicle information acquisition unit 3 acquires from the vehicle-side control device 200 a signal indicating that the engine of the vehicle 50 has stopped. In the following description, the rear seats are referred to collectively as "the rear seat" even when multiple rear seats are provided; when there are multiple rear seats, the processing is performed for each of them.
First, after the operation of the vehicle monitoring device 10 starts, the first information acquisition unit 1 of the vehicle monitoring device 10 acquires the sensor information from the sensor 20 (ST101). Next, the first detection unit 4 performs the first occupant detection process using the sensor information (ST102). In the process of ST102, the first detection unit 4 performs the first occupant detection process for each of the rear seats provided in the vehicle. Whether an occupant has been detected in each rear seat may be determined, for example, based on which rear seat the detected target exists in, derived from the size and range of the detection target calculated from the distance data and the angle data acquired as the sensor information.
 The first detection unit 4 then outputs a detection result indicating whether an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10. When an occupant is detected in the rear seat (ST102; YES), the first detection unit 4 outputs a detection result indicating that an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and turns ON the rear seat first flag (ST103). On the other hand, when no occupant is detected in the rear seat (ST102; NO), the first detection unit 4 outputs a detection result indicating that no occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and turns OFF the rear seat first flag (ST104).
 Here, the rear seat first flag stored in the storage unit indicates whether an occupant has been detected in the rear seat by the first occupant detection process. That is, when the first detection unit 4 outputs a first detection result indicating that an occupant in the rear seat has been detected, the rear seat first flag is turned ON, and when the first detection unit 4 outputs a first detection result indicating that no occupant in the rear seat has been detected, the rear seat first flag is turned OFF.
 Note that one or more rear seat first flags are provided according to the number of rear seats. For example, if there are three rear seats and the vehicle is provided with a rear left seat, a rear right seat, and a rear center seat, the rear seat first flags are a rear left first flag, a rear right first flag, and a rear center first flag.
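The per-seat flag update of ST102 to ST104 can be sketched as follows. This is a minimal illustration only; the seat identifiers and the function name are hypothetical and do not appear in the embodiment, which stores the flags in the storage unit of the vehicle monitoring device 10.

```python
# Hypothetical seat identifiers for the three-rear-seat example
# (rear left seat 53, rear right seat 54, rear center seat 55).
REAR_SEATS = ("rear_left", "rear_right", "rear_center")

def update_first_flags(detected_seats):
    """ST102-ST104: one first flag per rear seat; a flag is ON (True) when
    the sensor-based first occupant detection process detected an occupant
    in that seat, and OFF (False) otherwise."""
    return {seat: (seat in detected_seats) for seat in REAR_SEATS}
```

The same update applies, with its own flag set, to the second occupant detection process described below.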
 Next, the second information acquisition unit 2 of the vehicle monitoring device 10 acquires a captured image from the imaging device 30 (ST105). The second detection unit 5 performs the second occupant detection process using the captured image (ST106). In the process of ST106, the second detection unit 5 detects an occupant for each of the rear seats, that is, for each rear seat provided in the vehicle. Whether an occupant has been detected in each of the rear seats may be determined by the second detection unit 5 based on, for example, which of the rear seats the detected detection target occupies, as derived from the feature information detected from the captured image and from the coordinates, width, height, area, and the like of the face region.
 The second detection unit 5 then outputs a detection result indicating whether an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10. When an occupant is detected in the rear seat (ST106; YES), the second detection unit 5 outputs a detection result indicating that an occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and turns ON the rear seat second flag (ST107). On the other hand, when no occupant is detected in the rear seat (ST106; NO), the second detection unit 5 outputs a detection result indicating that no occupant in the rear seat has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and turns OFF the rear seat second flag (ST108).
 Here, the rear seat second flag stored in the storage unit indicates whether an occupant has been detected in the rear seat by the second occupant detection process. That is, when the second detection unit 5 outputs a second detection result indicating that an occupant in the rear seat has been detected, the rear seat second flag is turned ON, and when the second detection unit 5 outputs a second detection result indicating that no occupant in the rear seat has been detected, the rear seat second flag is turned OFF.
 Note that one or more rear seat second flags are provided according to the number of rear seats. For example, if there are three rear seats and the vehicle is provided with a rear left seat, a rear right seat, and a rear center seat, the rear seat second flags are a rear left second flag, a rear right second flag, and a rear center second flag.
 Next, the presence/absence determination unit 6 of the vehicle monitoring device 10 acquires the first detection result from the first detection unit 4 and the second detection result from the second detection unit 5. Then, using the first detection result and the second detection result, the presence/absence determination unit 6 determines whether an occupant is present in the rear seat.
 The presence/absence determination unit 6 refers to the rear seat first flag as the first detection result and determines whether the rear seat first flag is ON (ST109), thereby confirming from the sensor information whether an occupant in the rear seat has been detected. If the rear seat first flag is ON (ST109; YES), that is, if it is confirmed from the sensor information that an occupant in the rear seat has been detected, the presence/absence determination unit 6 determines that an occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST110). The determination result of the presence/absence determination unit 6 is output to, for example, a notification control unit (not shown) of the vehicle-side control device 200 that controls a seatbelt reminder, and upon receiving a determination result indicating that an occupant is present in the rear seat, the notification control unit turns ON the seatbelt reminder function for the rear seat.
 On the other hand, if the rear seat first flag is OFF (ST109; NO), that is, if it is confirmed from the sensor information that no occupant has been detected in the rear seat, the presence/absence determination unit 6 proceeds to the process of ST111 described next.
 The presence/absence determination unit 6 refers to the rear seat second flag as the second detection result and determines whether the rear seat second flag is ON (ST111), thereby confirming from the captured image whether an occupant in the rear seat has been detected. If the rear seat second flag is ON (ST111; YES), that is, if it is confirmed from the captured image that an occupant in the rear seat has been detected, the presence/absence determination unit 6 determines that an occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST110).
 On the other hand, if the rear seat second flag is OFF (ST111; NO), that is, if it is confirmed from the captured image that no occupant has been detected in the rear seat, the presence/absence determination unit 6 determines that no occupant is present in the rear seat and outputs the determination result to the vehicle-side control device 200 (ST112).
 In this way, when the occupant detection unit 12 obtains a detection result from the sensor information indicating that no occupant was detected in the rear seat and a detection result from the captured image indicating that an occupant in the rear seat was detected, it specifies that an occupant is present in the rear seat. As a result, even when an occupant is actually present in the rear seat but is not detected from the sensor information, the occupant is specified as present using the detection result obtained from the captured image whenever an occupant in the rear seat is detected from the captured image. This prevents failure to detect an occupant in the rear seat and improves the detection accuracy of the rear seat occupant, that is, the target person.
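The decision steps ST109 to ST112 reduce to a logical OR of the two flags for a given rear seat, which can be sketched as follows (the function name is hypothetical; in the embodiment the determination is made by the presence/absence determination unit 6 and output to the vehicle-side control device 200):

```python
def determine_rear_seat_presence(first_flag_on, second_flag_on):
    """ST109-ST112 for one rear seat: an occupant is judged present when
    either the sensor-based first detection (rear seat first flag) or the
    image-based second detection (rear seat second flag) found one."""
    if first_flag_on:        # ST109; YES -> present (ST110)
        return True
    if second_flag_on:       # ST111; YES -> present (ST110)
        return True
    return False             # ST111; NO  -> absent  (ST112)
```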
 Note that the second detection unit 5 may perform the second occupant detection process only on those rear seats for which the first detection result indicates that no occupant was detected. This suppresses detection omissions while also reducing the processing load, since the second occupant detection process is performed only on seats where a detection omission may have occurred. Further, although the flowchart of FIG. 4 shows the vehicle monitoring device 10 performing the processes of ST101 to ST104 and then the processes of ST105 to ST108, the flowchart of FIG. 4 is merely an example; the processes of ST101 to ST104 and the processes of ST105 to ST108 may, for example, be performed in parallel.
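The load-reducing variant described above, in which the image-based process runs only where the sensor-based process found nothing, can be sketched as follows (a minimal illustration; the flag representation as a dict keyed by hypothetical seat identifiers is an assumption, not part of the embodiment):

```python
def seats_for_second_detection(first_flags):
    """Variant of ST106: select only the rear seats whose first flag is OFF,
    i.e. seats where the sensor-based first occupant detection process may
    have missed an occupant, as candidates for the image-based second
    occupant detection process."""
    return [seat for seat, flag_on in first_flags.items() if not flag_on]
```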
 Next, a hardware configuration that implements the functions of the vehicle monitoring device 10 will be described. FIG. 5 is a diagram showing an example of the hardware configuration of the vehicle monitoring device 10 according to Embodiment 1. The functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit in the vehicle monitoring device 10 are implemented by a processing circuit. That is, these units of the vehicle monitoring device 10 may be implemented by a processing circuit 10a that is dedicated hardware, as shown in FIG. 5A, or by a processor 10b that executes a program stored in a memory 10c, as shown in FIG. 5B.
 As shown in FIG. 5A, when the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit are dedicated hardware, the processing circuit 10a corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof. The function of each of these units may be implemented by an individual processing circuit, or the functions of the units may be implemented collectively by a single processing circuit.
 As shown in FIG. 5B, when the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit are implemented by the processor 10b, the function of each unit is implemented by software, firmware, or a combination of software and firmware. The software or firmware is written as a program and stored in the memory 10c. The processor 10b implements the function of each of these units by reading and executing the programs stored in the memory 10c. That is, these units include the memory 10c for storing programs that, when executed by the processor 10b, result in the execution of the steps shown in FIG. 4. It can also be said that these programs cause a computer to execute the procedures or methods of these units.
 Here, the processor 10b is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor). The memory 10c may be a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), a magnetic disk such as a hard disk or a flexible disk, or an optical disc such as a MiniDisc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
 Note that, for the functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first detection unit 4, the second detection unit 5, the presence/absence determination unit 6, the occupant detection unit 12, and the storage unit, some may be implemented by dedicated hardware and others by software or firmware. In this way, the processing circuit 10a in the vehicle monitoring device 10 can implement each of the functions described above by hardware, software, firmware, or a combination thereof. Further, at least some of the functions of these units may be executed by an external server.
 As described above, the vehicle monitoring device 10 includes the first information acquisition unit 1 that acquires sensor information from the sensor 20 whose detection range includes the rear seat of the vehicle, the second information acquisition unit 2 that acquires a captured image from the imaging device 30 whose imaging range includes the front seats and the rear seat of the vehicle, and the occupant detection unit 12 that detects an occupant in the rear seat from the sensor information and the captured image acquired by the first information acquisition unit 1 and the second information acquisition unit 2, respectively. When the detection result regarding the presence or absence of a rear seat occupant obtained from the sensor information differs from the detection result regarding the presence or absence of a rear seat occupant obtained from the captured image, the occupant detection unit 12 specifies the presence or absence of the rear seat occupant using the detection result obtained from the captured image. This prevents failure to detect an occupant in the rear seat and improves the detection accuracy of the rear seat occupant, that is, the target person.
Embodiment 2.
 As in Embodiment 1, the vehicle monitoring device 10 according to Embodiment 2 includes the first information acquisition unit 1 that acquires sensor information, the second information acquisition unit 2 that acquires a captured image, and the occupant detection unit 12 that detects an occupant in the rear seat from the sensor information and the captured image acquired by the first information acquisition unit 1 and the second information acquisition unit 2, respectively. The present embodiment differs in the processing operation of the vehicle monitoring device 10 in that, when no rear seat occupant is detected from the captured image, the second detection unit 5 uses the captured image to determine whether an occupant is present in the rear seat. Components identical to those of Embodiment 1 are denoted by the same reference numerals, and descriptions thereof are omitted. The target person in the detection by the vehicle monitoring system 100 according to the present embodiment is an occupant in the rear seat. Note that the vehicle monitoring system 100 is not limited to detecting only rear seat occupants; the target persons may also include front seat occupants.
 In the first occupant detection process using the sensor 20, which differs from the imaging device 30, suppose, for example, that the sensor 20 is the radio wave sensor 21: if the radio wave sensor 21 detects a reflected wave returned from a moving object present on a seat, it is determined that a detection target is present on that seat. In such a case, even if the moving object on the seat is an object other than an occupant, such as water inside a plastic bottle, an occupant may be erroneously detected as present on the seat based on the wave reflected from the bottle. Further, for example, when the sensor 20 is a seating sensor, even if the seating sensor correctly measures the weight of an object on the seat, an occupant may, depending on that weight, be erroneously detected as present even though the object on the seat is not an occupant.
 As described above, when an occupant is detected by the sensor 20, which differs from the imaging device 30, an occupant may be erroneously detected as present on a seat even though no occupant is actually present, and the occupant detection accuracy cannot be said to be sufficient.
 Therefore, in the vehicle monitoring device 10 of the present embodiment, even when the first detection unit 4 indicates, based on the sensor information, a detection result that an occupant is present in the rear seat, if the second occupant detection process performed by the second detection unit 5 using the captured image reveals that no occupant is present in the rear seat, the detection result based on the captured image is given priority and it is specified that no occupant is present in the rear seat. The occupant detection process by the second detection unit 5 of the present embodiment is described below.
 In addition to detecting a rear seat occupant from the captured image, the second detection unit 5 of the present embodiment determines, when no rear seat occupant is detected from the captured image, whether an occupant is present in the rear seat based on the captured image.
 FIG. 6 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 2. The captured image shown in FIG. 6 illustrates a state in which occupants 56 and 57 are seated in the driver's seat 51 and the front passenger seat 52 of the vehicle, respectively, and, among the rear seats, occupants 58 and 59 are seated in the rear left seat 53 and the rear right seat 54, respectively. Further, in FIG. 6, no occupant is seated in the rear center seat 55, on which a child seat 61 is placed.
 In the example of FIG. 6, if the sensor 20 is, for example, a seating sensor, the child seat 61 placed on the rear center seat 55 may, as described above, cause the detection result of the first occupant detection process using the sensor information to indicate that an occupant has been detected in the rear center seat 55. On the other hand, the captured image reveals that no occupant is present in the rear center seat 55 and that the child seat 61 is placed there. Accordingly, by the detection process described next, the second detection unit 5 outputs a detection result indicating that an object other than an occupant is present in the rear center seat 55, that is, that no occupant is present in the rear center seat 55.
 The second detection unit 5 acquires, for example, the captured image from the second information acquisition unit 2 and extracts from the captured image an image of a set region. The set region is, for example, a region set so as to include a specific one of the rear seats, such as the rear center seat 55, as shown by region C in FIG. 6.
 The second detection unit 5 then identifies the object appearing in the extracted image by comparing the extracted image with similarity determination images stored in advance in the storage unit or the like, such as an image of the child seat 61 or an image of a plastic bottle. Here, the second detection unit 5 can identify the object in the extracted image using a known object recognition technique such as instance segmentation or template matching.
 When the second detection unit 5 identifies that the object in the image extracted for a specific rear seat is an object other than an occupant, such as the child seat 61, it outputs to the storage unit of the vehicle monitoring device 10 a detection result indicating that an object other than an occupant is present in the rear seat, that is, that no occupant is present in the rear seat. On the other hand, the object on the seat may not be identifiable because it is occluded by an occupant of a seat other than the specific seat. When the object appearing in the extracted image cannot be identified in this way, the second detection unit 5 outputs to the storage unit of the vehicle monitoring device 10 a detection result indicating that it is unclear whether an occupant is present in the rear seat. Note that the detection process described above is an example, and an object in a captured image can also be identified using various known algorithms.
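As one concrete instance of the similarity comparison mentioned above, the extracted seat region could be scored against the stored similarity determination images by normalized cross-correlation, a standard template matching measure. This is a minimal sketch under assumptions of our own (grayscale patches of equal size, a hypothetical threshold of 0.8, and hypothetical function names); the embodiment does not prescribe a particular algorithm:

```python
import numpy as np

def normalized_correlation(patch, template):
    """Zero-mean normalized cross-correlation between two equally sized
    grayscale patches; 1.0 for identical patterns, near 0 for unrelated ones."""
    p = patch.astype(float) - patch.mean()
    t = template.astype(float) - template.mean()
    denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
    if denom == 0.0:
        return 0.0
    return float((p * t).sum() / denom)

def classify_seat_region(region, reference_images, threshold=0.8):
    """Compare the image extracted for one rear seat (e.g. region C in FIG. 6)
    against the stored similarity determination images (child seat, plastic
    bottle, ...). Returns 'object' when a non-occupant object is identified,
    and 'unknown' when no stored image matches well enough."""
    best = max((normalized_correlation(region, ref) for ref in reference_images),
               default=0.0)
    return "object" if best >= threshold else "unknown"
```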
 Furthermore, the process by which the second detection unit 5 determines whether an occupant is present in the rear seat is not limited to the above example. In Embodiment 1, an example was described in which the second detection unit 5 performs face detection processing on the captured image; the second detection unit 5 can also set a detection region for face detection within the captured image and perform the face detection processing within that detection region.
 When the second detection unit 5 sets a detection region and performs the face detection processing within it, if, for example, no rear seat occupant is detected within a detection region provided for the rear seat while no occupant of the driver's seat 51 or the front passenger seat 52, that is, no front seat occupant, has intruded into that region, it is clear that no occupant is seated in the rear seat. This is because, when no front seat occupant has intruded into the detection region provided for the rear seat, it is unlikely that a rear seat occupant is occluded by a front seat occupant.
 FIG. 7 is an explanatory diagram showing an example of occupant detection using a captured image according to Embodiment 2. The captured image shown in FIG. 7 illustrates a state in which occupants 56 and 57 are seated in the driver's seat 51 and the front passenger seat 52 of the vehicle, respectively, and, among the rear seats, occupants 58 and 59 are seated in the rear left seat 53 and the rear right seat 54, respectively. Further, in FIG. 7, neither an occupant nor an object is present in the rear center seat 55.
 In the example of FIG. 7, neither occupant 56 in the driver's seat 51 nor occupant 57 in the front passenger seat 52 has intruded into the detection region provided for the rear center seat (region E in FIG. 7), and no occupant's face is detected within that detection region. In such a case, even though no occupant is detected in the rear center seat by the detection process using the captured image, it is clear that no occupant is present in the rear center seat.
 さらに、図7には、後席右側の座席54に存在する乗員59が、運転席51の乗員56に遮蔽されている例を示している。つまり、後席右側に対して設けられた検知領域(図7に示す領域D)では、後席右側の乗員59が検知されない一方で、運転席51の乗員56が後席右側に対して設けられた検知領域に大幅に侵入している。上記のような場合、撮像画像を用いた検知処理において、後席右側に乗員が検知されなかったとしても、後席右側に乗員が存在するか否かは明確ではない。 Furthermore, FIG. 7 shows an example in which an occupant 59 in the seat 54 on the right side of the rear seat is shielded by an occupant 56 in the driver's seat 51 . That is, in the detection area provided for the right side of the rear seat (area D shown in FIG. 7), the occupant 59 on the right side of the rear seat is not detected, while the occupant 56 on the driver's seat 51 is provided on the right side of the rear seat. significantly intrudes into the detection area. In the above case, even if an occupant is not detected on the right side of the rear seat in the detection process using the captured image, it is not clear whether there is an occupant on the right side of the rear seat.
 そのため、第2検知部5は、例えば、後席に乗員が存在するか否かを判定処理の一例として、撮像画像において、後席に対して設けられた検知領域に占める、運転席51又は助手席52の乗員56、57から検出された顔領域の面積が、設定された閾値未満である場合は、前席の乗員により遮蔽されていない後席に乗員が存在しない、すなわち後席に乗員が存在しないと判定する。一方、後席に対して設けられた検知領域に占める、運転席又は助手席の乗員から検出された顔領域の面積が、設定された閾値以上である場合、後席に乗員が存在するか否かが明確でないと判定する。 Therefore, the second detection unit 5, for example, as an example of the process of determining whether or not there is an occupant in the rear seat, detects the presence of the driver's seat 51 or the passenger's seat in the detection area provided for the rear seat in the captured image. If the area of the face area detected from the occupants 56 and 57 in the seat 52 is less than the set threshold value, there is no occupant in the rear seat that is not blocked by the occupant in the front seat. Determine that it does not exist. On the other hand, if the area of the detected face area of the occupant in the driver's seat or front passenger's seat in the detection area provided for the rear seat is greater than or equal to the set threshold value, whether or not the occupant is present in the rear seat. is not clear.
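 The occlusion criterion described above can be sketched as follows. This is a minimal illustrative example: the rectangle representation, the helper names, and the threshold value are assumptions made for illustration, not values taken from the specification.

```python
# Hypothetical sketch of the occlusion check for one rear-seat detection
# region. Rectangles are (x, y, w, h); AREA_RATIO_THRESHOLD is an assumed
# stand-in for the "set threshold" in the text.

def rect_intersection_area(a, b):
    """Area of overlap between two rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    dx = min(ax + aw, bx + bw) - max(ax, bx)
    dy = min(ay + ah, by + bh) - max(ay, by)
    return dx * dy if dx > 0 and dy > 0 else 0

AREA_RATIO_THRESHOLD = 0.2  # assumed value

def rear_seat_state(rear_region, rear_faces, front_faces):
    """Return 'occupied', 'absent', or 'unclear' for one rear-seat region."""
    if any(rect_intersection_area(rear_region, f) > 0 for f in rear_faces):
        return "occupied"  # a rear-seat face was detected inside the region
    region_area = rear_region[2] * rear_region[3]
    # Fraction of the rear-seat detection region covered by front-seat faces
    covered = sum(rect_intersection_area(rear_region, f) for f in front_faces)
    if covered / region_area < AREA_RATIO_THRESHOLD:
        return "absent"    # region unobstructed, so absence is clear
    return "unclear"       # a front-seat occupant intrudes into the region
```

 Under this sketch, region E in FIG. 7 would yield "absent" because the front-seat face coverage stays below the threshold, while region D would yield "unclear" because occupant 56 intrudes substantially into it.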
 Next, the identification of the presence or absence of a rear-seat occupant will be described. When the detection result regarding the presence or absence of a rear-seat occupant obtained from the sensor information differs from the detection result regarding the presence or absence of a rear-seat occupant obtained from the captured image, the occupant detection unit 12 identifies the presence or absence of a rear-seat occupant using the detection result obtained from the captured image. That is, even if the presence/absence determination unit 6 obtains a detection result indicating, based on the sensor information, that an occupant is present in the rear seat, when it obtains a detection result indicating, based on the captured image, that no occupant is present in the rear seat, it determines that no occupant is present in the rear seat, using the detection result obtained from the captured image. In this way, even if the sensor-based detection result indicating that an occupant is present in the rear seat was an erroneous detection, the rear seat is identified as unoccupied as long as the captured image makes it clear that no occupant is present; erroneous detections in the occupant detection processing are thereby suppressed, and occupant detection accuracy is improved.
 Note that, when the presence/absence determination unit 6 obtains a detection result indicating from the sensor information that an occupant is present in the rear seat and a detection result indicating from the captured image that it is not clear whether an occupant is present in the rear seat, it determines that an occupant is present in the rear seat.
 Next, the operation of the vehicle monitoring device 10 will be described. FIG. 8 is a flowchart showing an operation example of the vehicle monitoring device 10 according to Embodiment 2. In the following, steps identical to the processing of the vehicle monitoring device 10 according to Embodiment 1 are given the same reference signs as in FIG. 4, and their description is omitted or simplified. Although the flowchart of FIG. 8 does not show processing for terminating the operation of the vehicle monitoring device 10, the vehicle monitoring device 10 terminates the first occupant detection processing and the second occupant detection processing when, for example, the vehicle information acquisition unit 3 acquires from the vehicle-side control device 200 a signal indicating that the engine of the vehicle 50 has stopped. In the following description, the rear seats are collectively referred to as the rear seat even when a plurality of rear seats is provided; when there is a plurality of rear seats, the processing is performed for each of those seats.
 First, after the operation of the vehicle monitoring device 10 starts, the first information acquisition unit 1 of the vehicle monitoring device 10 acquires sensor information from the sensor 20 (ST101). Next, the first detection unit 4 performs the first occupant detection processing using the sensor information (ST102).
 Then, the first detection unit 4 outputs, to the storage unit of the vehicle monitoring device 10, a detection result indicating whether a rear-seat occupant has been detected. When an occupant is detected in the rear seat (ST102; YES), the first detection unit 4 outputs a detection result indicating that a rear-seat occupant has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and turns ON the rear-seat first flag (ST103). On the other hand, when no occupant is detected in the rear seat (ST102; NO), the first detection unit 4 outputs a detection result indicating that no rear-seat occupant has been detected to the storage unit of the vehicle monitoring device 10 as the first detection result, and turns OFF the rear-seat first flag (ST104).
 Next, the second information acquisition unit 2 of the vehicle monitoring device 10 acquires a captured image from the imaging device 30 (ST105). The second detection unit 5 performs the second occupant detection processing using the captured image (ST106).
 Then, the second detection unit 5 outputs, to the storage unit of the vehicle monitoring device 10, a detection result indicating whether a rear-seat occupant has been detected. When an occupant is detected in the rear seat (ST106; YES), the second detection unit 5 outputs a detection result indicating that a rear-seat occupant has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and turns ON the rear-seat second flag (ST107). On the other hand, when no occupant is detected in the rear seat (ST106; NO), the second detection unit 5 outputs a detection result indicating that no rear-seat occupant has been detected to the storage unit of the vehicle monitoring device 10 as the second detection result, and turns OFF the rear-seat second flag (ST108).
 Then, when no rear-seat occupant is detected from the captured image, the second detection unit 5 determines whether an occupant is present in the rear seat (ST201). Here, the determination by the second detection unit 5 of whether an occupant is present in the rear seat may be performed, for example, by extracting from the captured image the image of a region set so as to include a specific one of the rear seats, and identifying whether an object appearing in the extracted image is an occupant.
 When the second detection unit 5 determines that no occupant is present in the rear seat (ST201; YES), it outputs a detection result indicating that no occupant is present in the rear seat to the storage unit, and turns ON the rear-seat third flag (ST202). On the other hand, when the second detection unit 5 determines that it is not clear whether an occupant is present in the rear seat (ST201; NO), it outputs a detection result indicating that it is not clear whether an occupant is present in the rear seat to the storage unit, and turns OFF the rear-seat third flag (ST203).
 Here, the rear-seat third flag stored in the storage unit indicates whether an occupant is present in the rear seat. That is, when the second detection unit 5 outputs a detection result indicating that no occupant is present in the rear seat, the rear-seat third flag is turned ON, and when the second detection unit 5 outputs a detection result indicating that it is not clear whether an occupant is present in the rear seat, the rear-seat third flag is turned OFF.
 Note that one or more rear-seat third flags are provided according to the number of rear seats. For example, if there are three rear seats and the vehicle is provided with a rear left seat, a rear right seat, and a rear center seat, the rear-seat third flags are a rear-left third flag, a rear-right third flag, and a rear-center third flag.
 Next, the presence/absence determination unit 6 of the vehicle monitoring device 10 acquires the second detection result from the second detection unit 5, and uses the second detection result to determine whether an occupant is present in the rear seat. The presence/absence determination unit 6 refers to the rear-seat second flag as the second detection result and determines whether the rear-seat second flag is ON (ST204), thereby confirming whether a rear-seat occupant has been detected from the captured image.
 Then, if the rear-seat second flag is ON (ST204; YES), that is, if it confirms that a rear-seat occupant has been detected from the captured image, the presence/absence determination unit 6 determines that an occupant is present in the rear seat, and outputs the determination result to the vehicle-side control device 200 (ST205). The determination result of the presence/absence determination unit 6 is output to, for example, a notification control unit of the vehicle-side control device 200 that controls a seat belt reminder; when the notification control unit receives a determination result indicating that an occupant is present in the rear seat, it turns ON the seat belt reminder function for the rear seat.
 On the other hand, if the rear-seat second flag is OFF (ST204; NO), that is, if no rear-seat occupant has been detected from the captured image, the presence/absence determination unit 6 proceeds to the following processing. First, the presence/absence determination unit 6 acquires the first detection result from the first detection unit 4. Then, the presence/absence determination unit 6 refers to the rear-seat first flag as the first detection result and determines whether the rear-seat first flag is ON (ST206), thereby confirming whether a rear-seat occupant has been detected from the sensor information.
 If the rear-seat first flag is OFF (ST206; NO), that is, if it confirms that no rear-seat occupant has been detected from the sensor information, the presence/absence determination unit 6 determines that no occupant is present in the rear seat, and outputs the determination result to the vehicle-side control device 200 (ST207).
 On the other hand, if the rear-seat first flag is ON (ST206; YES), that is, if it confirms that a rear-seat occupant has been detected from the sensor information, the presence/absence determination unit 6 proceeds to the processing of ST208 described below.
 The presence/absence determination unit 6 refers to the rear-seat third flag and determines whether the rear-seat third flag is ON (ST208), thereby confirming whether the captured image has shown that no occupant is present in the rear seat. Then, if the rear-seat third flag is ON (ST208; YES), that is, if the captured image has shown that no occupant is present in the rear seat, the presence/absence determination unit 6 determines that no occupant is present in the rear seat, and outputs the determination result to the vehicle-side control device 200 (ST207).
 On the other hand, if the rear-seat third flag is OFF (ST208; NO), that is, if a detection result indicating that an occupant is present in the rear seat has been derived from the sensor information and the captured image does not make clear whether an occupant is present in the rear seat, the presence/absence determination unit 6 determines that an occupant is present in the rear seat, and outputs the determination result to the vehicle-side control device 200 (ST205).
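 The sequence of ST204, ST206, and ST208 amounts to a decision over the three flags, which can be sketched as follows. The function and its signature are illustrative assumptions, not code from the specification; only the flag semantics follow the text.

```python
# Minimal sketch of the decision sequence ST204 to ST208 in FIG. 8.

def determine_rear_seat_occupancy(first_flag, second_flag, third_flag):
    """first_flag:  rear-seat first flag (sensor detected an occupant).
    second_flag: rear-seat second flag (image detected an occupant).
    third_flag:  rear-seat third flag (image made absence clear).
    Returns True when an occupant is determined to be present."""
    if second_flag:        # ST204: occupant detected in the captured image
        return True        # ST205
    if not first_flag:     # ST206: no occupant detected from sensor info
        return False       # ST207
    if third_flag:         # ST208: image shows the seat is clearly empty
        return False       # ST207: sensor result treated as erroneous
    return True            # ST205: occlusion possible, trust the sensor
```

 For example, a sensor detection combined with a clearly empty image region (first_flag ON, third_flag ON) yields "no occupant", matching the erroneous-detection suppression described above.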
 In this way, even if the vehicle monitoring device 10 obtains from the sensor information a detection result indicating that an occupant is present in the rear seat although no occupant is actually present, it identifies the rear seat as unoccupied as long as the captured image makes it clear that no occupant is present; erroneous detections are thereby suppressed, and the detection accuracy for the rear-seat occupant, that is, the target person, is improved.
 Note that, as in Embodiment 1, the second occupant detection processing may be performed for those rear seats for which the first detection result indicates that no occupant is present. In this way, since the second occupant detection processing is performed for seats in which the first occupant detection processing may have missed an occupant, missed detections of occupants can be suppressed and the processing load of the vehicle monitoring device 10 can be reduced. Further, although the flowchart of FIG. 8 shows the vehicle monitoring device 10 performing the processing of ST101 to ST104 and then the processing of ST105 to ST108 and ST201 to ST203, the flowchart of FIG. 8 is an example; for instance, the processing of ST101 to ST104 and the processing of ST105 to ST108 and ST201 to ST203 may be performed in parallel.
 Note that, although Embodiments 1 and 2 have described examples in which the occupant detection unit 12 detects rear-seat occupants and identifies the presence or absence of a rear-seat occupant, the occupant detection unit 12 may instead detect front-seat occupants and identify the presence or absence of a front-seat occupant. In this case, the sensor 20 may be provided so that its detection range includes the front and rear seats of the vehicle, and the first detection unit 4 may perform detection processing for front-seat occupants.
 Further, in Embodiments 1 and 2, the occupant detection unit 12 may identify the presence or absence of a front-seat occupant in the same manner as in the operation example described with reference to the flowchart of FIG. 4: when it obtains a detection result indicating that no front-seat occupant was detected from the sensor information and a detection result indicating that a front-seat occupant was detected from the captured image, it identifies that an occupant is present in the front seat. This prevents missed detections of front-seat occupants and improves the detection accuracy for the front-seat occupant, that is, the target person.
 Furthermore, in Embodiments 1 and 2, the occupant detection unit 12 may identify the presence or absence of a front-seat occupant in the same manner as in the operation example described with reference to the flowchart of FIG. 8: when it obtains a detection result indicating that a front-seat occupant was detected from the sensor information and a detection result indicating that no occupant is present in the front seat from the captured image, it identifies that no occupant is present in the front seat. This suppresses erroneous detections of front-seat occupants and improves the detection accuracy for the front-seat occupant, that is, the target person.
 Note that, in Embodiments 1 and 2, the first detection unit 4 may use the sensor information to perform at least one of attribute estimation processing, which estimates attributes such as the physique, age, or sex of a detected occupant, and state estimation processing, which estimates a state such as posture. In this case, the sensor 20 may be configured as the radio wave sensor 21, an ultrasonic sensor, or the like, and the first detection unit 4 may estimate the physique, posture, and the like of the occupant using distance data, angle data, and the like acquired from the sensor 20. Various known algorithms can be used for the attribute estimation processing and state estimation processing described above, and detailed description thereof is omitted.
 Further, although Embodiments 1 and 2 have described examples in which the second detection unit 5 detects an occupant using feature information relating to the occupant's face, such as the occupant's face region and the positions of facial parts in the captured image, the second detection unit 5 may detect an occupant using feature information relating to physique, such as the occupant's skeletal points in the captured image. Furthermore, the second detection unit 5 may use feature information relating to the occupant's face and physique to perform at least one of attribute estimation processing, which estimates attributes such as the physique, age, and sex of the detected occupant, and state estimation processing, which estimates a state such as posture, drowsiness, or physical condition. In the attribute estimation processing, occupant authentication that associates a detected occupant with a specific individual may be performed, and the authentication result may be included in the attributes. Various known algorithms can be used for the attribute estimation processing and state estimation processing described above, and detailed description thereof is omitted.
 As described above, the first detection unit 4 performs at least one of occupant attribute estimation processing and state estimation processing using the sensor information, and the second detection unit 5 performs at least one of occupant attribute estimation processing and state estimation processing using the captured image. In this way, even when, for example, the resolution of the sensor information acquired from the radio wave sensor 21 cannot be guaranteed in the first detection processing of the first detection unit 4, so that a plurality of occupants is detected as a single contiguous object and the physique of each occupant cannot be estimated, the second detection unit 5 can estimate the physique of each occupant using the captured image; the reliability of the attribute estimation results and state estimation results can thus be improved.
Embodiment 3.
 A vehicle monitoring device 80 according to Embodiment 3 includes, as in Embodiment 1, the first information acquisition unit 1 that acquires sensor information and the second information acquisition unit 2 that acquires a captured image. The present embodiment differs from Embodiment 1 in that the vehicle monitoring device 80 includes a vehicle-exterior detection unit 81 that detects a person outside the vehicle 50 (hereinafter referred to as a person outside the vehicle) and thereby detects persons outside the vehicle. Components identical to those of Embodiment 1 are given the same reference signs, and their description is omitted.
 FIG. 9 is a block diagram showing a configuration example of a vehicle monitoring system 101 according to Embodiment 3. The vehicle monitoring system 101 includes the vehicle monitoring device 80, the sensor 20, and the imaging device 30, each of which is mounted on the vehicle 50.
 The vehicle monitoring device 80 also includes the vehicle-exterior detection unit 81, which detects persons outside the vehicle. The vehicle-exterior detection unit 81 has a first vehicle-exterior detection unit 7, a second vehicle-exterior detection unit 8, and a vehicle-exterior presence/absence determination unit 9, and performs detection of persons outside the vehicle. Each component of the vehicle-exterior detection unit 81 is described in detail later. That is, the target person in the detection performed by the vehicle monitoring system 101 according to the present embodiment is a person outside the vehicle. Note that the vehicle monitoring system 101 is not limited to detecting only persons outside the vehicle; the target persons for detection may also include front-seat occupants or rear-seat occupants.
 The sensor 20 in the present embodiment is, for example, a sensor 20 different from the imaging device 30, such as the radio wave sensor 21 or an ultrasonic sensor, and is provided on the ceiling of the vehicle cabin, the exterior of the vehicle 50, or the like so that its detection range includes the outside of the vehicle, that is, so that it can detect a person outside the vehicle. When the sensor 20 is the radio wave sensor 21 or an ultrasonic sensor, one or more sensors are provided on the ceiling of the vehicle cabin or the like so that the detection range includes an area separated from the exterior of the vehicle 50 by a set distance. The set distance is, for example, about 15 cm, and can be set as appropriate as long as a person present near the vehicle 50 outside the vehicle can be detected.
 Further, one or more imaging devices 30 are arranged on an overhead console, an instrument panel, a steering column, a rearview mirror, or the like so as to be able to capture images of the outside of the vehicle through the windows of the vehicle 50. FIG. 10 is an explanatory diagram showing an example of detection of a person outside the vehicle by the vehicle monitoring device 80 according to Embodiment 3. FIG. 10 shows the interior and exterior of a vehicle 50 equipped with the vehicle monitoring device 80 as viewed from above, and shows an example in which a person 70 outside the vehicle is present near the right side of the rear seat. Region F in FIG. 10 indicates the imaging range of the imaging device 30; as shown in FIG. 10, the imaging range of the imaging device 30 includes the outside of the vehicle.
 Returning to FIG. 9, the vehicle monitoring device 80 will be described. The information acquisition unit 11 of the vehicle monitoring device 80 has the first information acquisition unit 1 and the second information acquisition unit 2, as in Embodiment 1. The first information acquisition unit 1 of the information acquisition unit 11 is connected to the sensor 20 and acquires sensor information from the sensor 20. Here, the sensor information is information regarding the presence or absence of a person outside the vehicle. For example, when the sensor 20 is the radio wave sensor 21, the sensor information includes distance data indicating the distance between the radio wave sensor 21 and a detection target, angle data indicating the angle of the detection target with respect to the radio wave sensor 21, and the like. Further, for example, when the sensor 20 is an audio sensor, the sensor information is audio data indicating the volume, direction of arrival, and the like of speech uttered by a person outside the vehicle. The sensor information is any one of these or a combination of several of them.
 Meanwhile, the second information acquisition unit 2 of the information acquisition unit 11 is connected to the imaging device 30 and acquires a captured image from the imaging device 30. Although the first information acquisition unit 1 and the second information acquisition unit 2 are shown separately in the example of FIG. 9, the acquisition of the captured image from the imaging device 30 and the acquisition of the sensor information from the sensor 20 may be performed by a single component. The information acquisition unit 11 then outputs the acquired sensor information and captured image to the vehicle-exterior detection unit 81.
 The first vehicle-exterior detection unit 7 of the vehicle-exterior detection unit 81 detects persons outside the vehicle using the sensor information. The second vehicle-exterior detection unit 8 of the vehicle-exterior detection unit 81 detects persons outside the vehicle using the captured image. Furthermore, the vehicle-exterior presence/absence determination unit 9 of the vehicle-exterior detection unit 81 determines whether a person is present outside the vehicle, using the detection results of the first vehicle-exterior detection unit 7 and the second vehicle-exterior detection unit 8.
 次に、車外検知部81による、車外の人物を検知する処理について説明する。以下の説明では、第1車外検知部7が、センサ情報として、電波センサ21から取得した距離データと、角度データとを用いて車外の人物を検知する例を挙げて説明する。 Next, the process of detecting a person outside the vehicle by the vehicle exterior detection unit 81 will be described. In the following description, an example will be described in which the first vehicle exterior detection unit 7 detects a person outside the vehicle using distance data and angle data acquired from the radio wave sensor 21 as sensor information.
 まず、第1車外検知部7は、情報取得部11の第1情報取得部1から、距離データと、角度データとをセンサ情報として取得し、人、動物等の検知対象の大きさ、検知対象が存在する範囲等を算出する。例えば、第1車外検知部7は、算出した検知対象の大きさ及び存在する範囲等から、車外における車両50の窓の近辺等、車両50の近傍に検知対象が存在することが判明すれば、車外に人物が存在するという検知結果を導出し、検知結果を車両監視装置80の記憶部に出力する。以下、説明のため、第1車外検知部7によるセンサ情報を用いた検知処理を、第1車外検知処理という。また、第1車外検知部7による第1車外検知処理の検知結果を第1車外検知結果という。 First, the first vehicle exterior detection unit 7 acquires the distance data and the angle data as sensor information from the first information acquisition unit 1 of the information acquisition unit 11, and calculates the size of a detection target, such as a person or an animal, and the range in which the detection target exists. For example, if the first vehicle exterior detection unit 7 finds from the calculated size and range that the detection target exists in the vicinity of the vehicle 50, such as near a window of the vehicle 50, it derives a detection result indicating that a person exists outside the vehicle, and outputs the detection result to the storage unit of the vehicle monitoring device 80. Hereinafter, for the sake of explanation, the detection process using the sensor information by the first vehicle exterior detection unit 7 is referred to as the first vehicle exterior detection process, and the detection result of the first vehicle exterior detection process is referred to as the first vehicle exterior detection result.
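The proximity judgment described above can be sketched as follows. This is an illustrative sketch, not part of the publication: the coordinate convention, the near-vehicle range, and the window-zone width are all assumed values.

```python
import math

# Assumed geometry (not specified in the publication): the radio wave sensor 21
# sits at the origin facing outward from the vehicle body, with thresholds
# chosen only for illustration.
MAX_OUTWARD_M = 1.0        # how far from the body still counts as "near the vehicle"
WINDOW_HALF_WIDTH_M = 0.6  # lateral extent of the window zone along the body

def is_near_window(distance_m: float, angle_deg: float) -> bool:
    """Judge from the sensor's distance/angle data whether the detection
    target lies in the assumed zone in front of the window."""
    rad = math.radians(angle_deg)
    outward = distance_m * math.cos(rad)   # distance away from the vehicle body
    lateral = distance_m * math.sin(rad)   # offset along the vehicle body
    return 0.0 <= outward <= MAX_OUTWARD_M and abs(lateral) <= WINDOW_HALF_WIDTH_M
```

A detection result of "person present outside the vehicle" would then be derived only when `is_near_window` returns True for a target whose calculated size is consistent with a person.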
 ところで、センサ情報を用いて車外の人物を検知する場合、検知結果の信頼性が問題となる場合がある。例えばセンサ20が電波センサ21である場合、電波センサ21と車外の人物との位置関係によっては、車外の人物が必要以上に大きく検知される、又は車外の人物の体を検知しきれない等して、検知対象が人であるか否かを正確に判定できない場合がある。 When a person outside the vehicle is detected using sensor information, the reliability of the detection result can be a problem. For example, when the sensor 20 is the radio wave sensor 21, depending on the positional relationship between the radio wave sensor 21 and the person outside the vehicle, the person may be detected as larger than they actually are, or the person's body may not be fully detected, so that it may not be possible to accurately determine whether the detection target is a person.
 一方で、撮像装置30を用いて車外の人物を検知する場合は、撮像装置30と異なるセンサ20を用いて車外の人物を検知する場合に比して信頼性が高い検知結果を得ることができる。そこで、車外の人物の検知について、撮像画像から車外の人物が検知された場合は、撮像画像を用いた車外の人物の検知処理による検知結果を用いることで、車外の人物、すなわち対象人物の検知精度を向上できる。 On the other hand, when the imaging device 30 is used to detect a person outside the vehicle, a more reliable detection result can be obtained than when a sensor 20 other than the imaging device 30 is used. Therefore, when a person outside the vehicle is detected from the captured image, using the detection result of the detection process based on the captured image improves the detection accuracy of the person outside the vehicle, that is, the target person.
 第2車外検知部8による検知処理例について説明する。まず、第2車外検知部8は、撮像画像の解析を行い、撮像画像内における、車外の人物の顔を検出する。そして、第2車外検知部8は、車外の人物の顔が検出された領域である顔領域と、顔領域における車外の人物の顔の特徴情報とを取得する。ここで、車外の人物の顔の特徴情報とは、例えば、顔の大きさを正規化した上での、目、鼻、口、頬の部分のコントラスト比等である。なお、第2車外検知部8は、車外の人物の顔の特徴情報を用いて顔検出に成功したか否かを判定してもよい。この場合、第2車外検知部8は、例えば、顔領域内のコントラスト比から、顔領域の輝度分布が、顔らしいことが判明すれば、顔検出に成功したと判定し、輝度分布が顔らしくないと判定すれば、顔検出に失敗したと判定すればよい。 An example of the detection process by the second vehicle exterior detection unit 8 will be described. First, the second vehicle exterior detection unit 8 analyzes the captured image and detects the face of a person outside the vehicle in the captured image. The second vehicle exterior detection unit 8 then acquires a face region, which is the region in which the face of the person outside the vehicle was detected, and feature information of the face in that region. Here, the feature information of the face of the person outside the vehicle is, for example, the contrast ratios of the eyes, nose, mouth, and cheeks after the size of the face has been normalized. The second vehicle exterior detection unit 8 may use this feature information to determine whether face detection succeeded: for example, it determines that face detection succeeded if the luminance distribution of the face region, judged from the contrast ratios within the region, is face-like, and that face detection failed if the luminance distribution is not face-like.
 第2車外検知部8は、車外の人物の顔を検出したら、撮像画像から、例えば、車外の人物の顔の輪郭に接する矩形等の、車外の人物の顔を囲む顔領域に係る座標を取得する。ここで、顔領域に係る座標は、例えば、顔領域が矩形である場合、矩形の各頂点、中心等の座標である。さらに、第2車外検知部8は、顔領域の幅、高さ、及び面積等の寸法を、顔領域の座標から算出する。 After detecting the face of the person outside the vehicle, the second vehicle exterior detection unit 8 acquires, from the captured image, coordinates of a face region surrounding the face, such as a rectangle circumscribing the contour of the face. Here, when the face region is rectangular, the coordinates of the face region are, for example, the coordinates of the vertices and the center of the rectangle. Further, the second vehicle exterior detection unit 8 calculates dimensions such as the width, height, and area of the face region from these coordinates.
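The dimension calculation from the face-region coordinates can be sketched as follows; this is an illustrative sketch, and the two-corner representation of the rectangle is an assumption made for illustration, not a convention stated in the publication.

```python
def face_region_metrics(top_left, bottom_right):
    """Compute width, height, area, and center of a rectangular face region
    given two corner coordinates in pixels (image origin at the top-left)."""
    x1, y1 = top_left
    x2, y2 = bottom_right
    width, height = x2 - x1, y2 - y1
    return {
        "width": width,
        "height": height,
        "area": width * height,
        "center": ((x1 + x2) / 2, (y1 + y2) / 2),
    }
```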
 そして、第2車外検知部8は、取得した顔領域の座標、幅、高さ、及び面積等から、車外に人物が存在するか否かを特定し、車外の人物の存否に関する情報を検知結果として、車両監視装置80の記憶部に出力する。なお、第2車外検知部8による検知処理は上述の例に限らず、公知の種々のアルゴリズムを用いることが可能である。また、第2車外検知部8は、撮像画像における車外の人物の骨格点等、体格に関する特徴情報を用いて車外の人物を検知してもよい。 Then, the second vehicle exterior detection unit 8 determines from the acquired coordinates, width, height, area, and the like of the face region whether or not a person exists outside the vehicle, and outputs information on the presence or absence of the person outside the vehicle to the storage unit of the vehicle monitoring device 80 as a detection result. The detection process by the second vehicle exterior detection unit 8 is not limited to the above example, and various known algorithms can be used. The second vehicle exterior detection unit 8 may also detect a person outside the vehicle using feature information on the person's physique, such as skeletal points of the person in the captured image.
 また、上述の例では、第2車外検知部8の検知結果を記憶部に格納する例を示しているが、第2車外検知部8は、検知結果を車外存否判定部9に出力してもよい。以下、説明のため、第2車外検知部8による撮像画像を用いた検知処理を、第2車外検知処理という。また、第2車外検知部8による第2車外検知処理の検知結果を第2車外検知結果という。 Although the above example shows the detection result of the second vehicle exterior detection unit 8 being stored in the storage unit, the second vehicle exterior detection unit 8 may instead output the detection result to the vehicle exterior presence/absence determination unit 9. Hereinafter, for the sake of explanation, the detection process using the captured image by the second vehicle exterior detection unit 8 is referred to as the second vehicle exterior detection process, and the detection result of the second vehicle exterior detection process is referred to as the second vehicle exterior detection result.
 次に、車外に人物が存在しているか否かを判定する存否判定処理について説明する。車外存否判定部9は、センサ情報から車外の人物が検知されなかったとしても、撮像画像から車外の人物が検知できれば、車外に人物が存在すると判定する。このように、車外検知部81は、センサ情報から得られた車外の人物の存否に関する検知結果と、撮像画像から得られた車外の人物の存否に関する検知結果とが異なる場合、撮像画像から得られた車外の存否に関する検知結果を用いて、車外の人物の存否を特定する。 Next, the presence/absence determination process for determining whether or not a person exists outside the vehicle will be described. The vehicle exterior presence/absence determination unit 9 determines that a person exists outside the vehicle if the person can be detected from the captured image, even if no person outside the vehicle was detected from the sensor information. In this way, when the detection result on the presence or absence of a person outside the vehicle obtained from the sensor information differs from that obtained from the captured image, the vehicle exterior detection unit 81 specifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image.
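A minimal sketch of this determination rule (illustrative only; the publication describes it in prose): detection from the captured image is honored even when the sensor information reports no person.

```python
def exterior_presence(sensor_detected: bool, image_detected: bool) -> bool:
    """Presence determination: a person found in the captured image is judged
    present even when the sensor information detected no one; otherwise the
    sensor result stands."""
    if not sensor_detected and image_detected:
        return True  # the captured image overrides the sensor's miss
    return sensor_detected
```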
 以下、車両監視装置80の動作について説明する。図11は、実施の形態3に係る車両監視装置80の動作例を示すフローチャートである。車両監視装置80は、例えば、車両情報取得部3が、車両側制御装置200から車両50のエンジンが始動した旨を示す信号を取得した場合、第1車外検知処理及び第2車外検知処理の動作を開始する。 The operation of the vehicle monitoring device 80 will now be described. FIG. 11 is a flowchart showing an operation example of the vehicle monitoring device 80 according to the third embodiment. The vehicle monitoring device 80 starts the first vehicle exterior detection process and the second vehicle exterior detection process when, for example, the vehicle information acquisition unit 3 acquires a signal indicating that the engine of the vehicle 50 has started from the vehicle-side control device 200.
 また、図11のフローチャートには、車両監視装置80の動作を終了する処理が示されていないが、車両監視装置80は、例えば、車両情報取得部3が、車両側制御装置200から車両50のエンジンが停止した旨を示す信号を取得した場合、第1車外検知処理及び第2車外検知処理の動作を終了する。 Although the flowchart of FIG. 11 does not show a process for terminating the operation of the vehicle monitoring device 80, the vehicle monitoring device 80 terminates the first vehicle exterior detection process and the second vehicle exterior detection process when, for example, the vehicle information acquisition unit 3 acquires a signal indicating that the engine of the vehicle 50 has stopped from the vehicle-side control device 200.
 まず、車両監視装置80の動作が開始した後、車両監視装置80の第1情報取得部1は、センサ20からセンサ情報を取得する(ST301)。次に、第1車外検知部7は、センサ情報を用いて第1車外検知処理を行う(ST302)。ここで、第1車外検知部7による車外の人物が検知されたか否かの判定は、例えば、センサ情報として取得した距離データ、角度データから算出した検知対象の大きさ及び存在する範囲等に基づき、検知された検知対象が車外に存在するか否かを判定することにより行えばよい。 First, after the vehicle monitoring device 80 starts operating, the first information acquisition unit 1 of the vehicle monitoring device 80 acquires sensor information from the sensor 20 (ST301). Next, the first vehicle exterior detection unit 7 performs the first vehicle exterior detection process using the sensor information (ST302). Here, the first vehicle exterior detection unit 7 may determine whether a person outside the vehicle has been detected by judging, based on, for example, the size and range of the detection target calculated from the distance data and angle data acquired as sensor information, whether the detected target exists outside the vehicle.
 そして、第1車外検知部7は、車外の人物が検知されたか否かを示す検知結果を、車両監視装置80の記憶部に出力する。第1車外検知部7は、車外において人物が検知された場合(ST302;YES)、車外の人物が検知されたことを示す検知結果を、第1車外検知結果として車両監視装置80の記憶部に出力し、車外第1フラグをONにする(ST303)。一方、第1車外検知部7は、車外において人物が検知されなかった場合(ST302;NO)、車外の人物が検知されなかったことを示す検知結果を、第1車外検知結果として車両監視装置80の記憶部に出力し、車外第1フラグをOFFにする(ST304)。 The first vehicle exterior detection unit 7 then outputs a detection result indicating whether or not a person outside the vehicle has been detected to the storage unit of the vehicle monitoring device 80. When a person is detected outside the vehicle (ST302; YES), the first vehicle exterior detection unit 7 outputs a detection result indicating that a person outside the vehicle has been detected to the storage unit of the vehicle monitoring device 80 as the first vehicle exterior detection result, and turns ON the vehicle exterior first flag (ST303). On the other hand, when no person is detected outside the vehicle (ST302; NO), the first vehicle exterior detection unit 7 outputs a detection result indicating that no person outside the vehicle has been detected to the storage unit as the first vehicle exterior detection result, and turns OFF the vehicle exterior first flag (ST304).
 ここで、記憶部に記憶される車外第1フラグとは、第1車外検知処理により車外において人物が検知されたか否かを示すものである。すなわち、第1車外検知部7が車外の人物が検知されたことを示す第1車外検知結果を出力すれば、車外第1フラグはONとなり、第1車外検知部7が車外の人物が検知されなかったことを示す第1車外検知結果を出力すれば、車外第1フラグはOFFとなる。 Here, the vehicle exterior first flag stored in the storage unit indicates whether or not a person has been detected outside the vehicle by the first vehicle exterior detection process. That is, the vehicle exterior first flag is turned ON when the first vehicle exterior detection unit 7 outputs a first vehicle exterior detection result indicating that a person outside the vehicle has been detected, and turned OFF when the first vehicle exterior detection unit 7 outputs a first vehicle exterior detection result indicating that no person has been detected.
 次に、車両監視装置80の第2情報取得部2は、撮像装置30から撮像画像を取得する(ST305)。第2車外検知部8は、撮像画像を用いて第2車外検知処理を行う(ST306)。ここで、第2車外検知部8による車外において人物が検知されたか否かの判定は、例えば、撮像画像から取得した顔領域の座標、幅、高さ、及び面積等に基づき、検知対象が車外に存在するか否かを判定することにより行えばよい。 Next, the second information acquisition unit 2 of the vehicle monitoring device 80 acquires a captured image from the imaging device 30 (ST305). The second vehicle exterior detection unit 8 performs the second vehicle exterior detection process using the captured image (ST306). Here, the second vehicle exterior detection unit 8 may determine whether a person outside the vehicle has been detected by judging, based on, for example, the coordinates, width, height, and area of the face region acquired from the captured image, whether the detection target exists outside the vehicle.
 そして、第2車外検知部8は、車外の人物が検知されたか否かの検知結果を、車両監視装置80の記憶部に出力する。第2車外検知部8は、車外において人物が検知された場合(ST306;YES)、車外の人物が検知されたことを示す検知結果を、第2車外検知結果として車両監視装置80の記憶部に出力し、車外第2フラグをONにする(ST307)。一方、第2車外検知部8は、車外において人物が検知されなかった場合(ST306;NO)、車外の人物が検知されなかったことを示す検知結果を、第2車外検知結果として車両監視装置80の記憶部に出力し、車外第2フラグをOFFにする(ST308)。 The second vehicle exterior detection unit 8 then outputs a detection result indicating whether or not a person outside the vehicle has been detected to the storage unit of the vehicle monitoring device 80. When a person is detected outside the vehicle (ST306; YES), the second vehicle exterior detection unit 8 outputs a detection result indicating that a person outside the vehicle has been detected to the storage unit of the vehicle monitoring device 80 as the second vehicle exterior detection result, and turns ON the vehicle exterior second flag (ST307). On the other hand, when no person is detected outside the vehicle (ST306; NO), the second vehicle exterior detection unit 8 outputs a detection result indicating that no person outside the vehicle has been detected to the storage unit as the second vehicle exterior detection result, and turns OFF the vehicle exterior second flag (ST308).
 ここで、記憶部に記憶される車外第2フラグとは、第2車外検知処理により、車外において人物が検知されたか否かを示すものである。すなわち、第2車外検知部8が車外の人物が検知されたことを示す第2車外検知結果を出力すれば、車外第2フラグはONとなり、第2車外検知部8が車外の人物が検知されなかったことを示す第2車外検知結果を出力すれば、車外第2フラグはOFFとなる。 Here, the vehicle exterior second flag stored in the storage unit indicates whether or not a person has been detected outside the vehicle by the second vehicle exterior detection process. That is, the vehicle exterior second flag is turned ON when the second vehicle exterior detection unit 8 outputs a second vehicle exterior detection result indicating that a person outside the vehicle has been detected, and turned OFF when the second vehicle exterior detection unit 8 outputs a second vehicle exterior detection result indicating that no person has been detected.
 次に、車両監視装置80の車外存否判定部9は、第1車外検知部7から第1車外検知結果を取得し、第2車外検知部8から第2車外検知結果を取得する。そして、車外存否判定部9は、第1車外検知結果及び第2車外検知結果を用いて、車外に人物が存在しているか否かを判定する。 Next, the vehicle-exterior presence/absence determination unit 9 of the vehicle monitoring device 80 acquires the first vehicle-exterior detection result from the first vehicle-exterior detection unit 7 and acquires the second vehicle-exterior detection result from the second vehicle-exterior detection unit 8 . Then, the vehicle exterior presence/absence determination unit 9 determines whether or not a person is present outside the vehicle using the first vehicle exterior detection result and the second vehicle exterior detection result.
 車外存否判定部9は、第1車外検知結果として車外第1フラグを参照して、車外第1フラグがONであるか否かを判定し(ST309)、センサ情報から車外の人物が検知されたか否かを確認する。そして、車外存否判定部9は、車外第1フラグがONであれば(ST309;YES)、すなわち、センサ情報から車外の人物が検知されたことを確認した場合、車外に人物が存在すると判定し、判定結果を車両側制御装置200に出力する(ST310)。 The vehicle exterior presence/absence determination unit 9 refers to the vehicle exterior first flag as the first vehicle exterior detection result and determines whether or not the flag is ON (ST309), thereby confirming whether or not a person outside the vehicle has been detected from the sensor information. If the vehicle exterior first flag is ON (ST309; YES), that is, if it is confirmed from the sensor information that a person outside the vehicle has been detected, the vehicle exterior presence/absence determination unit 9 determines that a person exists outside the vehicle and outputs the determination result to the vehicle-side control device 200 (ST310).
 なお、車外存否判定部9による判定結果は、車両側制御装置200の、例えば、車外に人物が存在する旨の報知を行う報知部(図示せず)に出力される。そして、報知部は、車外に人物が存在するという判定結果を受信した場合、車外に人物が存在する旨を、車両の報知部又は運転者が所持する端末(図示せず)から運転者等に報知する。このようにすると、車両50の運転者は、車両の近傍に不審者等の人物が存在することを認識できる。 The determination result of the vehicle exterior presence/absence determination unit 9 is output to, for example, a notification unit (not shown) of the vehicle-side control device 200 that reports that a person exists outside the vehicle. When the notification unit receives a determination result indicating that a person exists outside the vehicle, it notifies the driver or the like of the presence of the person via the vehicle's notification unit or a terminal (not shown) carried by the driver. In this way, the driver of the vehicle 50 can recognize that a person such as a suspicious individual is present in the vicinity of the vehicle.
 一方、車外存否判定部9は、車外第1フラグがOFFであれば(ST309;NO)、すなわち、センサ情報から車外において人物が検知されなかったことを確認した場合、次に説明するST311の処理に進む。 On the other hand, if the vehicle exterior first flag is OFF (ST309; NO), that is, if it is confirmed from the sensor information that no person was detected outside the vehicle, the vehicle exterior presence/absence determination unit 9 proceeds to the process of ST311 described next.
 車外存否判定部9は、第2車外検知結果として車外第2フラグを参照して、車外第2フラグがONであるか否かを判定し(ST311)、撮像画像から車外の人物が検知されたか否かを確認する。そして、車外存否判定部9は、車外第2フラグがONであれば(ST311;YES)、すなわち、撮像画像から車外の人物が検知されたことを確認した場合、車外に人物が存在すると判定し、判定結果を車両側制御装置200に出力する(ST310)。 The vehicle exterior presence/absence determination unit 9 refers to the vehicle exterior second flag as the second vehicle exterior detection result and determines whether or not the flag is ON (ST311), thereby confirming whether or not a person outside the vehicle has been detected from the captured image. If the vehicle exterior second flag is ON (ST311; YES), that is, if it is confirmed from the captured image that a person outside the vehicle has been detected, the vehicle exterior presence/absence determination unit 9 determines that a person exists outside the vehicle and outputs the determination result to the vehicle-side control device 200 (ST310).
 一方、車外存否判定部9は、車外第2フラグがOFFであれば(ST311;NO)、すなわち、撮像画像から車外の人物が検知されなかったことを確認した場合、車外に人物が存在しないと判定し、判定結果を車両側制御装置200に出力する(ST312)。 On the other hand, if the vehicle exterior second flag is OFF (ST311; NO), that is, if it is confirmed from the captured image that no person was detected outside the vehicle, the vehicle exterior presence/absence determination unit 9 determines that no person exists outside the vehicle and outputs the determination result to the vehicle-side control device 200 (ST312).
 このように、車外検知部81は、センサ情報から車外の人物が検知されなかったことを示す検知結果を得て、撮像画像から車外の人物が検知されたことを示す検知結果を得た場合、車外に人物が存在すると特定する。これにより、実際は車外に人物が存在しているにもかかわらず、センサ情報から車外に人物が存在することが検知されなかったとしても、撮像画像から車外に人物が存在することが検知された場合に、撮像画像から得られた検知結果を用いて車外に人物が存在すると判定するため、車外の人物の検知漏れを防ぎ、車外の人物、すなわち対象人物の検知精度を向上できる。 In this way, when the vehicle exterior detection unit 81 obtains a detection result indicating that no person outside the vehicle was detected from the sensor information but obtains a detection result indicating that a person outside the vehicle was detected from the captured image, it specifies that a person exists outside the vehicle. As a result, even if the presence of a person outside the vehicle is not detected from the sensor information although a person is actually present, the person is judged to be present using the detection result obtained from the captured image; this prevents detection omissions and improves the detection accuracy of the person outside the vehicle, that is, the target person.
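The flow of ST301 through ST312 can be summarized as the following sketch; the callable parameters stand in for the first and second detection processes and the notification path, which are assumptions made for illustration.

```python
def monitoring_cycle(sensor_info, captured_image,
                     detect_from_sensor, detect_from_image, notify):
    """One pass of the flow in FIG. 11: run both detection processes, record
    their flags, and output the presence judgment to the notification path."""
    # ST301-ST304: first vehicle exterior detection process (sensor information)
    exterior_first = detect_from_sensor(sensor_info)
    # ST305-ST308: second vehicle exterior detection process (captured image)
    exterior_second = detect_from_image(captured_image)
    # ST309-ST312: a person is judged present if either flag is ON
    present = exterior_first or exterior_second
    notify(present)  # judgment result to the vehicle-side control device 200
    return {"exterior_first": exterior_first,
            "exterior_second": exterior_second,
            "present": present}
```

The two detection steps are written sequentially here, matching FIG. 11, but as noted below they could equally run in parallel.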
 また、図11のフローチャートにおいて、車両監視装置80により、ST301~ST304の処理を行った後に、ST305~ST308の処理を行うものとして図示しているが、図11のフローチャートは一例であり、例えば、ST301~ST304の処理及びST305~ST308の処理を並列で行ってもよい。 Although the flowchart of FIG. 11 shows the vehicle monitoring device 80 performing the processes of ST301 to ST304 before the processes of ST305 to ST308, the flowchart of FIG. 11 is merely an example; for example, the processes of ST301 to ST304 and the processes of ST305 to ST308 may be performed in parallel.
 なお、本実施の形態における車両監視装置80は、実施の形態2と同様に、車外存否判定部9が、第1車外検知部7から、センサ情報から車外に人物が存在することを示す検知結果を得たとしても、撮像画像から車外に人物が存在しないことを示す検知結果を得た場合、撮像画像から得られた検知結果を用いて、車外に人物が存在しないと判定するようにしてもよい。 In the vehicle monitoring device 80 of the present embodiment, as in the second embodiment, even if the vehicle exterior presence/absence determination unit 9 obtains from the first vehicle exterior detection unit 7 a detection result indicating that a person exists outside the vehicle based on the sensor information, it may determine that no person exists outside the vehicle using the detection result obtained from the captured image when a detection result indicating that no person exists outside the vehicle is obtained from the captured image.
 すなわち、例えば、車外存否判定部9が、第1車外検知部7から、センサ情報から検知した車外に検知対象が存在することを示す検知結果を得たとしても、第2車外検知部8が、検知対象が人以外の物体であることを撮像画像から特定した場合は、車外に人物が存在しないと判定してもよい。このようにすると、センサ情報から得られた、車外に人物が存在するという検知結果が誤検知によるものだったとしても、撮像画像から車外に人物が存在しないことが明確であれば、車外に人物が存在しないと特定するため、誤検知を抑制し、車外の人物、すなわち対象人物の検知精度を向上できる。 That is, for example, even if the vehicle exterior presence/absence determination unit 9 obtains from the first vehicle exterior detection unit 7 a detection result indicating that a detection target exists outside the vehicle based on the sensor information, it may determine that no person exists outside the vehicle when the second vehicle exterior detection unit 8 specifies from the captured image that the detection target is an object other than a person. In this way, even if the detection result obtained from the sensor information that a person exists outside the vehicle is a false detection, the person is specified as absent when it is clear from the captured image that no person exists outside the vehicle; this suppresses false detections and improves the detection accuracy of the person outside the vehicle, that is, the target person.
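This variant, in which the captured image also vetoes a sensor false positive, can be sketched as follows (illustrative only; the function name and boolean interface are assumptions):

```python
def exterior_presence_image_priority(sensor_detected: bool,
                                     image_detected: bool) -> bool:
    """Variant determination: when the captured image shows the detection
    target is not a person, the sensor's 'person present' result is treated
    as a false detection; otherwise either positive result is honored."""
    if sensor_detected and not image_detected:
        return False  # suppress the sensor's false positive
    return sensor_detected or image_detected
```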
 また、車両監視装置80は、車両のエンジンが停止している場合であっても、第1車外検知処理及び第2車外検知処理を行ってもよい。このようにすると、車両側制御装置200の報知制御部により、車外に人物が存在する旨を、車両の報知部又は運転者が所持する端末から運転者等に報知できるため、車両の盗難等、車両への犯罪行為が行われる前に不審者が車両近傍に存在することを運転者に通知できる。 The vehicle monitoring device 80 may also perform the first vehicle exterior detection process and the second vehicle exterior detection process even when the engine of the vehicle is stopped. In this way, the notification control unit of the vehicle-side control device 200 can notify the driver or the like, via the vehicle's notification unit or a terminal carried by the driver, that a person exists outside the vehicle, so the driver can be notified that a suspicious person is present in the vicinity of the vehicle before a criminal act against the vehicle, such as theft, is committed.
 次に、車両監視装置80の機能を実現するハードウェア構成について説明する。図12は、実施の形態3に係る車両監視装置80のハードウェア構成例を示す図である。車両監視装置80における第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部の機能は、処理回路によって実現される。すなわち、車両監視装置80の、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部は、図12Aに示すように専用のハードウェアである処理回路80aであってもよいし、図12Bに示すようにメモリ80cに格納されているプログラムを実行するプロセッサ80bであってもよい。 Next, the hardware configuration that implements the functions of the vehicle monitoring device 80 will be described. FIG. 12 is a diagram showing a hardware configuration example of the vehicle monitoring device 80 according to the third embodiment. The functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit in the vehicle monitoring device 80 are implemented by a processing circuit. That is, these components of the vehicle monitoring device 80 may be implemented by the processing circuit 80a, which is dedicated hardware as shown in FIG. 12A, or by the processor 80b, which executes a program stored in the memory 80c as shown in FIG. 12B.
 図12Aに示すように、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部が専用のハードウェアである場合、処理回路80aは、例えば、単一回路、複合回路、プログラム化したプロセッサ、並列プログラム化したプロセッサ、ASIC(Application Specific Integrated Circuit)、FPGA(Field-programmable Gate Array)、又はこれらを組み合わせたものが該当する。第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部の各部の機能それぞれを処理回路で実現してもよいし、各部の機能をまとめて1つの処理回路で実現してもよい。 When the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit are dedicated hardware as shown in FIG. 12A, the processing circuit 80a corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof. The functions of these units may each be implemented by an individual processing circuit, or may be implemented collectively by a single processing circuit.
 図12Bに示すように、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部がプロセッサ80bである場合、各部の機能は、ソフトウェア、ファームウェア、又はソフトウェアとファームウェアとの組み合わせにより実現される。ソフトウェア又はファームウェアはプログラムとして記述され、メモリ80cに格納される。プロセッサ80bは、メモリ80cに記憶されたプログラムを読み出して実行することにより、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部の各機能を実現する。すなわち、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部は、プロセッサ80bにより実行されるときに、図4に示す各ステップが結果的に実行されることになるプログラムを格納するためのメモリ80cを備える。また、これらのプログラムは、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部の手順又は方法をコンピュータに実行させるものであるともいえる。 As shown in FIG. 12B, when the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit are implemented by the processor 80b, the function of each unit is realized by software, firmware, or a combination of software and firmware. Software or firmware is written as a program and stored in the memory 80c. The processor 80b reads out and executes the programs stored in the memory 80c, thereby realizing the functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit. That is, these units comprise the memory 80c for storing programs which, when executed by the processor 80b, result in the execution of the steps shown in FIG. 4. These programs can also be said to cause a computer to execute the procedures or methods of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit.
 ここで、プロセッサ80bとは、例えば、CPU(Central Processing Unit)、処理装置、演算装置、プロセッサ、マイクロプロセッサ、マイクロコンピュータ、又はDSP(Digital Signal Processor)等のことである。メモリ80cは、例えば、RAM(Random Access Memory)、ROM(Read Only Memory)、フラッシュメモリ、EPROM(Erasable Programmable ROM)、EEPROM(Electrically EPROM)等の不揮発性又は揮発性の半導体メモリであってもよいし、ハードディスク、フレキシブルディスク等の磁気ディスクであってもよいし、ミニディスク、CD(Compact Disc)、DVD(Digital Versatile Disc)等の光ディスクであってもよい。 Here, the processor 80b is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor). The memory 80c may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically EPROM); alternatively, it may be a magnetic disk such as a hard disk or a flexible disk, or an optical disc such as a MiniDisc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
 なお、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部の各機能について、一部を専用のハードウェアで実現し、一部をソフトウェア又はファームウェアで実現するようにしてもよい。このように、車両監視装置80における処理回路80aは、ハードウェア、ソフトウェア、ファームウェア、又はこれらの組み合わせによって、上述の各機能を実現することができる。また、第1情報取得部1、第2情報取得部2、車両情報取得部3、情報取得部11、第1車外検知部7、第2車外検知部8、車外存否判定部9、車外検知部81、及び記憶部の少なくとも一部の機能を、外部サーバに実行させてもよい。 Note that some of the functions of the first information acquisition unit 1, the second information acquisition unit 2, the vehicle information acquisition unit 3, the information acquisition unit 11, the first vehicle exterior detection unit 7, the second vehicle exterior detection unit 8, the vehicle exterior presence/absence determination unit 9, the vehicle exterior detection unit 81, and the storage unit may be realized by dedicated hardware, and others by software or firmware. In this way, the processing circuit 80a in the vehicle monitoring device 80 can realize each of the functions described above by hardware, software, firmware, or a combination thereof. At least some of the functions of these units may also be executed by an external server.
 このように、車両監視装置80に、車外を検知範囲に含むセンサ20から、センサ情報を取得する第1情報取得部1と、車外を撮像範囲に含む撮像装置30から、撮像画像を取得する第2情報取得部2と、第1情報取得部1及び第2情報取得部2がそれぞれ取得した、センサ情報及び撮像画像から、車外の人物を検知する車外検知部81と、を備え、車外検知部81は、センサ情報から得られた車外の人物の存否に関する検知結果と、撮像画像から得られた車外の人物の存否に関する検知結果とが異なる場合、撮像画像から得られた車外の人物の存否に関する検知結果を用いて、車外の人物の存否を特定するものであると、車外の人物の検知漏れを防ぎ、車外の人物、すなわち対象人物の検知精度を向上できる。 As described above, the vehicle monitoring device 80 includes the first information acquisition unit 1, which acquires sensor information from the sensor 20 whose detection range includes the outside of the vehicle; the second information acquisition unit 2, which acquires a captured image from the imaging device 30 whose imaging range includes the outside of the vehicle; and the vehicle exterior detection unit 81, which detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit 1 and the second information acquisition unit 2, respectively. When the detection result on the presence or absence of a person outside the vehicle obtained from the sensor information differs from that obtained from the captured image, the vehicle exterior detection unit 81 specifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image; this prevents detection omissions and improves the detection accuracy of the person outside the vehicle, that is, the target person.
 また、本明細書中に開示する各実施の形態は、その範囲内において、各実施の形態を自由に組み合わせることが可能であり、各実施の形態を適宜、変形、省略することが可能である。例えば、車両監視装置に、乗員検知部12及び車外検知部81の両方を備え、乗員の検知及び車外の人物の検知の両方を行うようにしてもよい。 The embodiments disclosed in this specification can be freely combined within their scope, and each embodiment can be modified or omitted as appropriate. For example, the vehicle monitoring device may include both the occupant detection unit 12 and the vehicle exterior detection unit 81 so as to detect both an occupant and a person outside the vehicle.
1 第1情報取得部、2 第2情報取得部、3 車両情報取得部、4 第1検知部、5 第2検知部、6 存否判定部、7 第1車外検知部、8 第2車外検知部、9 車外存否判定部、10、80 車両監視装置、10a、80a 処理回路、10b、80b プロセッサ、10c、80c メモリ、11 情報取得部、12 乗員検知部、20 センサ、21 電波センサ、30 撮像装置、50 車両、51 運転席、52 助手席、53、54、55 座席、56、57、58、59、60 乗員、61 チャイルドシート、70 車外の人物、81 車外検知部、100、101 車両監視システム。 1 first information acquisition unit, 2 second information acquisition unit, 3 vehicle information acquisition unit, 4 first detection unit, 5 second detection unit, 6 presence/absence determination unit, 7 first vehicle exterior detection unit, 8 second vehicle exterior detection unit, 9 vehicle exterior presence/absence determination unit, 10, 80 vehicle monitoring device, 10a, 80a processing circuit, 10b, 80b processor, 10c, 80c memory, 11 information acquisition unit, 12 occupant detection unit, 20 sensor, 21 radio wave sensor, 30 imaging device, 50 vehicle, 51 driver's seat, 52 passenger seat, 53, 54, 55 seats, 56, 57, 58, 59, 60 occupants, 61 child seat, 70 person outside the vehicle, 81 vehicle exterior detection unit, 100, 101 vehicle monitoring system.

Claims (14)

  1.  車両の少なくとも後席を検知範囲に含むセンサから、センサ情報を取得する第1情報取得部と、
     前記車両の前席及び前記後席を撮像範囲に含む撮像装置から、撮像画像を取得する第2情報取得部と、
     前記第1情報取得部及び前記第2情報取得部がそれぞれ取得した、前記センサ情報及び前記撮像画像から、前記後席の乗員を検知する乗員検知部と、を備え、
     前記乗員検知部は、前記センサ情報から得られた前記後席の乗員の存否に関する検知結果と、前記撮像画像から得られた前記後席の乗員の存否に関する検知結果とが異なる場合、前記撮像画像から得られた前記後席の乗員の存否に関する検知結果を用いて、前記後席の乗員の存否を特定する
     ことを特徴とする車両監視装置。
    A vehicle monitoring device comprising:
    a first information acquisition unit that acquires sensor information from a sensor whose detection range includes at least a rear seat of a vehicle;
    a second information acquisition unit that acquires a captured image from an imaging device whose imaging range includes a front seat and the rear seat of the vehicle; and
    an occupant detection unit that detects an occupant in the rear seat from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively,
    wherein, when a detection result regarding the presence or absence of an occupant in the rear seat obtained from the sensor information differs from a detection result regarding the presence or absence of an occupant in the rear seat obtained from the captured image, the occupant detection unit identifies the presence or absence of the occupant in the rear seat using the detection result obtained from the captured image.
  2.  前記乗員検知部は、前記センサ情報から前記後席に乗員が存在しないことを示す検知結果を得て、前記撮像画像から前記後席に乗員が存在することを示す検知結果を得た場合、前記後席に乗員が存在すると特定する
     ことを特徴とする請求項1に記載の車両監視装置。
    The vehicle monitoring device according to claim 1, wherein, when the occupant detection unit obtains from the sensor information a detection result indicating that no occupant is present in the rear seat and obtains from the captured image a detection result indicating that an occupant is present in the rear seat, the occupant detection unit identifies that an occupant is present in the rear seat.
  3.  前記乗員検知部は、前記センサ情報から前記後席に乗員が存在することを示す検知結果を得て、前記撮像画像から前記後席に乗員が存在しないことを示す検知結果を得た場合、前記後席に乗員が存在しないと特定する
     ことを特徴とする請求項1又は請求項2に記載の車両監視装置。
    The vehicle monitoring device according to claim 1 or 2, wherein, when the occupant detection unit obtains from the sensor information a detection result indicating that an occupant is present in the rear seat and obtains from the captured image a detection result indicating that no occupant is present in the rear seat, the occupant detection unit identifies that no occupant is present in the rear seat.
  4.  前記乗員検知部は、前記センサ情報から前記後席に乗員が存在することを示す検知結果を得て、前記撮像画像から前記後席に乗員以外の物体が存在することを示す検知結果を得た場合、前記後席に乗員が存在しないと特定する
     ことを特徴とする請求項1から請求項3のいずれか一項に記載の車両監視装置。
    The vehicle monitoring device according to any one of claims 1 to 3, wherein, when the occupant detection unit obtains from the sensor information a detection result indicating that an occupant is present in the rear seat and obtains from the captured image a detection result indicating that an object other than an occupant is present in the rear seat, the occupant detection unit identifies that no occupant is present in the rear seat.
  5.  前記乗員検知部は、前記センサ情報から前記後席に乗員が存在することを示す検知結果を得て、前記撮像画像から、前記前席の乗員により遮蔽されていない前記後席に、乗員が存在しないことを示す検知結果を得た場合、前記後席に乗員が存在しないと特定する
     ことを特徴とする請求項1から請求項4のいずれか一項に記載の車両監視装置。
    The vehicle monitoring device according to any one of claims 1 to 4, wherein, when the occupant detection unit obtains from the sensor information a detection result indicating that an occupant is present in the rear seat and obtains from the captured image a detection result indicating that no occupant is present in the rear seat that is not occluded by an occupant in the front seat, the occupant detection unit identifies that no occupant is present in the rear seat.
  6.  前記第1情報取得部は、前記後席の乗員に加えて、前記前席を検知範囲に含むセンサから、センサ情報を取得し、
     前記第1情報取得部及び前記第2情報取得部がそれぞれ取得した、前記センサ情報及び前記撮像画像から、前記後席の乗員に加えて、前記前席の乗員を検知し、
     前記乗員検知部は、前記センサ情報から得られた前記前席の乗員の存否に関する検知結果と、前記撮像画像から得られた前記前席の乗員の存否に関する検知結果とが異なる場合、前記撮像画像から得られた前記前席の乗員の存否に関する検知結果を用いて、前記後席の乗員の存否を特定する
     ことを特徴とする請求項1から請求項5のいずれか一項に記載の車両監視装置。
    The vehicle monitoring device according to any one of claims 1 to 5,
    wherein the first information acquisition unit acquires sensor information from a sensor whose detection range includes the front seat in addition to the rear seat,
    wherein the occupant detection unit detects an occupant in the front seat in addition to the occupant in the rear seat from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively, and
    wherein, when a detection result regarding the presence or absence of the occupant in the front seat obtained from the sensor information differs from a detection result regarding the presence or absence of the occupant in the front seat obtained from the captured image, the occupant detection unit identifies the presence or absence of the occupant in the rear seat using the detection result regarding the presence or absence of the occupant in the front seat obtained from the captured image.
  7.  前記乗員検知部は、前記センサ情報から前記前席に乗員が存在しないことを示す検知結果を得て、前記撮像画像から前記前席に乗員が存在することを示す検知結果を得た場合、前記後席に乗員が存在すると特定する
     ことを特徴とする請求項6に記載の車両監視装置。
    The vehicle monitoring device according to claim 6, wherein, when the occupant detection unit obtains from the sensor information a detection result indicating that no occupant is present in the front seat and obtains from the captured image a detection result indicating that an occupant is present in the front seat, the occupant detection unit identifies that an occupant is present in the rear seat.
  8.  前記乗員検知部は、前記センサ情報から前記前席に乗員が存在することを示す検知結果を得て、前記撮像画像から前記前席に乗員が存在しないことを示す検知結果を得た場合、前記後席に乗員が存在しないと特定する
     ことを特徴とする請求項6又は請求項7に記載の車両監視装置。
    The vehicle monitoring device according to claim 6 or 7, wherein, when the occupant detection unit obtains from the sensor information a detection result indicating that an occupant is present in the front seat and obtains from the captured image a detection result indicating that no occupant is present in the front seat, the occupant detection unit identifies that no occupant is present in the rear seat.
  9.  前記乗員検知部は、前記センサ情報から乗員が存在することを示す検知結果を得られた座席に存在する乗員に対して、前記センサ情報を用いて、乗員の属性の推定及び乗員の状態の推定の少なくともいずれかを行い、前記撮像画像から乗員が存在することを示す検知結果を得られた座席に存在する乗員に対して、前記撮像画像を用いて、乗員の属性の推定及び乗員の状態の推定の少なくともいずれかを行う
     ことを特徴とする請求項1から請求項8のいずれか一項に記載の車両監視装置。
    The vehicle monitoring device according to any one of claims 1 to 8, wherein the occupant detection unit performs at least one of estimating an attribute of the occupant and estimating a state of the occupant, using the sensor information, for an occupant present in a seat for which a detection result indicating the presence of an occupant is obtained from the sensor information, and performs at least one of estimating an attribute of the occupant and estimating a state of the occupant, using the captured image, for an occupant present in a seat for which a detection result indicating the presence of an occupant is obtained from the captured image.
  10.  車両の外部を検知範囲に含むセンサから、センサ情報を取得する第1情報取得部と、
     前記車両の外部を撮像範囲に含む撮像装置から、撮像画像を取得する第2情報取得部と、
     前記第1情報取得部及び前記第2情報取得部がそれぞれ取得した、前記センサ情報及び前記撮像画像から、前記車両の外部の人物を検知する車外検知部と、を備え、
     前記車外検知部は、前記センサ情報から得られた前記車両の外部の人物の存否に関する検知結果と、前記撮像画像から得られた前記車両の外部の人物の存否に関する検知結果とが異なる場合、前記撮像画像から得られた前記車両の外部の人物の存否に関する検知結果を用いて、前記車両の外部の人物の存否を特定する
     ことを特徴とする車両監視装置。
    A vehicle monitoring device comprising:
    a first information acquisition unit that acquires sensor information from a sensor whose detection range includes the outside of a vehicle;
    a second information acquisition unit that acquires a captured image from an imaging device whose imaging range includes the outside of the vehicle; and
    a vehicle exterior detection unit that detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively,
    wherein, when a detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from a detection result regarding the presence or absence of a person outside the vehicle obtained from the captured image, the vehicle exterior detection unit identifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image.
  11.  前記車外検知部は、前記センサ情報から前記車両の外部に人物が存在しないことを示す検知結果を得て、前記撮像画像から前記車両の外部に人物が存在することを示す検知結果を得た場合、前記車両の外部に人物が存在すると特定する
     ことを特徴とする請求項10に記載の車両監視装置。
    The vehicle monitoring device according to claim 10, wherein, when the vehicle exterior detection unit obtains from the sensor information a detection result indicating that no person is present outside the vehicle and obtains from the captured image a detection result indicating that a person is present outside the vehicle, the vehicle exterior detection unit identifies that a person is present outside the vehicle.
  12.  前記車外検知部は、前記センサ情報から前記車両の外部に人物が存在することを示す検知結果を得て、前記撮像画像から前記車両の外部に人物が存在しないことを示す検知結果を得た場合、前記車両の外部に人物が存在しないと特定する
     ことを特徴とする請求項10又は請求項11に記載の車両監視装置。
    The vehicle monitoring device according to claim 10 or 11, wherein, when the vehicle exterior detection unit obtains from the sensor information a detection result indicating that a person is present outside the vehicle and obtains from the captured image a detection result indicating that no person is present outside the vehicle, the vehicle exterior detection unit identifies that no person is present outside the vehicle.
  13.  車両に搭載され、少なくとも前記車両の後席を検知範囲に含むセンサと、
     前記車両に搭載され、前記車両の前席及び前記後席を撮像範囲に含む撮像装置と、
     前記センサから、センサ情報を取得する第1情報取得部と、
     前記撮像装置から、撮像画像を取得する第2情報取得部と、
     前記第1情報取得部及び前記第2情報取得部がそれぞれ取得した、前記センサ情報及び前記撮像画像から、前記後席の乗員を検知する乗員検知部と、を備え、
     前記乗員検知部は、前記センサ情報から得られた前記後席の乗員の存否に関する検知結果と、前記撮像画像から得られた前記後席の乗員の存否に関する検知結果とが異なる場合、前記撮像画像から得られた前記後席の乗員の存否に関する検知結果を用いて、前記後席の乗員の存否を特定する
     ことを特徴とする車両監視システム。
    A vehicle monitoring system comprising:
    a sensor mounted on a vehicle, a detection range of the sensor including at least a rear seat of the vehicle;
    an imaging device mounted on the vehicle, an imaging range of the imaging device including a front seat and the rear seat of the vehicle;
    a first information acquisition unit that acquires sensor information from the sensor;
    a second information acquisition unit that acquires a captured image from the imaging device; and
    an occupant detection unit that detects an occupant in the rear seat from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively,
    wherein, when a detection result regarding the presence or absence of an occupant in the rear seat obtained from the sensor information differs from a detection result regarding the presence or absence of an occupant in the rear seat obtained from the captured image, the occupant detection unit identifies the presence or absence of the occupant in the rear seat using the detection result obtained from the captured image.
  14.  車両に搭載され、車両の外部を検知範囲に含むセンサと、
     前記車両に搭載され、前記車両の外部を撮像範囲に含む撮像装置と、
     前記センサから、センサ情報を取得する第1情報取得部と、
     前記撮像装置から、撮像画像を取得する第2情報取得部と、
     前記第1情報取得部及び前記第2情報取得部がそれぞれ取得した、前記センサ情報及び前記撮像画像から、前記車両の外部の人物を検知する車外検知部と、を備え、
     前記車外検知部は、前記センサ情報から得られた前記車両の外部の人物の存否に関する検知結果と、前記撮像画像から得られた前記車両の外部の人物の存否に関する検知結果とが異なる場合、前記撮像画像から得られた前記車両の外部の人物の存否に関する検知結果を用いて、前記車両の外部の人物の存否を特定する
     ことを特徴とする車両監視システム。
    A vehicle monitoring system comprising:
    a sensor mounted on a vehicle, a detection range of the sensor including the outside of the vehicle;
    an imaging device mounted on the vehicle, an imaging range of the imaging device including the outside of the vehicle;
    a first information acquisition unit that acquires sensor information from the sensor;
    a second information acquisition unit that acquires a captured image from the imaging device; and
    a vehicle exterior detection unit that detects a person outside the vehicle from the sensor information and the captured image acquired by the first information acquisition unit and the second information acquisition unit, respectively,
    wherein, when a detection result regarding the presence or absence of a person outside the vehicle obtained from the sensor information differs from a detection result regarding the presence or absence of a person outside the vehicle obtained from the captured image, the vehicle exterior detection unit identifies the presence or absence of the person outside the vehicle using the detection result obtained from the captured image.
PCT/JP2021/005263 2021-02-12 2021-02-12 Vehicular monitoring device and vehicular monitoring system WO2022172400A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112021006342.1T DE112021006342T5 (en) 2021-02-12 2021-02-12 Vehicle monitoring device and vehicle monitoring system
PCT/JP2021/005263 WO2022172400A1 (en) 2021-02-12 2021-02-12 Vehicular monitoring device and vehicular monitoring system
JP2022581114A JP7446492B2 (en) 2021-02-12 2021-02-12 Vehicle monitoring device, vehicle monitoring system, and vehicle monitoring method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/005263 WO2022172400A1 (en) 2021-02-12 2021-02-12 Vehicular monitoring device and vehicular monitoring system

Publications (1)

Publication Number Publication Date
WO2022172400A1 true WO2022172400A1 (en) 2022-08-18

Family

ID=82838561

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/005263 WO2022172400A1 (en) 2021-02-12 2021-02-12 Vehicular monitoring device and vehicular monitoring system

Country Status (3)

Country Link
JP (1) JP7446492B2 (en)
DE (1) DE112021006342T5 (en)
WO (1) WO2022172400A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017061816A (en) * 2015-09-25 2017-03-30 株式会社デンソー Door lock control device of vehicle
JP2017141612A (en) * 2016-02-11 2017-08-17 株式会社オートネットワーク技術研究所 Vehicle door lock control device
JP2017171174A (en) * 2016-03-24 2017-09-28 パナソニックIpマネジメント株式会社 Vehicle monitoring device and vehicle monitoring method
JP2020097305A (en) * 2018-12-18 2020-06-25 株式会社東海理化電機製作所 Control device, program and control system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7210965B2 (en) 2018-09-26 2023-01-24 株式会社アイシン indoor monitoring device


Also Published As

Publication number Publication date
JP7446492B2 (en) 2024-03-08
JPWO2022172400A1 (en) 2022-08-18
DE112021006342T5 (en) 2023-10-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21925652; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022581114; Country of ref document: JP; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: 112021006342; Country of ref document: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21925652; Country of ref document: EP; Kind code of ref document: A1)