WO2019207625A1 - Occupant detection device, occupant detection method, and occupant detection system - Google Patents

Occupant detection device, occupant detection method, and occupant detection system

Info

Publication number
WO2019207625A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant detection
calibration
occupant
processing unit
vehicle
Prior art date
Application number
PCT/JP2018/016455
Other languages
English (en)
Japanese (ja)
Inventor
洸暉 安部
太郎 熊谷
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to JP2020515326A priority Critical patent/JPWO2019207625A1/ja
Priority to PCT/JP2018/016455 priority patent/WO2019207625A1/fr
Publication of WO2019207625A1 publication Critical patent/WO2019207625A1/fr

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/015Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use

Definitions

  • The present invention relates to an occupant detection device, an occupant detection method, and an occupant detection system.
  • Occupant detection includes, for example, detection of the presence or absence of occupants in the vehicle, detection of the boarding position of each occupant when there are one or more occupants in the vehicle, estimation of the physique of each occupant when there are one or more occupants in the vehicle, and determination of the presence or absence of a child seat in each seat of the vehicle. The result of occupant detection is used, for example, to control the operation of the airbag.
  • Occupant detection triggered by the ignition switch being turned on is executed before the vehicle starts.
  • The time T1 from when the ignition switch is turned on until the vehicle starts may be short.
  • The time T2 required for calibration may be longer than the time T1.
  • The present invention has been made to solve the above-described problems, and an object thereof is to provide an occupant detection device, an occupant detection method, and an occupant detection system capable of suppressing a delay in the execution timing of occupant detection with respect to the occurrence timing of an event such as vehicle start.
  • The occupant detection device includes: a calibration processing unit that executes a calibration process for a first occupant detection process using image data indicating an image captured by a camera for imaging in the vehicle interior; an end determination unit that determines whether or not the calibration process has ended; a first occupant detection processing unit that executes the first occupant detection process using the image data and the result of the calibration process after the end of the calibration process; and a second occupant detection processing unit that executes, before the end of the calibration process, a second occupant detection process that does not require the calibration process, using the image data.
  • Since the present invention is configured as described above, it is possible to suppress a delay in the execution timing of occupant detection with respect to the occurrence timing of an event such as vehicle start.
  • FIG. 2A is an explanatory diagram illustrating an example of an arrangement position of a camera in the vehicle and an example of an imageable range by the camera, and is an explanatory diagram illustrating a state viewed from above the vehicle.
  • FIG. 2B is an explanatory diagram illustrating an example of a camera arrangement position in the vehicle and an example of an imageable range by the camera, and is an explanatory diagram illustrating a state viewed from the side of the vehicle.
  • FIG. 3A is a block diagram illustrating a hardware configuration of the control device according to the first embodiment.
  • FIG. 3B is a block diagram illustrating another hardware configuration of the control device according to the first embodiment.
  • FIG. 4A is a flowchart showing an operation of the occupant detection device according to Embodiment 1.
  • FIG. 4B is a flowchart showing another operation of the occupant detection device according to Embodiment 1.
  • FIG. 1 is a block diagram illustrating a state in which an occupant detection system according to Embodiment 1 is provided in a vehicle.
  • FIG. 2 is an explanatory diagram illustrating an example of the arrangement position of the camera in the vehicle and an example of the imageable range of the camera. With reference to FIG. 1 and FIG. 2, the occupant detection system 300 according to Embodiment 1 will be described.
  • The vehicle 1 has a camera 2 for imaging in the passenger compartment.
  • The camera 2 is configured by, for example, an infrared camera, a visible-light camera, or a distance image sensor.
  • The camera 2 is provided, for example, on a dashboard (more specifically, the center cluster) of the vehicle 1.
  • FIG. 2 shows an example of the arrangement position of the camera 2 in the vehicle 1 and an example of the imageable range A of the camera 2.
  • All the seats in the vehicle 1 are included in the imageable range A. For this reason, all seats in the vehicle 1 are imaged by the camera 2.
  • The vehicle 1 has sensors 3.
  • The sensors 3 are configured by, for example, at least one of: a sensor that detects on/off of the ignition switch in the vehicle 1, a sensor that detects opening/closing of each door in the vehicle 1, a sensor that detects the traveling speed of the vehicle 1, a load sensor (a so-called "seat occupancy sensor") provided on the seat surface portion of each seat in the vehicle 1, and a sensor that detects the range position of the transmission in the vehicle 1.
  • The image data acquisition unit 11 acquires image data indicating an image captured by the camera 2 from the camera 2 at predetermined time intervals. This time interval is set to a value corresponding to, for example, the imaging frame rate of the camera 2.
  • The image data acquisition unit 11 outputs the acquired image data to the calibration processing unit 13, the first occupant detection processing unit 15, and the second occupant detection processing unit 16.
  • The start determination unit 12 uses the output signals from the sensors 3 to determine whether or not a predetermined event (hereinafter referred to as a "start event") has occurred in the vehicle 1.
  • When the output signals from the sensors 3 indicate such an event (for example, the ignition switch being turned on), the start determination unit 12 determines that a start event has occurred.
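  • As a concrete illustration of the determination described above, the following is a minimal Python sketch (not from the publication itself); the sensor fields and the rising-edge trigger rule are assumptions for illustration, since the text lists the sensors 3 but leaves the exact start-event conditions to the implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorSignals:
    """Snapshot of the output signals of the sensors 3 (illustrative fields)."""
    ignition_on: bool
    any_door_open: bool
    any_seat_loaded: bool      # seat-surface load sensor ("seat occupancy sensor")
    shift_in_drive: bool

def start_event_occurred(prev: SensorSignals, cur: SensorSignals) -> bool:
    """Hypothetical rule: a start event occurs on the rising edge of any
    monitored condition (e.g. the ignition switch being turned on)."""
    return (
        (cur.ignition_on and not prev.ignition_on)
        or (cur.any_door_open and not prev.any_door_open)
        or (cur.any_seat_loaded and not prev.any_seat_loaded)
        or (cur.shift_in_drive and not prev.shift_in_drive)
    )
```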
  • The calibration processing unit 13 executes, using the image data output from the image data acquisition unit 11, a calibration process for the occupant detection process performed by the first occupant detection processing unit 15 (hereinafter referred to as the "first occupant detection process").
  • The calibration process by the calibration processing unit 13 is started when the start determination unit 12 determines that a start event has occurred.
  • The end determination unit 14 determines whether the calibration process by the calibration processing unit 13 has ended after the calibration processing unit 13 has started the calibration process.
  • After the calibration process ends, the first occupant detection processing unit 15 executes the first occupant detection process using the image data output from the image data acquisition unit 11 and the result of the calibration process. That is, the first occupant detection process is based on a method that requires a prior calibration process, and is executed after the calibration process is completed.
  • Before the calibration process ends, the second occupant detection processing unit 16 executes an occupant detection process (hereinafter referred to as the "second occupant detection process") using the image data output from the image data acquisition unit 11. That is, the second occupant detection process is based on a method that does not require a prior calibration process, and is executed before the end of the calibration process.
  • Each of the first occupant detection process and the second occupant detection process includes, for example, detection of the presence or absence of occupants in the vehicle 1 (hereinafter referred to as "occupant detection"), detection of the boarding position of each occupant when there are one or more occupants in the vehicle 1 (hereinafter referred to as "boarding position detection"), estimation of the physique of each occupant when there are one or more occupants in the vehicle 1 (hereinafter referred to as "physique detection"), and determination of whether or not a child seat is installed in each seat (hereinafter referred to as "child seat detection").
  • The calibration processing unit 13 sets a plurality of reference values R used for occupant detection, boarding position detection, physique detection, and child seat detection by executing image recognition processing on the image captured by the camera 2.
  • The reference values R include, for example, a value R1 corresponding to the position of each seat, a value R2 corresponding to the position of each detection target (such as an occupant or a child seat) in the depth direction, and a value R3 corresponding to the position of the seat surface portion of each seat.
  • The first occupant detection processing unit 15 calculates, by executing image recognition processing on the image captured by the camera 2, a plurality of element values E corresponding to a plurality of elements used for occupant detection, boarding position detection, physique detection, and child seat detection.
  • The element values E include, for example, a value E1 corresponding to the shoulder width of each occupant, a value E2 corresponding to the face width of each occupant, a value E3 corresponding to the seated height of each occupant, and a value E4 corresponding to features such as the uneven shape of each seat.
  • Using the reference values R set by the calibration processing unit 13, the first occupant detection processing unit 15 calculates a coefficient α for converting each element value E from a pixel-unit value E (that is, a value in the captured image) to a meter-unit value E′ (that is, a value in real space).
  • The first occupant detection processing unit 15 converts each element value E from the value E in the captured image to the value E′ in real space using the coefficient α.
  • The first occupant detection processing unit 15 performs occupant detection, boarding position detection, physique detection, and child seat detection using the converted element values E′.
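  • To make the conversion concrete, here is a minimal sketch assuming that a reference value R pairs a dimension of known real-world size (for example, a seat portion whose width in meters is known) with its measured length in pixels, so that the coefficient α is simply the meters-per-pixel scale; the numeric values are illustrative, not taken from the publication.

```python
def compute_alpha(ref_real_m: float, ref_pixels: float) -> float:
    """Meters-per-pixel coefficient alpha derived from one calibration
    reference value, e.g. a seat portion of known width measured in the image."""
    return ref_real_m / ref_pixels

def to_real_space(element_values_px: dict, alpha: float) -> dict:
    """Convert element values E (pixels) to E' (meters): E' = alpha * E."""
    return {name: alpha * e for name, e in element_values_px.items()}

# Illustrative numbers: a 0.48 m wide seat cushion spans 160 px in the
# captured image, so alpha = 0.003 m/px.
alpha = compute_alpha(0.48, 160.0)
e_prime = to_real_space({"E1_shoulder_width": 140.0, "E2_face_width": 50.0}, alpha)
```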
  • The boarding position detection and the physique detection are executed only when one or more occupants are detected by the occupant detection (that is, when one or more occupants are present in the vehicle 1).
  • For physique detection, for example, an element value E1′ corresponding to the shoulder width of each occupant, an element value E2′ corresponding to the face width of each occupant, and an element value E3′ corresponding to the seated height of each occupant are used.
  • It is assumed that the occupant's physique is classified into three types: the physique of an adult (for example, a person over 14 years old), the physique of a child (for example, a person over 6 years old and under 13 years old), and the physique of an infant (for example, a person under 12 years old).
  • The first occupant detection processing unit 15 stores in advance a database indicating the ranges of the element values E1′, E2′, and E3′ corresponding to an adult's physique, the ranges of the element values E1′, E2′, and E3′ corresponding to a child's physique, and the ranges of the element values E1′, E2′, and E3′ corresponding to an infant's physique.
  • This database is generated by, for example, a statistical method.
  • By determining which range in the database each element value E1′ falls into, which range in the database each element value E2′ falls into, and which range in the database each element value E3′ falls into, the first occupant detection processing unit 15 estimates whether the physique of each occupant is an adult's physique, a child's physique, or an infant's physique.
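  • A minimal sketch of this range lookup follows; it assumes the database stores a (min, max) interval per element value for each physique class, and the concrete thresholds below are placeholders rather than values from the publication.

```python
# Hypothetical range database: (min, max) in meters for E1' (shoulder width),
# E2' (face width) and E3' (seated height) per physique class.
PHYSIQUE_DB = {
    "adult":  {"E1": (0.40, 0.60), "E2": (0.13, 0.18), "E3": (0.80, 1.05)},
    "child":  {"E1": (0.28, 0.42), "E2": (0.11, 0.15), "E3": (0.60, 0.85)},
    "infant": {"E1": (0.18, 0.30), "E2": (0.09, 0.13), "E3": (0.40, 0.65)},
}

def estimate_physique(e1: float, e2: float, e3: float) -> str | None:
    """Return the physique class whose database ranges contain all three
    converted element values E1', E2', E3' (in meters)."""
    for physique, ranges in PHYSIQUE_DB.items():
        bounds = (ranges["E1"], ranges["E2"], ranges["E3"])
        if all(lo <= v <= hi for v, (lo, hi) in zip((e1, e2, e3), bounds)):
            return physique
    return None  # no class matched; the fallback is implementation-specific
```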
  • For child seat detection, an element value E2′ corresponding to the face width of each occupant and an element value E4 corresponding to features such as the uneven shape of each seat are used. That is, the first occupant detection processing unit 15 determines whether the face corresponding to each element value E2′ is an infant's face by comparing each element value E2′ with a predetermined threshold value. The first occupant detection processing unit 15 also determines whether the uneven shape corresponding to each element value E4 includes an edge shape typical of a child seat. Based on these determination results, the first occupant detection processing unit 15 determines whether a child seat is installed in each seat.
  • The dictionary data storage unit 17 stores a plurality of dictionary data in advance.
  • The plurality of dictionary data are generated by, for example, machine learning.
  • Each of the plurality of dictionary data is to be compared with image data indicating an image captured by the camera 2.
  • The plurality of dictionary data correspond to mutually different states in the vehicle 1. More specifically, the plurality of dictionary data correspond to states that differ from each other in the number of occupants in the vehicle 1, the boarding position of each occupant in the vehicle 1, the physique of each occupant in the vehicle 1, and the presence or absence of a child seat in each seat in the vehicle 1.
  • The second occupant detection processing unit 16 acquires the plurality of dictionary data stored in the dictionary data storage unit 17 from the dictionary data storage unit 17.
  • The second occupant detection processing unit 16 compares the image data output by the image data acquisition unit 11 with each of the acquired plurality of dictionary data.
  • The second occupant detection processing unit 16 selects, from the plurality of dictionary data, the one dictionary data having the highest similarity to the image data. Thereby, occupant detection, boarding position detection, physique detection, and child seat detection are realized.
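  • The selection of the most similar dictionary data can be sketched as follows; cosine similarity over flattened grayscale frames is one plausible metric, chosen here purely for illustration, since the publication does not fix the similarity measure.

```python
import numpy as np

def most_similar_state(frame: np.ndarray, dictionary: dict) -> str:
    """Pick, among the dictionary data (each labeled with an in-vehicle state
    such as occupant count, positions, physiques and child seats), the entry
    with the highest similarity to the current frame."""
    v = frame.astype(np.float64).ravel()
    v /= np.linalg.norm(v) + 1e-12
    best_label, best_score = "", -np.inf
    for label, template in dictionary.items():
        t = template.astype(np.float64).ravel()
        t /= np.linalg.norm(t) + 1e-12
        score = float(v @ t)          # cosine similarity
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```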
  • The second occupant detection process is based on so-called "pattern recognition". For this reason, compared with the first occupant detection process, the second occupant detection process is less robust against posture changes of each occupant and has lower detection accuracy. In other words, the first occupant detection process is more robust against posture changes of each occupant and has higher detection accuracy than the second occupant detection process.
  • The result information storage unit 18 stores information indicating the result of the first occupant detection process when the first occupant detection processing unit 15 executes the first occupant detection process.
  • The result information storage unit 18 stores information indicating the result of the second occupant detection process when the second occupant detection processing unit 16 executes the second occupant detection process.
  • Hereinafter, the information stored in the result information storage unit 18 is collectively referred to as "result information".
  • The airbag control device 4 acquires the result information stored in the result information storage unit 18 from the result information storage unit 18.
  • The airbag control device 4 controls the operation of the airbag in the vehicle 1 using the acquired result information.
  • For example, the airbag control device 4 disables the operation of the airbag corresponding to a seat on which an infant is seated, based on the result of physique detection.
  • Likewise, the airbag control device 4 disables the operation of the airbag corresponding to a seat in which a child seat is installed, based on the result of child seat detection.
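  • Expressed as a rule, the control described above amounts to the following sketch; the field names of the per-seat result information are assumptions for illustration.

```python
def airbag_enabled(seat_result: dict) -> bool:
    """Disable the airbag for a seat when physique detection reported an
    infant, or when child seat detection reported an installed child seat
    (illustrative field names)."""
    if seat_result.get("physique") == "infant":
        return False
    if seat_result.get("child_seat_installed"):
        return False
    return True
```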
  • The calibration processing unit 13, the end determination unit 14, the first occupant detection processing unit 15, and the second occupant detection processing unit 16 constitute the main part of the occupant detection device 100.
  • The image data acquisition unit 11, the start determination unit 12, the dictionary data storage unit 17, the result information storage unit 18, and the occupant detection device 100 constitute the main part of the control device 200.
  • The camera 2 and the control device 200 constitute the main part of the occupant detection system 300.
  • The control device 200 is configured by a computer, and the computer includes a processor 31 and memories 32 and 33.
  • The memory 32 stores a program for causing the computer to function as the image data acquisition unit 11, the start determination unit 12, the calibration processing unit 13, the end determination unit 14, the first occupant detection processing unit 15, and the second occupant detection processing unit 16.
  • When the processor 31 reads and executes the program stored in the memory 32, the functions of the image data acquisition unit 11, the start determination unit 12, the calibration processing unit 13, the end determination unit 14, the first occupant detection processing unit 15, and the second occupant detection processing unit 16 are realized.
  • The functions of the dictionary data storage unit 17 and the result information storage unit 18 are realized by the memory 33.
  • Alternatively, the control device 200 may include the memory 33 and a processing circuit 34.
  • In this case, the functions of the image data acquisition unit 11, the start determination unit 12, the calibration processing unit 13, the end determination unit 14, the first occupant detection processing unit 15, and the second occupant detection processing unit 16 may be realized by the processing circuit 34.
  • Alternatively, the control device 200 may include the processor 31, the memories 32 and 33, and the processing circuit 34 (not shown). In this case, some of the functions of the image data acquisition unit 11, the start determination unit 12, the calibration processing unit 13, the end determination unit 14, the first occupant detection processing unit 15, and the second occupant detection processing unit 16 may be realized by the processor 31 and the memory 32, and the remaining functions may be realized by the processing circuit 34.
  • The processor 31 is, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, or a DSP (Digital Signal Processor).
  • The memories 32 and 33 are, for example, semiconductor memories or magnetic disks. More specifically, the memories 32 and 33 are a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), an SSD (Solid State Drive), an HDD (Hard Disk Drive), or the like.
  • The processing circuit 34 is, for example, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an SoC (System-on-a-Chip), or a system LSI (Large-Scale Integration).
  • When the start determination unit 12 determines that a start event has occurred, the calibration processing unit 13 starts the calibration process. Thereafter, the calibration processing unit 13 continues the calibration process until it is completed (step ST1). Since a specific example of the calibration process has already been described, the description thereof is omitted.
  • The end determination unit 14 determines whether the calibration process by the calibration processing unit 13 has ended (step ST2).
  • When the end determination unit 14 determines that the calibration process has ended (step ST2 "YES"), the first occupant detection processing unit 15 executes the first occupant detection process (step ST3). Since a specific example of the first occupant detection process has already been described, the description thereof is omitted.
  • When the end determination unit 14 determines that the calibration process has not ended (step ST2 "NO"), the second occupant detection processing unit 16 executes the second occupant detection process (step ST4). Since a specific example of the second occupant detection process has already been described, the description thereof is omitted.
  • If the determination result in step ST2 is "NO", the calibration process continues to be executed in the background of the processes in steps ST2 and ST4. Therefore, after the process of step ST4 is executed, the end determination unit 14 may repeatedly execute the process of step ST2 until "YES" is determined in step ST2. When "YES" is determined in step ST2, the first occupant detection processing unit 15 executes the first occupant detection process (step ST3).
  • That is, the second occupant detection processing unit 16 may execute the second occupant detection process before the end of the calibration process, and the first occupant detection processing unit 15 may then execute the first occupant detection process after the end of the calibration process.
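  • Putting steps ST1 to ST4 together, the overall control flow can be sketched as below, with the calibration process running in a background thread while the second occupant detection process fills the gap; all object and function names are placeholders, not an API from the publication.

```python
import threading

def occupant_detection_loop(camera, calibrator, first_detector, second_detector, results):
    """Sketch of steps ST1-ST4: calibration runs in the background (ST1);
    until it ends (ST2 "NO"), each frame goes through the second,
    calibration-free detection (ST4); once it ends (ST2 "YES"),
    frames go through the first detection (ST3)."""
    calibration_done = threading.Event()

    def run_calibration():                 # ST1
        calibrator.run()                   # sets the reference values R
        calibration_done.set()

    threading.Thread(target=run_calibration, daemon=True).start()

    while True:
        frame = camera.capture()
        if calibration_done.is_set():      # ST2 "YES"
            results.store(first_detector.detect(frame, calibrator.reference_values))  # ST3
        else:                              # ST2 "NO"
            results.store(second_detector.detect(frame))                              # ST4
```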
  • As described above, the occupant detection device 100 of Embodiment 1 includes the second occupant detection processing unit 16, which executes the second occupant detection process before the end of the calibration process. Compared with a conventional occupant detection device having no functional unit corresponding to the second occupant detection processing unit 16, this shortens the time from the occurrence of the start event until occupant detection is executed. As a result, it is possible to prevent the execution timing of the occupant detection process from being delayed with respect to the occurrence timing of an event such as the start of the vehicle 1.
  • The arrangement position of the camera 2 in the vehicle 1 is not limited to the example shown in FIG. 2, and the imageable range A of the camera 2 is not limited to the example shown in FIG. 2.
  • Only some of the plurality of seats in the vehicle 1 may be included in the imageable range A. That is, only some of the plurality of seats in the vehicle 1 may be imaged by the camera 2.
  • The sensors 3 are not limited to the above specific example, and the start event is not limited to the above specific example.
  • The sensors 3 may include any sensors provided in the vehicle 1.
  • Any event may be set as the start event as long as its occurrence can be determined using the output signals from the sensors 3.
  • The contents of the first occupant detection process are not limited to occupant detection, boarding position detection, physique detection, and child seat detection.
  • The first occupant detection process may include one or more of these detections.
  • The first occupant detection process may include other detections different from these detections.
  • The contents of the second occupant detection process are likewise not limited to occupant detection, boarding position detection, physique detection, and child seat detection.
  • The second occupant detection process may include one or more of these detections.
  • The second occupant detection process may include other detections different from these detections.
  • The first occupant detection process may be any process based on a method that requires a prior calibration process, and its method is not limited to the above specific example.
  • The second occupant detection process may be any process based on a method that does not require a prior calibration process, and its method is not limited to the above specific example.
  • The contents and method of the calibration process may be any that suit the contents and method of the first occupant detection process, and are not limited to the above specific examples.
  • The use of the results of the first occupant detection process and the second occupant detection process is not limited to the operation control of the airbag.
  • The result information may be used for any control in the vehicle 1.
  • As described above, the occupant detection device 100 of Embodiment 1 includes: the calibration processing unit 13, which executes the calibration process for the first occupant detection process using image data indicating an image captured by the camera 2 for imaging in the passenger compartment; the end determination unit 14, which determines whether or not the calibration process has ended; the first occupant detection processing unit 15, which executes the first occupant detection process using the image data and the result of the calibration process after the end of the calibration process; and the second occupant detection processing unit 16, which executes, before the end of the calibration process, the second occupant detection process that does not require the calibration process, using the image data. Thereby, a delay in the execution timing of occupant detection with respect to the occurrence timing of an event such as the start of the vehicle 1 can be suppressed.
  • In the first occupant detection process, a plurality of element values E′ corresponding to a plurality of elements are calculated by executing image recognition processing on the captured image. The calculation of the values E′ is based on the plurality of reference values R, and the calibration process sets the plurality of reference values R by executing image recognition processing on the captured image.
  • The second occupant detection process is based on pattern recognition, that is, on comparison between the image data and each of a plurality of dictionary data generated by machine learning. Accordingly, it is possible to realize a second occupant detection process that is less accurate than the first occupant detection process but does not require a prior calibration process.
  • The occupant detection method of Embodiment 1 includes: step ST1, in which the calibration processing unit 13 executes the calibration process for the first occupant detection process using image data indicating an image captured by the camera 2 for imaging in the vehicle interior; step ST2, in which the end determination unit 14 determines whether or not the calibration process has ended; step ST3, in which the first occupant detection processing unit 15 executes the first occupant detection process using the image data and the result of the calibration process after the end of the calibration process; and step ST4, in which the second occupant detection processing unit 16 executes, before the end of the calibration process, the second occupant detection process that does not require the calibration process, using the image data. Thereby, the same effects as those of the occupant detection device 100 described above can be obtained.
  • The occupant detection system 300 of Embodiment 1 includes the camera 2 for imaging in the vehicle interior and the occupant detection device 100, which has: the calibration processing unit 13, which executes the calibration process for the first occupant detection process using image data indicating an image captured by the camera 2; the end determination unit 14, which determines whether or not the calibration process has ended; the first occupant detection processing unit 15, which executes the first occupant detection process using the image data and the result of the calibration process after the end of the calibration process; and the second occupant detection processing unit 16, which executes, before the end of the calibration process, the second occupant detection process that does not require the calibration process, using the image data. Thereby, the same effects as those of the occupant detection device 100 described above can be obtained.
  • FIG. 5 is a block diagram illustrating a state in which an occupant detection system according to Embodiment 2 is provided in a vehicle.
  • With reference to FIG. 5, an occupant detection system 300a according to Embodiment 2 will be described.
  • In FIG. 5, the same blocks as those shown in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted.
  • The occupant detection device 100a has a function of executing personal authentication processing for each occupant in the vehicle 1 using the image data output by the image data acquisition unit 11.
  • The result information storage unit 18a stores information indicating the result of the personal authentication process (that is, result information) when the occupant detection device 100a executes the personal authentication process.
  • The result information storage unit 18a stores information indicating the result of the first occupant detection process (that is, result information) when the first occupant detection processing unit 15 executes the first occupant detection process.
  • The first occupant detection process includes, for example, occupant detection, boarding position detection, physique detection, and child seat detection.
  • Information indicating the result of physique detection in the first occupant detection process is stored in the result information storage unit 18a in association with information indicating the result of the personal authentication process. That is, information indicating the result of physique detection in the first occupant detection process is stored in the result information storage unit 18a for each individual.
  • The second occupant detection processing unit 16a executes the second occupant detection process.
  • The second occupant detection process includes, for example, occupant detection, boarding position detection, physique detection, and child seat detection.
  • The methods of occupant detection, boarding position detection, and child seat detection used by the second occupant detection processing unit 16a are the same as those used by the second occupant detection processing unit 16 shown in FIG. 1 (that is, based on pattern recognition), and the description thereof is omitted.
  • However, the physique detection method used by the second occupant detection processing unit 16a differs from that used by the second occupant detection processing unit 16 shown in FIG. 1. That is, the second occupant detection processing unit 16a acquires, from the result information stored in the result information storage unit 18a, the information indicating the result of physique detection in the first occupant detection process executed in the past.
  • Using the acquired information, the second occupant detection processing unit 16a adopts, as the result of physique detection in the second occupant detection process, the result of physique detection obtained in the past first occupant detection process for the same person as each occupant currently in the vehicle 1. This is because the probability that the physique of the same person changes greatly between the execution of the past first occupant detection process and the execution of the current second occupant detection process is low.
  • The result information storage unit 18a stores information indicating the result of the second occupant detection process (that is, result information) when the second occupant detection processing unit 16a executes the second occupant detection process. However, information indicating the result of physique detection in the second occupant detection process is excluded from the storage target.
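  • A minimal sketch of this reuse follows, assuming the personal authentication process yields a person ID under which the physique result of the past first occupant detection process is stored; the fallback to pattern recognition corresponds to the alternative mentioned below for Embodiment 2, and all names are illustrative.

```python
def physique_for_occupant(person_id, frame,
                          past_physique_by_person, pattern_recognition_physique):
    """Reuse the physique detected for the same authenticated person by a
    past first occupant detection process; fall back to pattern-recognition
    physique detection (as in Embodiment 1) when no stored result exists."""
    if person_id in past_physique_by_person:
        return past_physique_by_person[person_id]
    return pattern_recognition_physique(frame)
```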
  • The calibration processing unit 13, the end determination unit 14, the first occupant detection processing unit 15, and the second occupant detection processing unit 16a constitute the main part of the occupant detection device 100a.
  • The image data acquisition unit 11, the start determination unit 12, the dictionary data storage unit 17, the result information storage unit 18a, and the occupant detection device 100a constitute the main part of the control device 200a.
  • The camera 2 and the control device 200a constitute the main part of the occupant detection system 300a.
  • The function of the second occupant detection processing unit 16a may be realized by the processor 31 and the memory 32, or may be realized by the processing circuit 34.
  • The function of the result information storage unit 18a is realized by the memory 33.
  • In step ST4, the second occupant detection processing unit 16a uses the result of physique detection in the first occupant detection process executed in the past as the result of physique detection in the second occupant detection process.
  • Alternatively, the second occupant detection processing unit 16a may execute physique detection by pattern recognition similar to that of the second occupant detection processing unit 16 shown in FIG. 1.
  • In that case, information indicating the result of physique detection in the second occupant detection process may be included in the storage target.
  • The contents of the first occupant detection process are not limited to occupant detection, boarding position detection, physique detection, and child seat detection.
  • The contents of the second occupant detection process are not limited to occupant detection, boarding position detection, physique detection, and child seat detection.
  • For another detection different from physique detection, the second occupant detection processing unit 16a may likewise use the result of the first occupant detection process executed in the past as the result of the second occupant detection process.
  • The occupant detection device 100a, the control device 200a, and the occupant detection system 300a can employ various modifications similar to those described in Embodiment 1.
  • As described above, in the occupant detection device 100a of Embodiment 2, the second occupant detection process uses the result of the first occupant detection process executed in the past as the result of the second occupant detection process.
  • Thereby, for physique detection, it is possible to realize a second occupant detection process that does not require a prior calibration process and has accuracy equivalent to that of the first occupant detection process.
  • The occupant detection device, occupant detection method, and occupant detection system of the present invention can be applied to, for example, the operation control of an airbag in a vehicle.

Abstract

An occupant detection device (100) includes: a calibration processing unit (13) that uses image data representing an image captured by an in-vehicle imaging camera (2) to execute a calibration process for a first occupant detection process; an end determination unit (14) that determines whether the calibration process has ended; a first occupant detection processing unit (15) that uses the image data and the result of the calibration process to execute the first occupant detection process after the calibration process has ended; and a second occupant detection processing unit (16) that uses the image data to execute, before the end of the calibration process, a second occupant detection process in which the calibration process is not required.
PCT/JP2018/016455 2018-04-23 2018-04-23 Occupant detection device, occupant detection method, and occupant detection system WO2019207625A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020515326A JPWO2019207625A1 (ja) 2018-04-23 2018-04-23 Occupant detection device, occupant detection method, and occupant detection system
PCT/JP2018/016455 WO2019207625A1 (fr) 2018-04-23 2018-04-23 Occupant detection device, occupant detection method, and occupant detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/016455 WO2019207625A1 (fr) 2018-04-23 2018-04-23 Occupant detection device, occupant detection method, and occupant detection system

Publications (1)

Publication Number Publication Date
WO2019207625A1 (fr) 2019-10-31

Family

ID=68293608

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/016455 WO2019207625A1 (fr) 2018-04-23 2018-04-23 Occupant detection device, occupant detection method, and occupant detection system

Country Status (2)

Country Link
JP (1) JPWO2019207625A1 (fr)
WO (1) WO2019207625A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023084738A1 (fr) * 2021-11-12 2023-05-19 三菱電機株式会社 Physique determination device and physique determination method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010029416A1 (en) * 1992-05-05 2001-10-11 Breed David S. Vehicular component control systems and methods
JP2001338282A (ja) * 2000-05-24 2001-12-07 Tokai Rika Co Ltd 乗員検知システム
JP2002008021A (ja) * 2000-06-16 2002-01-11 Tokai Rika Co Ltd 乗員検知システム
JP2006527354A (ja) * 2003-04-24 2006-11-30 ローベルト ボツシユ ゲゼルシヤフト ミツト ベシユレンクテル ハフツング 画像センサのキャリブレーションのための装置及び方法
JP2007153035A (ja) * 2005-12-01 2007-06-21 Auto Network Gijutsu Kenkyusho:Kk 乗員着座判定システム
JP2007198929A (ja) * 2006-01-27 2007-08-09 Hitachi Ltd 車両内状態検知システム,車両内状態検知装置および方法
JP2009107527A (ja) * 2007-10-31 2009-05-21 Denso Corp 車両の乗員検出装置
JP2010195139A (ja) * 2009-02-24 2010-09-09 Takata Corp 乗員拘束制御装置および乗員拘束制御方法
WO2012172865A1 (fr) * 2011-06-17 2012-12-20 本田技研工業株式会社 Dispositif de détection d'occupant
JP2014040198A (ja) * 2012-08-23 2014-03-06 Nissan Motor Co Ltd 車両用乗員検知装置及び車両用乗員検知方法

Also Published As

Publication number Publication date
JPWO2019207625A1 (ja) 2020-12-03

Similar Documents

Publication Publication Date Title
CN111469802B (zh) 座椅安全带状态确定系统和方法
US11120283B2 (en) Occupant monitoring device for vehicle and traffic system
US7983475B2 (en) Vehicular actuation system
US6493620B2 (en) Motor vehicle occupant detection system employing ellipse shape models and bayesian classification
JP5059551B2 (ja) 車両の乗員検出装置
EP3560770A1 (fr) Appareil de détermination d'informations d'occupant
US20180147955A1 (en) Passenger detection device and passenger detection program
JP2019057247A (ja) 画像処理装置及びプログラム
WO2021240777A1 (fr) Dispositif de détection d'occupant et procédé de détection d'occupant
JP6667743B2 (ja) 乗員検知装置、乗員検知システム及び乗員検知方法
WO2019207625A1 (fr) Dispositif de détection d'occupant, procédé de détection d'occupant et système de détection d'occupant
JP2018147329A (ja) 画像処理装置、画像処理システム、及び画像処理方法
US11146784B2 (en) Abnormality detection device and abnormality detection method
US11673559B2 (en) Disembarkation action determination device, vehicle, disembarkation action determination method, and non-transitory storage medium stored with program
WO2021240769A1 (fr) Dispositif de détection de passager et procédé de détection de passager
JP2019074964A (ja) 運転不能状態予測装置及び運転不能状態予測システム
US11983952B2 (en) Physique determination apparatus and physique determination method
JP7363758B2 (ja) 状態監視装置及び状態監視プログラム
JP2019074963A (ja) 所定部位検出装置及び所定部位検出システム
US20220319199A1 (en) Physique determination apparatus and physique determination method
JP7359084B2 (ja) 感情推定装置、感情推定方法及びプログラム
JP7259550B2 (ja) 物体位置検出装置
WO2024048185A1 (fr) Dispositif d'authentification d'occupant, procédé d'authentification d'occupant, et support lisible par ordinateur
US20230408679A1 (en) Occupant determination apparatus and occupant determination method
WO2020166002A1 (fr) Dispositif de détection d'état d'occupant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18916228

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020515326

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18916228

Country of ref document: EP

Kind code of ref document: A1