WO2024034109A1 - Physique determination device and physique determination method - Google Patents

Physique determination device and physique determination method

Info

Publication number
WO2024034109A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
physique
determination
unit
skeletal
Prior art date
Application number
PCT/JP2022/030703
Other languages
English (en)
Japanese (ja)
Inventor
晃平 鎌田
祥平 高原
直哉 馬場
大暉 市川
Original Assignee
三菱電機株式会社
Application filed by 三菱電機株式会社
Priority to PCT/JP2022/030703
Publication of WO2024034109A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/015 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/26 Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present disclosure relates to an occupant physique determination device and a physique determination method.
  • Patent Document 1 discloses an in-vehicle monitoring device that, based on a captured image of the interior of a vehicle, estimates the posture of an occupant from a physique estimation index including the coordinates of a plurality of feature points of the occupant extracted from the captured image by deep learning or the like, and estimates the physique of the occupant based on the estimated posture.
  • The present disclosure has been made in order to solve the above-mentioned problem, and an object of the present disclosure is to provide a physique determination device that prevents a decrease in the accuracy of determining the physique of an occupant even if the posture of the occupant collapses.
  • The physique determination device includes: an image acquisition unit that acquires a captured image of an occupant of a vehicle; a skeletal point extraction unit that extracts skeletal points of the occupant, indicating body parts of the occupant, based on the captured image acquired by the image acquisition unit; a normal seating determination unit that determines whether or not the occupant's posture is in a normal seating state based on situation information regarding the situation of the vehicle or the occupant; a feature amount calculation unit that determines, based on information regarding the skeletal points of the occupant extracted by the skeletal point extraction unit and the determination result of the normal seating determination unit, whether or not the extracted skeletal points are normal seating skeletal points, that is, skeletal points extracted based on an image taken while the occupant was in the normal seating state, and that, when the extracted skeletal points are determined to be normal seating skeletal points, calculates a feature amount for physique determination based on information regarding the normal seating skeletal points; and a physique determination unit that determines the physique of the occupant based on the feature amount for physique determination calculated by the feature amount calculation unit.
  • the physique determination device can prevent the accuracy of determining the physique of the occupant from decreasing even if the occupant's posture collapses.
  • FIG. 1 is a diagram illustrating a configuration example of a system in which the physique determination device according to the first embodiment is used.
  • FIG. 2 is a diagram illustrating a configuration example of the physique determination device according to the first embodiment.
  • FIG. 3 is a flowchart for explaining an example of the operation of the physique determination device according to the first embodiment.
  • FIG. 4 is a diagram illustrating a configuration example of the physique determination device in a case where inclination information is used as the situation information in the first embodiment.
  • FIG. 5 is a diagram illustrating a configuration example of the physique determination device in a case where vehicle information is used as the situation information in the first embodiment.
  • FIGS. 6A and 6B are diagrams illustrating an example of the hardware configuration of the physique determination device according to the first embodiment.
  • FIG. 1 is a diagram showing a configuration example of a system in which a physique determination device 100 according to the first embodiment is used.
  • The physique determination device 100 according to the first embodiment is connected to an imaging device 200, a seatbelt control device 300, an airbag control device 400, a left-behind detection device 500, and an output device 600.
  • the physique determination device 100 is mounted on a vehicle.
  • the seatbelt control device 300, the airbag control device 400, the left-behind detection device 500, and the output device 600 are mounted on the vehicle.
  • the imaging device 200 is, for example, a near-infrared camera or a visible light camera, and images the occupant of the vehicle.
  • the imaging device 200 may be shared with an imaging device included in a so-called "Driver Monitoring System (DMS)" that is installed in a vehicle to monitor the condition of a driver in the vehicle.
  • the imaging device 200 is installed to be able to image an area within the vehicle that includes at least an area where the upper body of a vehicle occupant should be present.
  • the range in which the upper body of the occupant in the vehicle should exist is, for example, a range corresponding to the space near the backrest of the seat and the front of the headrest.
  • For example, the imaging device 200 is installed at the center of the instrument panel of the vehicle so that it can image the occupants of the front seats, that is, the occupant of the driver's seat (the driver) and the occupant of the front passenger seat (hereinafter referred to as the "passenger seat occupant"). Note that this is just an example; the imaging device 200 may be installed so as to image only the driver, or, for example, so as to image an occupant in a rear seat.
  • The physique determination device 100 determines the physique of a vehicle occupant based on a captured image of the occupant taken by the imaging device 200. At this time, the physique determination device 100 determines whether or not the posture of the occupant is in a normal seating state, and determines the physique of the occupant based on a captured image taken while the occupant is in the normal seating state.
  • Here, "normal seating" means a state in which the occupant is seated on the seat normally, with no posture collapse.
  • a state in which the occupant's face is not facing forward is also considered to be a posture collapse.
  • After determining the physique of the vehicle occupant, the physique determination device 100 outputs the result of the determination (hereinafter referred to as the "physique determination result") to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500. Furthermore, the physique determination device 100 outputs, to the output device 600, information for outputting a message or the like urging the occupant to sit properly. Details of a configuration example of the physique determination device 100 will be described later with reference to FIG. 2. In the following description of the first embodiment, an occupant of the vehicle is also simply referred to as an "occupant."
  • the seat belt control device 300 controls the seat belt.
  • the seatbelt control device 300 has a seatbelt reminder function that outputs a warning in consideration of the occupant's physique based on the physique determination result output from the physique determination device 100.
  • The airbag control device 400 is, for example, an ECU (Electronic Control Unit) for an airbag system, and controls the airbag based on the physique determination result output from the physique determination device 100.
  • The left-behind detection device 500 detects whether an infant or the like has been left behind in the vehicle. For example, the left-behind detection device 500 determines whether an infant or the like has been left behind in the vehicle based on the physique determination result output from the physique determination device 100.
  • the output device 600 is, for example, a display device or an audio output device provided in a center cluster, a meter panel, or the like.
  • the output device 600 displays a message urging the occupant to sit properly, outputs the message in audio, or outputs a warning sound urging the occupant to sit properly.
  • FIG. 2 is a diagram showing a configuration example of the physique determination device 100 according to the first embodiment.
  • The physique determination device 100 includes an image acquisition unit 101, a skeletal point extraction unit 102, a situation information acquisition unit 103, a normal seating determination unit 104, a feature amount calculation unit 105, a feature amount leveling unit 106, a physique determination unit 107, and a posture warning unit 108.
  • the situation information acquisition unit 103 includes a face orientation detection unit 1031.
  • the image acquisition unit 101 acquires a captured image of a passenger captured by the imaging device 200.
  • the image acquisition unit 101 outputs the acquired captured image to the skeleton point extraction unit 102.
  • The skeletal point extraction unit 102 extracts skeletal points of the occupant, which indicate body parts of the occupant, based on the captured image acquired by the image acquisition unit 101. Specifically, the skeletal point extraction unit 102 extracts skeletal points of the occupant that indicate joint points determined for each part of the occupant's body, based on the captured image acquired by the image acquisition unit 101. The joint points determined for each part of the occupant's body include, for example, a nose joint point, a neck joint point, shoulder joint points, elbow joint points, waist joint points, wrist joint points, knee joint points, and ankle joint points. The joint points are determined in advance.
  • The skeletal point extraction unit 102 may extract the skeletal points of the occupant by a known method, such as an image recognition technique or a technique using a trained model (hereinafter referred to as a "machine learning model").
  • a skeleton point is, for example, a point in a captured image, and is expressed by coordinates in the captured image.
  • the skeletal point extraction unit 102 detects the coordinates of a skeletal point of the occupant and which part of the body of the occupant the skeletal point indicates.
  • the skeletal point extraction unit 102 extracts skeletal points for each occupant.
  • an area corresponding to each seat (hereinafter referred to as a "seat corresponding area") is set in advance for each seat.
  • the seat corresponding area is set in advance according to the installation position of the imaging device 200 and the angle of view.
  • the skeleton point extraction unit 102 extracts skeleton points for each seat corresponding area. For example, if a certain seat-corresponding area is a seat-corresponding area corresponding to a driver's seat, the skeleton point extraction unit 102 assumes that the skeleton points extracted in the seat-corresponding area are the skeleton points of the driver.
  • Similarly, if a certain seat-corresponding area is the seat-corresponding area corresponding to the front passenger seat, the skeletal point extraction unit 102 assumes that the skeletal points extracted in that seat-corresponding area are the skeletal points of the passenger seat occupant. In this way, the skeletal point extraction unit 102 extracts skeletal points for each occupant, for example, by extracting skeletal points from the seat-corresponding area for each seat-corresponding area.
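  • As a non-limiting illustration (not part of the patent disclosure), the following Python sketch shows one way extracted skeletal points could be grouped by seat-corresponding area; the area coordinates, data structures, and function names are assumptions.

```python
# Illustrative sketch: assign extracted skeletal points to occupants via preset
# seat-corresponding areas. The area coordinates are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class SkeletalPoint:
    part: str        # e.g. "neck", "right_shoulder"
    x: float         # pixel coordinates in the captured image
    y: float

# Seat-corresponding areas, preset from the camera's mounting position and angle
# of view: (x_min, y_min, x_max, y_max) in image coordinates (assumed values).
SEAT_AREAS = {
    "driver_seat": (0, 0, 640, 720),
    "passenger_seat": (640, 0, 1280, 720),
}

def group_points_by_seat(points: list) -> dict:
    """Treat every skeletal point falling inside a seat-corresponding area as
    belonging to the occupant of that seat."""
    per_seat = {seat: [] for seat in SEAT_AREAS}
    for p in points:
        for seat, (x0, y0, x1, y1) in SEAT_AREAS.items():
            if x0 <= p.x < x1 and y0 <= p.y < y1:
                per_seat[seat].append(p)
                break
    return per_seat
```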
  • Skeletal point extraction section 102 outputs information regarding the extracted skeleton points (hereinafter referred to as "skeletal point information") to situation information acquisition section 103.
  • the skeletal point information includes information in which information indicating the occupant, information indicating the coordinates of the skeletal point of the occupant, and information indicating which part of the body of the occupant the skeletal point indicates are associated; Also included are captured images acquired by the image acquisition unit 101.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • The situation information acquisition unit 103 acquires information regarding the situation of the occupant (hereinafter referred to as "situation information").
  • the situation information is, for example, information regarding the direction of the occupant's face.
  • the face orientation detection unit 1031 detects the passenger's face orientation based on the captured image acquired by the image acquisition unit 101.
  • the captured image acquired by the image acquisition unit 101 is included in the skeleton point information.
  • The face orientation detection unit 1031 may detect the occupant's face orientation by a known method, such as an image recognition technique or a technique using a machine learning model.
  • the direction of the occupant's face is expressed, for example, by an angle. Note that the face orientation detection unit 1031 detects the face orientation for each passenger.
  • the situation information acquisition unit 103 acquires information regarding the passenger's face direction detected by the face direction detection unit 1031 (hereinafter referred to as “face direction information”) as situation information.
  • the face orientation information includes information in which information indicating the occupant and information indicating the face orientation are associated with each other.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • the face direction detection unit 1031 may determine which seat the occupant is seated in based on the seat corresponding area.
  • the face orientation information may also include a captured image acquired by the image acquisition unit 101.
  • the situation information acquisition unit 103 outputs the acquired situation information, here face orientation information, to the normal seating determination unit 104.
  • the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the situation information acquired by the situation information acquisition unit 103. Specifically, here, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the face orientation information. Note that the normal seating determination unit 104 determines for each passenger whether or not the passenger's posture is in a normal seating state. The normal seating determination unit 104 can identify the facial orientation of each occupant from the facial orientation information.
  • For example, the normal seating determination unit 104 determines whether or not the occupant's posture is in the normal seating state using predefined conditions (hereinafter referred to as the "conditions for determining normal seating").
  • the conditions for determining normal seating include, for example, conditions under which the occupant's posture is considered to be normal seating.
  • the normal seating determination conditions are generated in advance by an administrator or the like and stored in a location that can be referenced by the physique determination device 100.
  • A condition such as <Condition 1> below is defined as the condition for determining normal seating.
  • <Condition 1> The occupant's face is facing forward. If the occupant's face orientation is within a predetermined range with the front as a reference (referred to as the "face orientation determination range"), the occupant's face is assumed to be facing the front direction. Note that the face orientation determination range in <Condition 1> is set in advance as an appropriate range within which the face is assumed to be facing the front direction.
  • Based on the face orientation information, the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state when the occupant's face orientation satisfies the conditions for determining normal seating. On the other hand, if the occupant's face orientation does not satisfy the conditions for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
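  • As a non-limiting illustration, the following Python sketch expresses <Condition 1>; the face orientation determination range values are assumptions, not values taken from the disclosure.

```python
# Illustrative sketch of <Condition 1>: the occupant is treated as facing forward
# when the detected face orientation lies within a preset face orientation
# determination range centered on the straight-ahead direction (assumed values).
FACE_YAW_RANGE_DEG = (-20.0, 20.0)     # assumed face orientation determination range (yaw)
FACE_PITCH_RANGE_DEG = (-15.0, 15.0)   # assumed face orientation determination range (pitch)

def is_normal_seating_by_face(yaw_deg: float, pitch_deg: float) -> bool:
    """Return True when the face orientation satisfies <Condition 1>,
    i.e. the occupant's face is regarded as facing forward."""
    yaw_ok = FACE_YAW_RANGE_DEG[0] <= yaw_deg <= FACE_YAW_RANGE_DEG[1]
    pitch_ok = FACE_PITCH_RANGE_DEG[0] <= pitch_deg <= FACE_PITCH_RANGE_DEG[1]
    return yaw_ok and pitch_ok
```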
  • the normal seating determination unit 104 outputs a determination result (hereinafter referred to as a “seating state determination result”) as to whether or not the occupant's posture is determined to be a normal seating state to the feature value calculation unit 105.
  • normal seating determination section 104 outputs the skeleton point information output from skeleton point extraction section 102 to feature amount calculation section 105 together with the seating state determination result. Further, the normal seating determination unit 104 outputs the seating state determination result to the posture attention unit 108 .
  • The feature amount calculation unit 105 determines, based on the skeletal point information output from the skeletal point extraction unit 102 and the seating state determination result of the normal seating determination unit 104 as to whether or not the occupant's posture is in the normal seating state, whether or not the occupant's skeletal points extracted by the skeletal point extraction unit 102 are skeletal points extracted based on an image taken while the occupant was in the normal seating state (hereinafter referred to as "normal seating skeletal points").
  • When the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state, the feature amount calculation unit 105 determines that the skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points, that is, that the skeletal point information output from the skeletal point extraction unit 102 is normal seating skeletal point information. Note that in the first embodiment the feature amount calculation unit 105 acquires the skeletal point information output from the skeletal point extraction unit 102 via the normal seating determination unit 104, but this is only an example. For example, the feature amount calculation unit 105 may acquire the skeletal point information directly from the skeletal point extraction unit 102; in that case, in FIG. 2, an arrow would be drawn from the skeletal point extraction unit 102 toward the feature amount calculation unit 105.
  • When the feature amount calculation unit 105 determines that the skeletal points of the occupant extracted by the skeletal point extraction unit 102 are normal seating skeletal points, the feature amount calculation unit 105 calculates, based on the skeletal point information of the normal seating skeletal points (hereinafter referred to as "normal seating skeletal point information"), a feature amount used for determining the occupant's physique (hereinafter referred to as the "feature amount for physique determination").
  • the feature amount calculation unit 105 calculates a feature amount for physique determination for each occupant.
  • the feature amount calculation unit 105 can identify the skeletal points of each occupant from the normal seating skeletal point information. Note that if the feature amount calculation unit 105 does not determine that the occupant's skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points, the feature amount calculation unit 105 does not calculate the feature amount for physique determination.
  • the feature amount calculation unit 105 calculates the feature amount for physique determination from the length in the captured image of a line segment connecting two of the skeletal points in the captured image, based on the normal sitting skeletal point information.
  • the feature amount calculation unit 105 calculates the arm length, shoulder width, and neck length of the occupant in the captured image as the feature amount for physique determination based on the normal sitting skeletal point information.
  • the feature amount calculation unit 105 may set the length of a line segment connecting a skeletal point indicating a shoulder and a skeletal point indicating an elbow as the length of the arm of the occupant as the feature amount for physique determination.
  • As the shoulder width to be used as the feature amount for physique determination, the feature amount calculation unit 105 may set, for example, the length of a line segment connecting the skeletal points indicating both shoulders, or the combined length of a line segment connecting the skeletal point indicating the neck and the skeletal point indicating the right shoulder and a line segment connecting the skeletal point indicating the neck and the skeletal point indicating the left shoulder.
  • the feature quantity calculation unit 105 may, for example, set the length of the line segment connecting the skeletal point indicating the nose and the skeletal point indicating the neck as the length of the neck to be used as the feature quantity for physique determination.
  • Note that the feature amount calculation unit 105 does not need to calculate all of the occupant's arm length, shoulder width, and neck length as feature amounts for physique determination; it may calculate at least one of the arm length, shoulder width, and neck length as the feature amount for physique determination. Further, the feature amount calculation unit 105 may use the length of a line segment connecting two skeletal points other than those for the occupant's arm length, shoulder width, and neck length as the feature amount for physique determination.
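  • As a non-limiting illustration, the following Python sketch computes the feature amounts described above as lengths of line segments between skeletal points in image coordinates; the dictionary keys and helper names are assumptions.

```python
# Illustrative sketch of the feature amount calculation: each feature is the
# length, in image coordinates, of a line segment connecting two skeletal points.
import math

def segment_length(p1, p2) -> float:
    # Euclidean distance between two (x, y) points in the captured image.
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def physique_features(pts: dict) -> dict:
    """pts maps body-part names to (x, y) coordinates of normal seating skeletal points."""
    return {
        # arm length: skeletal point indicating a shoulder to the one indicating the elbow
        "arm_length": segment_length(pts["right_shoulder"], pts["right_elbow"]),
        # shoulder width: line segment connecting the skeletal points of both shoulders
        "shoulder_width": segment_length(pts["left_shoulder"], pts["right_shoulder"]),
        # neck length: skeletal point indicating the nose to the one indicating the neck
        "neck_length": segment_length(pts["nose"], pts["neck"]),
    }
```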
  • the feature quantity calculation unit 105 outputs information regarding the calculated feature quantity for physique determination (hereinafter referred to as “feature quantity information”) to the feature quantity leveling unit 106.
  • In the feature amount information, information indicating the occupant is associated with the feature amount for physique determination.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • The feature amount leveling unit 106 levels a plurality of feature amounts for physique determination for a past preset period (hereinafter referred to as the "leveling period"), in other words, the plurality of feature amounts for physique determination calculated by the feature amount calculation unit 105 during the leveling period.
  • the leveling period can be set to an appropriate length.
  • the leveling period is set by an administrator or the like, and is stored in a location that can be referenced by the feature amount leveling unit 106.
  • the feature leveling unit 106 associates the feature information output from the feature calculating unit 105 with the acquisition date and time of the feature information, and stores it at a location where the feature leveling unit 106 can refer to it. Memorize them in sequence.
  • The feature amount leveling unit 106 may acquire the plurality of feature amounts for physique determination in the leveling period from the stored feature amount information. Note that when feature amounts for physique determination covering the whole leveling period have not yet been stored, such as immediately after the vehicle starts running, the feature amount leveling unit 106 may, for example, treat only the feature amounts for physique determination stored so far as the feature amounts for the leveling period.
  • Note that the feature amount information stored by the feature amount leveling unit 106 is deleted, for example, when the power of the vehicle is turned on.
  • The feature amount leveling unit 106 levels the plurality of feature amounts for physique determination for the leveling period, for example, by averaging them, taking their median value, or taking the value corresponding to a set percentile.
  • Specifically, the feature amount leveling unit 106 calculates, as the feature amount for physique determination after leveling (hereinafter referred to as the "leveled feature amount for physique determination"), the average value of the plurality of feature amounts for physique determination for the leveling period, the median value of those feature amounts, or the value corresponding to a set percentile of those feature amounts.
  • the feature amount leveling unit 106 calculates a post-leveling physique determination feature amount for each occupant.
  • the feature amount leveling unit 106 can specify the feature amount for determining the physique of each occupant from the feature amount information.
  • When the feature amount calculation unit 105 calculates a plurality of types of feature amounts for physique determination, such as the occupant's arm length, shoulder width, and neck length, the feature amount leveling unit 106 levels each type of feature amount separately.
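  • As a non-limiting illustration, the following Python sketch levels each type of feature amount over the leveling period by mean, median, or a set percentile; the leveling period length and percentile value are assumptions.

```python
# Illustrative sketch of the feature amount leveling.
import statistics
from collections import deque

LEVELING_PERIOD_FRAMES = 100   # assumed length of the leveling period (in frames)

class FeatureLeveler:
    """Levels each type of feature amount (arm length, shoulder width, ...) separately."""

    def __init__(self, method: str = "mean", percentile: float = 50.0):
        self.method = method              # "mean", "median", or "percentile"
        self.percentile = percentile
        self.history = deque(maxlen=LEVELING_PERIOD_FRAMES)

    def add(self, features: dict) -> None:
        # Store the feature amounts calculated for one captured image.
        self.history.append(features)

    def leveled(self) -> dict:
        if not self.history:
            return {}
        out = {}
        for name in self.history[0]:
            values = sorted(f[name] for f in self.history)
            if self.method == "mean":
                out[name] = statistics.fmean(values)
            elif self.method == "median":
                out[name] = statistics.median(values)
            else:  # value corresponding to a set percentile
                idx = min(len(values) - 1, int(len(values) * self.percentile / 100))
                out[name] = values[idx]
        return out
```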
  • By using the leveled feature amounts for physique determination, the physique determination device 100 can reduce the influence that errors in individual feature amounts for physique determination have on the determination of the occupant's physique, which is performed by the physique determination unit 107. Details of the physique determination unit 107 will be described later.
  • For example, when the feature amount leveling unit 106 levels the plurality of feature amounts for physique determination by averaging them over the leveling period, an accurate value, in other words, a value closer to the feature amount corresponding to the occupant's actual physique, can be obtained as the leveled feature amount for physique determination even if the number of feature amounts is small. Therefore, in this case, the administrator or the like can, for example, narrow the face orientation determination range defined in the conditions for determining normal seating used by the normal seating determination unit 104 to determine whether or not the occupant's posture is in the normal seating state.
  • If the face orientation determination range is narrowed, the frequency with which the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state decreases; however, even if only a small number of feature amounts for physique determination are obtained while the occupant's posture is determined to be in the normal seating state, an accurate leveled feature amount for physique determination can be calculated from that small number of feature amounts. As a result, the physique determination device 100 can prevent a decrease in the accuracy of determining the occupant's physique using the leveled feature amount for physique determination.
  • On the other hand, when the feature amount leveling unit 106 levels the plurality of feature amounts for physique determination by taking their median value for the leveling period, or by taking the value corresponding to a set percentile, an accurate value, in other words, a value closer to the feature amount corresponding to the occupant's actual physique, can be obtained as the leveled feature amount for physique determination when the number of feature amounts is large. Therefore, in this case, the administrator or the like can, for example, widen the face orientation determination range defined in the conditions for determining normal seating used by the normal seating determination unit 104 to determine whether or not the occupant's posture is in the normal seating state.
  • If the face orientation determination range is widened, the normal seating determination unit 104 more easily determines that the occupant's posture is in the normal seating state. Although this is assumed to lower the accuracy of determining the normal seating state itself, the feature amount leveling unit 106 can then calculate an accurate leveled feature amount for physique determination from the many feature amounts for physique determination obtained while the occupant's posture is determined to be in the normal seating state. As a result, the physique determination device 100 can prevent a decrease in the accuracy of determining the occupant's physique using the leveled feature amount for physique determination.
  • the feature leveling unit 106 outputs information regarding the leveled physique determination feature (hereinafter referred to as “leveled feature information”) to the physique determining unit 107.
  • In the leveled feature amount information, information indicating the occupant is associated with the leveled feature amount for physique determination for each occupant.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • The physique determination unit 107 determines the physique of the occupant based on the leveled feature amount for physique determination leveled by the feature amount leveling unit 106.
  • the physique determination unit 107 determines the physique of each occupant.
  • the physique determination unit 107 can identify the leveled physique determination feature of each occupant from the leveled feature information.
  • The physique determination unit 107 may determine the physique of the occupant by a known method, such as an image recognition technique or a technique using a machine learning model.
  • the physique determining unit 107 determines the physique of the occupant, such as "infant”, “small”, “normal”, or “large”, using a known method. Note that this is just an example, and the definition of the physique determined by the physique determination unit 107 can be set as appropriate.
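  • As a non-limiting illustration, the following Python sketch maps a leveled feature amount to the physique classes named above; the thresholds are purely hypothetical, since the disclosure does not specify them.

```python
# Illustrative sketch of the physique determination: the leveled shoulder-width
# feature is mapped to one of the classes "infant", "small", "normal", "large".
# All threshold values below are assumed for illustration only.
def determine_physique(leveled: dict) -> str:
    shoulder = leveled.get("shoulder_width", 0.0)   # length in pixels on the captured image
    if shoulder < 80:
        return "infant"
    if shoulder < 120:
        return "small"
    if shoulder < 160:
        return "normal"
    return "large"
```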
  • the physique determination unit 107 outputs the physique determination result to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.
  • In the physique determination result, information indicating the occupant is associated with information indicating the occupant's physique.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • If the state in which the normal seating determination unit 104 determines that the occupant's posture is not the normal seating state continues for a preset period of time (hereinafter referred to as the "posture abnormality determination time"), the posture warning unit 108 outputs information urging the occupant to adopt a normal seated posture (hereinafter referred to as "posture caution information") to the output device 600. Note that the posture warning unit 108 can determine, based on the seating state determination result output from the normal seating determination unit 104, whether the normal seating determination unit 104 has determined that the occupant's posture is not in the normal seating state.
  • the posture caution section 108 associates the sitting state determination result output from the regular sitting determination section 104 with the acquisition date and time of the sitting state determination result, and stores it in a chronological order in a location where the posture caution section 108 can refer to.
  • the posture warning unit 108 can determine whether or not the occupant's posture is not the normal seating state for the posture abnormality determination period. Note that the seating state determination result stored by the posture attention unit 108 is deleted, for example, when the vehicle is powered on. If the posture warning unit 108 determines that the passenger's posture is not in the normal seating state for the posture abnormality determination period, the posture warning unit 108 outputs posture caution information to the output device 600.
  • Specifically, the posture warning unit 108 causes the output device 600 to, for example, display a message urging the occupant to sit properly, output such a message as audio, or output a warning sound urging the occupant to sit properly.
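  • As a non-limiting illustration, the following Python sketch tracks how long the occupant has not been in the normal seating state and signals when the posture caution information should be output; the duration threshold is an assumption.

```python
# Illustrative sketch of the posture warning logic: posture caution information is
# emitted only when the "not normally seated" determination has persisted for the
# posture abnormality determination time (assumed value).
import time

POSTURE_ABNORMALITY_DETERMINATION_SEC = 10.0   # assumed posture abnormality determination time

class PostureWarning:
    def __init__(self):
        # Time at which "not normally seated" was first observed (None while normally seated).
        self.not_seated_since = None

    def update(self, is_normal_seating: bool, now: float = None) -> bool:
        """Return True when posture caution information should be output."""
        now = time.monotonic() if now is None else now
        if is_normal_seating:
            self.not_seated_since = None
            return False
        if self.not_seated_since is None:
            self.not_seated_since = now
        return (now - self.not_seated_since) >= POSTURE_ABNORMALITY_DETERMINATION_SEC
```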
  • the posture warning unit 108 outputs posture caution information and urges the occupant to sit properly, so that the physique determination device 100 can make the occupant maintain a correct posture. As a result, the physique determining device 100 can easily determine that the occupant's posture is in a normal seating state.
  • FIG. 3 is a flowchart for explaining an example of the operation of the physique determination device 100 according to the first embodiment.
  • the physique determination device 100 periodically repeats the operation shown in the flowchart of FIG. 3 while the vehicle is running.
  • the physique determining device 100 may repeat the operation shown in the flowchart of FIG. 3 until the vehicle is powered off.
  • the image acquisition unit 101 acquires a captured image of a passenger captured by the imaging device 200 (step ST1).
  • the image acquisition unit 101 outputs the acquired captured image to the skeleton point extraction unit 102.
  • the skeletal point extraction unit 102 extracts the skeletal points of the occupant, which indicate the body parts of the occupant, based on the captured image acquired by the image acquisition unit 101 in step ST1 (step ST2). Skeleton point extraction section 102 outputs skeleton point information to situation information acquisition section 103.
  • the situation information acquisition unit 103 acquires situation information when the skeleton point extraction unit 102 extracts the skeleton points of the occupant in step ST2 (step ST3).
  • the face orientation detection unit 1031 detects the passenger's face orientation based on the captured image acquired by the image acquisition unit 101 in step ST1.
  • the situation information acquisition unit 103 acquires face direction information regarding the passenger's face direction detected by the face direction detection unit 1031 as situation information.
  • the situation information acquisition unit 103 outputs the acquired situation information, here face orientation information, to the normal seating determination unit 104.
  • the normal seating determination unit 104 determines whether the posture of the occupant is normal seating based on the situation information acquired by the situation information acquisition unit 103 in step ST3 (step ST4). Specifically, here, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the face orientation information. The normal seating determination unit 104 outputs the seating state determination result to the feature amount calculation unit 105. At this time, normal seating determination section 104 outputs the skeleton point information output from skeleton point extraction section 102 to feature amount calculation section 105 together with the seating state determination result. Further, the normal seating determination unit 104 outputs the seating state determination result to the posture attention unit 108 .
  • The feature amount calculation unit 105 determines, based on the skeletal point information output from the skeletal point extraction unit 102 in step ST2 and the seating state determination result obtained by the normal seating determination unit 104 in step ST4, whether or not the skeletal points of the occupant extracted by the skeletal point extraction unit 102 in step ST2 are normal seating skeletal points. Then, when it is determined that the skeletal points of the occupant extracted in step ST2 are normal seating skeletal points, the feature amount calculation unit 105 calculates the feature amount for physique determination based on the normal seating skeletal point information of those skeletal points (step ST5). The feature amount calculation unit 105 outputs the feature amount information to the feature amount leveling unit 106.
  • On the other hand, if it is not determined that the skeletal points of the occupant extracted in step ST2 are normal seating skeletal points, the feature amount calculation unit 105 does not calculate the feature amount for physique determination, and the physique determination device 100 ends the processing of the flowchart in FIG. 3.
  • The feature amount leveling unit 106 levels the plurality of feature amounts for physique determination for the past leveling period, in other words, the plurality of feature amounts for physique determination calculated by the feature amount calculation unit 105, based on the feature amount information output from the feature amount calculation unit 105 (step ST6).
  • the feature leveling unit 106 outputs the leveled feature information to the physique determining unit 107.
  • the physique determination section 107 determines the physique of the occupant based on the leveled physique determination feature amount leveled by the feature leveling section 106 in step ST6 (step ST7).
  • the physique determination unit 107 outputs the physique determination result to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.
  • The posture warning unit 108 outputs the posture caution information to the output device 600 when the state in which the normal seating determination unit 104 determines in step ST4 that the occupant's posture is not the normal seating state continues for the posture abnormality determination time (step ST8).
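  • As a non-limiting illustration, the following Python sketch strings steps ST1 to ST8 together, reusing the sketches above; every callable passed in is a hypothetical stand-in for one of the units and devices described in this embodiment.

```python
# Illustrative sketch of the processing flow of FIG. 3 (steps ST1 to ST8).
# capture, extract_points, detect_face, publish_result, and warn are assumed
# callables; leveler and posture_warning reuse the earlier sketches.
def process_frame(capture, extract_points, detect_face, leveler, posture_warning,
                  publish_result, warn):
    image = capture()                                        # ST1: acquire the captured image
    points = extract_points(image)                           # ST2: extract skeletal points
    yaw, pitch = detect_face(image)                          # ST3: acquire situation information (face orientation)
    normal_seating = is_normal_seating_by_face(yaw, pitch)   # ST4: normal seating determination
    if normal_seating:
        features = physique_features(points)                 # ST5: feature amounts for physique determination
        leveler.add(features)
        leveled = leveler.leveled()                          # ST6: level the feature amounts
        publish_result(determine_physique(leveled))          # ST7: physique determination result
    if posture_warning.update(normal_seating):               # ST8: posture caution information
        warn("Please sit properly.")
```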
  • As described above, the physique determination device 100 extracts the skeletal points of the vehicle occupant based on a captured image of the occupant, and determines whether or not the occupant's posture is in the normal seating state based on the situation information, in this case the face orientation information. The physique determination device 100 determines whether or not the extracted skeletal points are normal seating skeletal points based on the skeletal point information and the result of the determination as to whether the occupant's posture is in the normal seating state, and, when the skeletal points are determined to be normal seating skeletal points, calculates the feature amount for physique determination based on the normal seating skeletal point information. The physique determination device 100 then determines the occupant's physique based on the calculated feature amount for physique determination.
  • Here, if the physique of the occupant is determined from a captured image taken while the occupant's posture has collapsed, the determination result is likely to include an error, or the error becomes large.
  • This is because, when the occupant's posture has collapsed, the skeletal points of the occupant extracted from the captured image tend to include errors, or the errors become large; if the skeletal points include errors or the errors become large, the feature amounts calculated based on those skeletal points also tend to include errors, or the errors become large; and as a result, the physique of the occupant determined based on those feature amounts is likely to include an error, or the error becomes large. In other words, the accuracy of the physique determination decreases.
  • When the occupant is seated normally, it is assumed that the positions of the skeletal points, which are the feature points of the occupant, on the captured image fall within a certain range. Therefore, when the occupant is seated normally, errors are unlikely to occur in the occupant's skeletal points extracted from the captured image and in the feature amounts calculated from those skeletal points, and as a result, errors in the occupant's physique determined based on those feature amounts are also unlikely to occur. Even if an error occurs, its influence on the physique determination is not large.
  • the occupant's physique determination result based on the captured image is used, for example, in functions such as seatbelt control, airbag control, and abandonment detection.
  • Functions such as seatbelt control, airbag control, and left-behind detection are safety-related functions, and therefore the physique determination results used for these functions are required to have high accuracy. Therefore, a less accurate result of determining the occupant's physique, obtained while the occupant is seated with a collapsed posture, cannot be used for functions such as seatbelt control, airbag control, or left-behind detection.
  • If a physique determination device provides, for functions such as seatbelt control, airbag control, or left-behind detection, a physique determination result obtained while the occupant is sitting in the seat with a collapsed posture, there is a risk of erroneous control or erroneous detection in those functions.
  • In contrast, the physique determination device 100 determines whether or not the occupant's posture is in the normal seating state, and only when it is determined that the occupant is in the normal seating state does it use the feature amounts based on the skeletal points extracted from the captured image as the feature amounts for physique determination and determine the occupant's physique based on those feature amounts.
  • Since the physique determination device 100 calculates the feature amount for physique determination based on a captured image taken while the occupant's posture is in the normal seating state and determines the occupant's physique based on that feature amount, it can prevent a decrease in the accuracy of determining the occupant's physique.
  • the physique determining device 100 can prevent the accuracy of determining the physique of the occupant from decreasing, even if the occupant's posture collapses while the vehicle is running.
  • Since the physique determination device 100 provides an occupant physique determination result whose accuracy is prevented from deteriorating, it can provide a physique determination result that can be used for functions requiring high accuracy, such as seatbelt control, airbag control, and left-behind detection.
  • In the above description, the situation information acquisition unit 103 includes the face orientation detection unit 1031, and the face orientation detection unit 1031 detects the direction of the occupant's face based on the captured image acquired by the image acquisition unit 101.
  • However, the situation information acquisition unit 103 may acquire the occupant's face orientation information from a device provided outside the physique determination device 100, such as an occupant state detection device included in the DMS, at a location that the physique determination device 100 can refer to.
  • the imaging device 200 is shared with the DMS.
  • the situation information acquisition section 103 can be configured without the face orientation detection section 1031.
  • the situation information acquired by the situation information acquisition unit 103 is information regarding the passenger's facial orientation, but this is only an example.
  • the physique determination device 100 may use the situation information acquired by the situation information acquisition unit 103 as skeletal point information, for example.
  • Further, the normal seating determination unit 104 may have the function of the situation information acquisition unit 103. That is, it is not essential for the physique determination device 100 to include the situation information acquisition unit 103 as shown in FIG. 2.
  • the skeleton point extraction section 102 outputs skeleton point information to the normal seating determination section 104.
  • the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the skeleton point information output from the skeleton point extraction unit 102.
  • A condition such as <Condition 2> below is defined as the condition for determining normal seating.
  • <Condition 2> On the captured image, the position of the skeletal point indicating the occupant's elbow is not located within a preset range (hereinafter referred to as the "elbow position determination range").
  • The elbow position determination range in <Condition 2> is set in advance, for example, to a predetermined range including the armrest or a predetermined range near the door of the vehicle.
  • The normal seating determination unit 104 determines that the occupant's posture is in the normal seating state when the position of the skeletal point indicating the occupant's elbow satisfies the condition for determining normal seating. On the other hand, if the position of the skeletal point indicating the occupant's elbow does not satisfy the condition for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state. For example, if the skeletal point indicating the occupant's elbow is located within the elbow position determination range, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
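  • As a non-limiting illustration, the following Python sketch expresses <Condition 2>; the elbow position determination range coordinates are assumptions.

```python
# Illustrative sketch of <Condition 2>: the posture is not treated as normal
# seating when the skeletal point indicating the elbow lies inside the preset
# range, e.g. a region covering the armrest or the area near the door.
ELBOW_DETERMINATION_RANGE = (1100, 300, 1280, 500)   # assumed (x_min, y_min, x_max, y_max)

def is_normal_seating_by_elbow(elbow_xy) -> bool:
    x, y = elbow_xy
    x0, y0, x1, y1 = ELBOW_DETERMINATION_RANGE
    inside = x0 <= x < x1 and y0 <= y < y1
    return not inside   # <Condition 2>: the elbow must NOT be in the determination range
```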
  • A flowchart showing the operation of the physique determination device 100 when the situation information is the skeletal point information is the same as the flowchart shown in FIG. 3.
  • the normal seating determination unit 104 acquires the skeleton point information output by the skeleton point extraction unit 102 in step ST2 as situation information.
  • the normal seating determination unit 104 determines whether or not the occupant's posture is in a normal seating state based on the skeleton point information acquired in step ST3.
  • As described above, when the situation information is the occupant's skeletal point information, the normal seating determination unit 104 may determine whether or not the occupant's posture is in the normal seating state based on the occupant's skeletal point information extracted by the skeletal point extraction unit 102.
  • In this case as well, the physique determination device 100 calculates the feature amount for physique determination based on a captured image taken while the occupant's posture is in the normal seating state, and determines the occupant's physique based on that feature amount for physique determination; therefore, it can prevent a decrease in the accuracy of determining the occupant's physique. That is, the physique determination device 100 can prevent the accuracy of determining the physique of the occupant from decreasing even if the occupant's posture collapses while the vehicle is running.
  • the physique determination device 100 may obtain, for example, information regarding the inclination of a skeletal line connecting the skeletal points of the occupant (hereinafter referred to as "inclination information") as the situation information.
  • FIG. 4 is a diagram illustrating a configuration example of the physique determination device 100 in the first embodiment when tilt information is used as situation information.
  • The configuration example of the physique determination device 100 shown in FIG. 4 differs from the configuration example of the physique determination device 100 shown in FIG. 2 in that the situation information acquisition unit 103 includes an inclination detection unit 1032.
  • Further, in the physique determination device 100 shown in FIG. 4, the specific operation of the normal seating determination unit 104 differs from the specific operation of the normal seating determination unit 104 in the physique determination device 100 shown in FIG. 2.
  • Since the specific operations of the image acquisition unit 101, the skeletal point extraction unit 102, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture warning unit 108 are the same as those already described, repeated description is omitted.
  • the inclination detection unit 1032 detects the inclination of a skeletal line, which is a line segment connecting a plurality of skeletal points of the occupant, based on the skeletal point information output from the skeletal point extraction unit 102. For example, the inclination detection unit 1032 detects the inclination of a skeletal line of the occupant's shoulders that connects skeletal points indicating both shoulders of the occupant. Note that this is just an example, and the inclination detection unit 1032 can detect the inclination of an appropriate skeletal line, such as the inclination of a skeletal line indicating a torso (for example, a skeletal line indicating sitting height).
  • The fact that the occupant's posture has collapsed is likely to be reflected in the inclination of the skeletal line of the occupant's shoulders. Therefore, when the inclination detection unit 1032 detects the inclination of the occupant's shoulder skeletal line, erroneous determination by the normal seating determination unit 104 as to whether or not the occupant's posture is in the normal seating state can be reduced.
  • the situation information acquisition unit 103 acquires the tilt information as situation information.
  • the inclination information includes information in which information indicating the occupant is associated with information indicating the inclination of the skeleton line.
  • the tilt information may also include a captured image acquired by the image acquisition unit 101.
  • the situation information acquisition unit 103 outputs the acquired situation information, here tilt information, to the normal seating determination unit 104.
  • A condition such as <Condition 3> below is defined as the condition for determining normal seating.
  • <Condition 3> The inclination of the occupant's skeletal line is within a preset range (hereinafter referred to as the "inclination determination range"). Note that the inclination determination range in <Condition 3> is set in advance as a predetermined range based on, for example, the inclination of the skeletal line assumed when the occupant is seated facing forward.
  • the normal seating determination unit 104 determines that the posture of the occupant is normal seating if the inclination of the skeletal line of the occupant satisfies the conditions for determining normal seating. On the other hand, based on the inclination information, if the inclination of the occupant's skeleton line does not satisfy the condition for determining normal seating, the normal seating determination unit 104 determines that the posture of the occupant is not the normal seating state.
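  • As a non-limiting illustration, the following Python sketch computes the inclination of the shoulder skeletal line and checks <Condition 3>; the inclination determination range is an assumption.

```python
# Illustrative sketch of the inclination detection and <Condition 3>: the
# inclination of the skeletal line connecting both shoulders is compared with a
# preset inclination determination range (assumed value).
import math

INCLINATION_DETERMINATION_RANGE_DEG = (-10.0, 10.0)  # assumed range around the forward-facing posture

def shoulder_inclination_deg(left_shoulder, right_shoulder) -> float:
    """Angle of the shoulder skeletal line with respect to the image horizontal."""
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    return math.degrees(math.atan2(dy, dx))

def is_normal_seating_by_inclination(left_shoulder, right_shoulder) -> bool:
    angle = shoulder_inclination_deg(left_shoulder, right_shoulder)
    low, high = INCLINATION_DETERMINATION_RANGE_DEG
    return low <= angle <= high
```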
  • A flowchart showing the operation of the physique determination device 100 when the configuration example of the physique determination device 100 is as shown in FIG. 4 is the same as the flowchart shown in FIG. 3. However, in this case, the situation information acquisition unit 103 acquires the inclination information as the situation information in step ST3, and outputs the acquired situation information, here the inclination information, to the normal seating determination unit 104. In step ST4, the normal seating determination unit 104 determines whether or not the occupant's posture is in the normal seating state based on the inclination information.
  • As described above, when the situation information is the inclination information, the physique determination device 100 may include the inclination detection unit 1032 that detects the inclination of the occupant's skeletal line based on the information regarding the occupant's skeletal points extracted by the skeletal point extraction unit 102, and the normal seating determination unit 104 may determine whether or not the occupant's posture is in the normal seating state based on the inclination of the occupant's skeletal line detected by the inclination detection unit 1032.
  • In this case as well, the physique determination device 100 calculates the feature amount for physique determination based on a captured image taken while the occupant's posture is in the normal seating state, and determines the occupant's physique based on that feature amount for physique determination; therefore, it can prevent a decrease in the accuracy of determining the occupant's physique. That is, the physique determination device 100 can prevent the accuracy of determining the physique of the occupant from decreasing even if, for example, the occupant's posture collapses while the vehicle is running.
  • the situation information may be not only information about the situation of the occupant, but also information about the situation of the vehicle, for example. Note that in this case as well, even if a situation occurs in which only the occupant's face is not facing the front direction, this is not considered to be a posture collapse.
  • the physique determination device 100 may acquire information regarding the situation of the vehicle (hereinafter referred to as "vehicle information") as the situation information.
  • Vehicle information includes, for example, information regarding the steering angle of the vehicle, the vehicle speed, the opening/closing status of the vehicle doors, the ignition status of the vehicle, or the operation status of in-vehicle devices (e.g., a car navigation device or an interior lamp).
  • FIG. 5 is a diagram showing a configuration example of the physique determination device 100 in the first embodiment when vehicle information is used as situation information.
  • The configuration example of the physique determination device 100 shown in FIG. 5 differs from the configuration example of the physique determination device 100 shown in FIG. 2 in that the situation information acquisition unit 103 includes a vehicle information acquisition unit 1033.
  • Further, in the physique determination device 100 shown in FIG. 5, the specific operation of the normal seating determination unit 104 differs from the specific operation of the normal seating determination unit 104 in the physique determination device 100 shown in FIG. 2.
  • The specific operations of the image acquisition unit 101, the skeletal point extraction unit 102, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are the same as those already explained, so repeated explanation is omitted.
  • the vehicle information acquisition unit 1033 acquires vehicle information from various devices installed in the vehicle. For example, the vehicle information acquisition unit 1033 may acquire information regarding the steering angle status from the steering angle sensor. Further, for example, the vehicle information acquisition unit 1033 may acquire the vehicle speed from a vehicle speed sensor. Further, for example, the vehicle information acquisition unit 1033 may acquire information regarding the opening/closing status of the door from a sensor provided on the door. Further, for example, the vehicle information acquisition unit 1033 may acquire information regarding the ignition status from the ignition sensor. Further, for example, the vehicle information acquisition unit 1033 may acquire information regarding the operation status of the in-vehicle device from an in-vehicle device such as a car navigation device.
  • the situation information acquisition unit 103 acquires the vehicle information acquired by the vehicle information acquisition unit 1033 as situation information.
  • the situation information acquisition unit 103 outputs the acquired situation information, here vehicle information, to the normal seating determination unit 104.
  • Based on the vehicle information acquired by the vehicle information acquisition unit 1033, the normal seating determination unit 104 determines whether or not the occupant's posture is in the normal seating state. For example, in this case, conditions such as the following <Condition 4> to <Condition 8> are defined as the normal seating determination conditions.
  • <Condition 4> The steering angle of the vehicle is within a preset range (hereinafter referred to as the "steering angle determination range").
  • <Condition 5> The vehicle speed exceeds a preset range (hereinafter referred to as the "vehicle speed determination range").
  • <Condition 6> The doors of the vehicle are not open.
  • <Condition 7> The elapsed time after the ignition was turned on is within a preset time (hereinafter referred to as the "start determination time").
  • <Condition 8> The in-vehicle device is not being operated.
  • The steering angle determination range in <Condition 4> is set to, for example, a range of steering angles that cannot occur unless the occupant bends an arm.
  • The vehicle speed determination range in <Condition 5> is set to, for example, a vehicle speed assuming that the vehicle is traveling on an expressway.
  • Regarding <Condition 6>, for example, if a door of the vehicle is open, it is assumed that the occupant is getting in or out of the vehicle and is therefore not normally seated.
  • If the vehicle information satisfies the normal seating determination conditions, the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state. On the other hand, if the vehicle information does not satisfy the normal seating determination conditions, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
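  • The sketch below illustrates one possible way to evaluate <Condition 4> to <Condition 8> in code. The field names, the threshold values, and the rule for combining the individual conditions are all illustrative assumptions; the publication only states that the ranges and times are preset and does not specify how the conditions are combined.

```python
# Minimal sketch of a vehicle-information-based normal-seating check (<Condition 4> to <Condition 8>).
# All thresholds and the condition-combination rule are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, Iterable

@dataclass
class VehicleInfo:
    steering_angle_deg: float          # steering angle of the vehicle
    vehicle_speed_kmh: float           # vehicle speed
    any_door_open: bool                # opening/closing status of the doors
    sec_since_ignition_on: float       # elapsed time after the ignition was turned on
    in_vehicle_device_operated: bool   # e.g. car navigation device being operated

def evaluate_conditions(info: VehicleInfo,
                        steering_angle_range_deg: float = 30.0,    # "steering angle determination range"
                        vehicle_speed_range_kmh: float = 80.0,     # "vehicle speed determination range" (single threshold here)
                        start_determination_time_s: float = 10.0   # "start determination time"
                        ) -> Dict[str, bool]:
    return {
        "condition4": abs(info.steering_angle_deg) <= steering_angle_range_deg,
        "condition5": info.vehicle_speed_kmh > vehicle_speed_range_kmh,
        "condition6": not info.any_door_open,
        "condition7": info.sec_since_ignition_on <= start_determination_time_s,
        "condition8": not info.in_vehicle_device_operated,
    }

def is_normal_seating_by_vehicle_info(info: VehicleInfo,
                                      required: Iterable[str] = ("condition4", "condition6", "condition8")
                                      ) -> bool:
    # Which conditions must hold simultaneously is a design choice; here a configurable subset is ANDed.
    results = evaluate_conditions(info)
    return all(results[name] for name in required)

if __name__ == "__main__":
    info = VehicleInfo(steering_angle_deg=5.0, vehicle_speed_kmh=100.0,
                       any_door_open=False, sec_since_ignition_on=600.0,
                       in_vehicle_device_operated=False)
    print(is_normal_seating_by_vehicle_info(info))  # True with the default subset
```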
  • a flowchart showing the operation of the physique determination device 100 when the configuration example of the physique determination device 100 is as shown in FIG. 5 is the same as the flowchart shown in FIG. 3.
  • However, when the physique determination device 100 is configured as shown in FIG. 5, the situation information acquisition unit 103 acquires the vehicle information acquired by the vehicle information acquisition unit 1033 as the situation information in step ST3.
  • The situation information acquisition unit 103 then outputs the acquired situation information, here the vehicle information, to the normal seating determination unit 104.
  • In step ST4, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the vehicle information acquired by the vehicle information acquisition unit 1033 in step ST3.
  • As described above, when the situation information is the vehicle information, the physique determination device 100 may include the vehicle information acquisition unit 1033 that acquires the vehicle information, and the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state based on the vehicle information acquired by the vehicle information acquisition unit 1033.
  • In this case as well, the physique determination device 100 calculates the physique determination feature amount based on the captured image obtained when the occupant's posture is in the normal seating state, and determines the occupant's physique based on the physique determination feature amount. Therefore, it is possible to prevent a decrease in the accuracy of determining the occupant's physique. That is, the physique determination device 100 can prevent the accuracy of determining the occupant's physique from decreasing even if the occupant's posture collapses while the vehicle is running.
  • the physique determination device 100 may acquire not only face orientation information but also skeletal point information, tilt information, or vehicle information as situation information.
  • Devices other than the physique determination device 100 often have a function of detecting the direction of the occupant's face based on a captured image of the vehicle interior.
  • the occupant state detection device included in the DMS may have a function of detecting the direction of the occupant's face.
  • Therefore, by using the face orientation information as the situation information, the physique determination device 100 can acquire the face orientation information from a device external to the physique determination device 100 and does not need to have a function of detecting the face orientation itself.
  • Since the physique determination device 100 does not need to perform processing for detecting the occupant's face orientation, the processing load associated with determining the occupant's physique can be reduced.
  • the physique determination device 100 does not need to add a function for acquiring situation information even if the skeletal point information is used as situation information.
  • In general, detecting the positions of the eyes or the mouth based on a captured image can be done with higher accuracy than detecting the positions of skeletal points based on a captured image. Accordingly, determining whether the occupant is facing forward based on the direction of the occupant's face offers higher determination accuracy than determining whether the occupant is facing forward based on the occupant's skeletal points.
  • The physique determination device 100 can therefore determine whether the occupant's posture is in the normal seating state with higher accuracy when the face orientation information is used as the situation information than when the skeletal point information is used as the situation information.
  • the physique determination device 100 may acquire two or more of face orientation information, skeleton point information, tilt information, and vehicle information as situation information. That is, in the physique determination device 100, the situation information acquisition section 103 may include, for example, two or more of the face direction detection section 1031, the tilt detection section 1032, and the vehicle information acquisition section 1033.
  • In the description above, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the situation information and in accordance with the normal seating determination conditions.
  • However, this is only an example; the normal seating determination unit 104 may instead determine whether the occupant's posture is in the normal seating state based on the situation information and a machine learning model (hereinafter referred to as the "normal seating determination model").
  • The normal seating determination model is a machine learning model that receives the situation information as input and outputs information indicating whether or not the occupant's posture is in the normal seating state.
  • the normal seating determination model is generated in advance by an administrator or the like and stored in a location that can be referenced by the physique determination device 100.
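  • As one possible illustration of using such a model, the sketch below feeds a feature vector built from the situation information to a pre-trained binary classifier. The feature layout and the simple logistic-regression stand-in are assumptions made for illustration; the publication does not specify the model type or its inputs beyond the situation information.

```python
# Minimal sketch of applying a pre-trained "normal seating determination model".
# The logistic-regression stand-in, its weights, and the feature layout are illustrative assumptions.
import math
from typing import Sequence

class NormalSeatingModel:
    """Stand-in for a model prepared in advance and stored where the physique
    determination device can reference it."""
    def __init__(self, weights: Sequence[float], bias: float, threshold: float = 0.5):
        self.weights = list(weights)
        self.bias = bias
        self.threshold = threshold

    def predict(self, features: Sequence[float]) -> bool:
        # Logistic score over the situation-information feature vector.
        z = self.bias + sum(w * x for w, x in zip(self.weights, features))
        return 1.0 / (1.0 + math.exp(-z)) >= self.threshold

def situation_features(face_yaw_deg: float, skeletal_tilt_deg: float,
                       vehicle_speed_kmh: float) -> list:
    # Example feature vector mixing face orientation, tilt, and vehicle information.
    return [abs(face_yaw_deg), abs(skeletal_tilt_deg), vehicle_speed_kmh]

if __name__ == "__main__":
    model = NormalSeatingModel(weights=[-0.2, -0.3, 0.02], bias=3.0)
    print(model.predict(situation_features(5.0, 3.0, 60.0)))    # True: posture likely normal
    print(model.predict(situation_features(40.0, 35.0, 10.0)))  # False: posture likely collapsed
```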
  • Further, the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state by using both the normal seating determination conditions and the normal seating determination model together with the situation information. For example, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the situation information and in accordance with the normal seating determination conditions (first determination). The normal seating determination unit 104 also determines whether the occupant's posture is in the normal seating state based on the situation information and the normal seating determination model (second determination).
  • The normal seating determination unit 104 then determines that the occupant's posture is in the normal seating state when both the first determination and the second determination indicate the normal seating state, and determines that the occupant's posture is not in the normal seating state when either the first determination or the second determination indicates that it is not. Alternatively, for example, whether the occupant's posture is in the normal seating state may be determined by weighting the seating state determination result of the first determination and the seating state determination result of the second determination.
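  • A minimal sketch of combining the two determinations is shown below. The strict variant requires both results to agree on normal seating, as described above; the weights and the threshold in the weighted variant are illustrative assumptions, since the publication does not give concrete weighting values.

```python
# Minimal sketch of combining the rule-based first determination and the model-based second
# determination. The weights and threshold in the weighted variant are illustrative assumptions.
def combine_strict(first_determination: bool, second_determination: bool) -> bool:
    """Normal seating only when both the first and the second determination say so."""
    return first_determination and second_determination

def combine_weighted(first_determination: bool, second_determination: bool,
                     w_first: float = 0.4, w_second: float = 0.6,
                     threshold: float = 0.5) -> bool:
    """Weighted combination of the two seating-state determination results."""
    score = w_first * float(first_determination) + w_second * float(second_determination)
    return score >= threshold

if __name__ == "__main__":
    print(combine_strict(True, False))    # False: one result says not normally seated
    print(combine_weighted(True, False))  # False with these weights (0.4 < 0.5)
    print(combine_weighted(False, True))  # True with these weights (0.6 >= 0.5)
```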
  • In the first embodiment described above, the physique determination device 100 includes the feature amount leveling unit 106, but this is only an example.
  • The physique determination device 100 may be configured without the feature amount leveling unit 106.
  • In that case, the physique determination unit 107 may determine the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit 105.
  • In the operation of the physique determination device 100 described using the flowchart of FIG. 3, the physique determination device 100 can then omit the process of step ST6.
  • The physique determination unit 107 may determine the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit 105 in step ST5.
  • Similarly, in the first embodiment described above, the physique determination device 100 includes the posture attention unit 108, but this is only an example.
  • The physique determination device 100 may be configured without the posture attention unit 108. Further, the physique determination device 100 does not need to be connected to the output device 600. In this case, in the operation of the physique determination device 100 described using the flowchart of FIG. 3, the physique determination device 100 can omit the process of step ST8.
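  • To summarize the overall flow, the sketch below strings the units together following steps ST1 to ST8 of the flowchart of FIG. 3, with the feature amount leveling step (ST6) and the posture warning step (ST8) optional as described above. The function signatures are placeholders, the mapping of individual step numbers to units is inferred from the description above, and the choice to skip the physique determination and issue the warning when the occupant is not normally seated is an assumption made for illustration.

```python
# Minimal sketch of the overall processing flow of the physique determination device,
# loosely following steps ST1 to ST8 of the flowchart of FIG. 3. All callables are placeholders.
from typing import Callable, Optional

def run_physique_determination(
    acquire_image: Callable[[], object],                      # ST1: image acquisition unit 101
    extract_skeletal_points: Callable[[object], dict],        # ST2: skeletal point extraction unit 102
    acquire_situation_info: Callable[[], dict],               # ST3: situation information acquisition unit 103
    is_normal_seating: Callable[[dict, dict], bool],          # ST4: normal seating determination unit 104
    calc_features: Callable[[dict], dict],                    # ST5: feature amount calculation unit 105
    determine_physique: Callable[[dict], str],                # ST7: physique determination unit 107
    level_features: Optional[Callable[[dict], dict]] = None,  # ST6: feature amount leveling unit 106 (optional)
    warn_posture: Optional[Callable[[], None]] = None,        # ST8: posture attention unit 108 (optional)
) -> Optional[str]:
    image = acquire_image()                                      # ST1
    skeletal_points = extract_skeletal_points(image)             # ST2
    situation_info = acquire_situation_info()                    # ST3
    if not is_normal_seating(skeletal_points, situation_info):   # ST4
        if warn_posture is not None:
            warn_posture()                                       # ST8: prompt the occupant about the posture
        return None                                              # skip physique determination this time
    features = calc_features(skeletal_points)                    # ST5
    if level_features is not None:
        features = level_features(features)                      # ST6
    return determine_physique(features)                          # ST7
```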
  • the physique determination device 100 is connected to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500, but this is only an example.
  • The physique determination device 100 may be connected to only one of the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500, or may be connected to a device other than the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500 that performs various processes using the physique determination result.
  • For example, the physique determination device 100 may be connected to a display device mounted on the vehicle. The display device performs display according to the physique determination result output from the physique determination device 100. For example, if an infant is among the occupants, the display device displays an icon indicating that a child is in the vehicle.
  • a plurality of imaging devices 200 may be installed in a vehicle.
  • a plurality of imaging devices 200 may be installed so as to be able to image each passenger seated in each seat in the vehicle.
  • the physique determination device 100 acquires captured images from the plurality of imaging devices 200 and determines the physique of the occupant.
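  • A brief sketch of the multi-camera case is given below, iterating over one imaging device per seat. The per-seat capture functions and the per-occupant pipeline function are placeholders, not interfaces defined in the publication.

```python
# Minimal sketch of handling a plurality of imaging devices 200, one per seat.
# `capture` and `determine_physique_for_image` are placeholder callables for illustration.
from typing import Callable, Dict

def determine_physiques_per_seat(
    cameras: Dict[str, Callable[[], object]],                # seat name -> image capture function
    determine_physique_for_image: Callable[[object], str],   # runs the per-occupant determination pipeline
) -> Dict[str, str]:
    results = {}
    for seat, capture in cameras.items():
        image = capture()                                    # captured image of the occupant in this seat
        results[seat] = determine_physique_for_image(image)
    return results
```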
  • FIGS. 6A and 6B are diagrams illustrating an example of the hardware configuration of the physique determination device 100 according to the first embodiment.
  • In the first embodiment, the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are realized by a processing circuit 1001.
  • the physique determination device 100 includes a processing circuit 1001 for performing control to determine the physique of the vehicle occupant based on a captured image of the vehicle occupant.
  • the processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in memory as shown in FIG. 6B.
  • When the processing circuit 1001 is dedicated hardware, the processing circuit 1001 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • When the processing circuit is the processor 1004, the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in memory 1005.
  • The processor 1004 reads out and executes the program stored in the memory 1005, thereby realizing the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108.
  • the physique determination device 100 includes a memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1 to ST8 in FIG. 3 described above.
  • It can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108.
  • Here, the memory 1005 is, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • For example, the function of the image acquisition unit 101 may be realized by the processing circuit 1001 as dedicated hardware, and the functions of the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may be realized by the processor 1004 reading out and executing the program stored in the memory 1005.
  • The physique determination device 100 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the imaging device 200, the seatbelt control device 300, the airbag control device 400, the left-behind detection device 500, and the output device 600.
  • In the first embodiment described above, the physique determination device 100 is an in-vehicle device mounted on a vehicle, and the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are assumed to be included in the in-vehicle device.
  • However, the configuration is not limited to this; some of the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may be provided in the in-vehicle device while the others are provided in a server, or the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may all be provided in the server.
  • As described above, according to the first embodiment, the physique determination device 100 is configured to include: the image acquisition unit 101 that acquires a captured image of an occupant of a vehicle; the skeletal point extraction unit 102 that extracts skeletal points of the occupant indicating body parts of the occupant; the normal seating determination unit 104 that determines, based on the situation information regarding the situation of the vehicle or of the occupant of the vehicle, whether or not the occupant's posture is in the normal seating state; the feature amount calculation unit 105 that determines, based on the information regarding the occupant's skeletal points extracted by the skeletal point extraction unit 102 and the seating state determination result of the normal seating determination unit 104 as to whether or not the occupant's posture is in the normal seating state, whether or not the extracted skeletal points of the occupant are normal seating skeletal points extracted based on an image taken with the occupant normally seated, and, when the extracted skeletal points are determined to be the normal seating skeletal points, calculates the physique determination feature amount based on the information regarding the normal seating skeletal points; and the physique determination unit 107 that determines the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit 105.
  • Therefore, the physique determination device 100 calculates the physique determination feature amount based on the captured image obtained when the occupant's posture is in the normal seating state, and determines the occupant's physique based on the physique determination feature amount, so it is possible to prevent a decrease in the accuracy of determining the occupant's physique. That is, even if the occupant's posture collapses, the physique determination device 100 can prevent the accuracy of determining the occupant's physique from decreasing.
  • any component of the embodiments can be modified or any component of the embodiments can be omitted.
  • a physique determination device calculates a physique determination feature amount based on a captured image when the posture state of an occupant is in a normally seated state, and determines the physique of the occupant based on the physique determination feature amount. Therefore, it is possible to prevent a decrease in the accuracy of determining the occupant's physique.
  • 100 physique determination device, 101 image acquisition unit, 102 skeletal point extraction unit, 103 situation information acquisition unit, 1031 face orientation detection unit, 1032 tilt detection unit, 1033 vehicle information acquisition unit, 104 normal seating determination unit, 105 feature amount calculation unit, 106 feature amount leveling unit, 107 physique determination unit, 108 posture attention unit, 200 imaging device, 300 seatbelt control device, 400 airbag control device, 500 left-behind detection device, 600 output device, 1001 processing circuit, 1002 input interface device, 1003 output interface device, 1004 processor, 1005 memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Seats For Vehicles (AREA)
  • Image Processing (AREA)

Abstract

The present invention comprises: a skeletal point extraction unit (102) that extracts skeletal points of a vehicle occupant on the basis of a captured image of the vehicle occupant; a normal seating determination unit (104) that determines, on the basis of situation information relating to the situation of the vehicle or of the vehicle occupant, whether or not the posture of the vehicle occupant is in a normal seating state; a feature amount calculation unit (105) that determines whether or not the skeletal points of the vehicle occupant extracted by the skeletal point extraction unit (102) are normal seating skeletal points extracted on the basis of a captured image of the vehicle occupant in a normal seating state and, if the skeletal points of the vehicle occupant extracted by the skeletal point extraction unit (102) are determined to be the normal seating skeletal points, calculates a feature amount for physique determination on the basis of information relating to the normal seating skeletal points; and a physique determination unit (107) that determines the physique of the vehicle occupant on the basis of the feature amount for physique determination.
PCT/JP2022/030703 2022-08-12 2022-08-12 Dispositif de détermination de physique et procédé de détermination de physique WO2024034109A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/030703 WO2024034109A1 (fr) 2022-08-12 2022-08-12 Dispositif de détermination de physique et procédé de détermination de physique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/030703 WO2024034109A1 (fr) 2022-08-12 2022-08-12 Dispositif de détermination de physique et procédé de détermination de physique

Publications (1)

Publication Number Publication Date
WO2024034109A1 true WO2024034109A1 (fr) 2024-02-15

Family

ID=89851231

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/030703 WO2024034109A1 (fr) 2022-08-12 2022-08-12 Dispositif de détermination de physique et procédé de détermination de physique

Country Status (1)

Country Link
WO (1) WO2024034109A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10264751A (ja) * 1997-03-21 1998-10-06 Autoliv Japan Kk 助手席用エアバッグ装置
JPH10315906A (ja) * 1997-05-21 1998-12-02 Zexel Corp 乗員認識方法と乗員認識装置及びエアバック制御方法とエアバック装置
JP2008002838A (ja) * 2006-06-20 2008-01-10 Takata Corp 車両乗員検出システム、作動装置制御システム、車両
JP2013112220A (ja) * 2011-11-29 2013-06-10 Toyota Motor Corp 車両用乗員拘束装置
JP2021066276A (ja) * 2019-10-18 2021-04-30 株式会社デンソー 乗員体格判定装置
JP2021081836A (ja) * 2019-11-15 2021-05-27 アイシン精機株式会社 体格推定装置および姿勢推定装置

Similar Documents

Publication Publication Date Title
US10369926B2 (en) Driver state sensing system, driver state sensing method, and vehicle including the same
CN106956619A (zh) 用于车辆座椅监控的系统和方法
US9822576B2 (en) Method for operating an activatable locking device for a door and/or a window, securing device for a vehicle, vehicle
JP2006219009A (ja) 車両用機器自動調整システム
JP2009301367A (ja) 運転者状態推定装置
JP6187155B2 (ja) 注視対象物推定装置
JP2019189101A (ja) 乗員情報判定装置
CN111601736B (zh) 乘客检测装置、乘客检测系统以及乘客检测方法
JP7134364B2 (ja) 体格判定装置および体格判定方法
JP5498183B2 (ja) 行動検出装置
JP2021037216A (ja) 閉眼判定装置
JP4858516B2 (ja) 顔向き検出装置
WO2024034109A1 (fr) Dispositif de détermination de physique et procédé de détermination de physique
JP2018117740A (ja) 生体情報検出装置
JP7267467B2 (ja) 注意方向判定装置および注意方向判定方法
JP2005087284A (ja) 覚醒状態判定装置及び覚醒状態判定方法
JP7374373B2 (ja) 体格判定装置および体格判定方法
JP7320188B2 (ja) ドライバ異常姿勢検出装置
WO2023223442A1 (fr) Dispositif de détermination de physique et procédé de détermination de physique
JP7479575B2 (ja) 体格推定装置および体格推定方法
JP2020052869A (ja) 識別装置、識別方法および識別プログラム
JP4296963B2 (ja) 車載用健康診断装置および車載用健康診断方法
WO2023084738A1 (fr) Dispositif de détermination de physique et procédé de détermination de physique
WO2024079779A1 (fr) Dispositif de détermination d'état de passager, système de détermination d'état de passager, procédé de détermination d'état de passager et programme
JP7275409B2 (ja) 乗員状態判定装置および乗員状態判定方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22955024

Country of ref document: EP

Kind code of ref document: A1