WO2024034109A1 - Physique determination device and physique determination method - Google Patents

Physique determination device and physique determination method

Info

Publication number
WO2024034109A1
WO2024034109A1 (PCT/JP2022/030703)
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
physique
determination
unit
skeletal
Prior art date
Application number
PCT/JP2022/030703
Other languages
French (fr)
Japanese (ja)
Inventor
晃平 鎌田
祥平 髙原
直哉 馬場
大暉 市川
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2022/030703
Publication of WO2024034109A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01: Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/015: Electrical circuits for triggering passive safety arrangements including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/26: Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • the present disclosure relates to an occupant physique determination device and a physique determination method.
  • Patent Document 1 discloses an indoor monitoring device that, based on a captured image of the inside of a vehicle, estimates the occupant's posture from a physique estimation index including the coordinates of a plurality of feature points of the occupant extracted from the captured image by deep learning or the like, and estimates the physique of the occupant based on the estimated posture.
  • the present disclosure has been made in order to solve the above-mentioned problem, and an object of the present disclosure is to provide a physique determination device that prevents a decrease in the accuracy of determining the physique of an occupant even if the occupant's posture collapses.
  • the physique determination device includes: an image acquisition unit that acquires a captured image of an occupant of a vehicle; a skeletal point extraction unit that extracts skeletal points of the occupant, indicating parts of the occupant's body, based on the captured image acquired by the image acquisition unit; a normal seating determination unit that determines whether or not the occupant's posture is in a normal seating state based on situation information regarding the situation of the vehicle or the occupant; a feature amount calculation unit that determines, based on the determination result of the normal seating determination unit, whether or not the skeletal points of the occupant extracted by the skeletal point extraction unit are normal seating skeletal points, that is, skeletal points extracted based on an image captured while the occupant was in the normal seating state, and that, when the extracted skeletal points are determined to be normal seating skeletal points, calculates a feature amount for physique determination based on information regarding the normal seating skeletal points; and a physique determination unit that determines the physique of the occupant based on the feature amount for physique determination calculated by the feature amount calculation unit.
  • the physique determination device can prevent the accuracy of determining the physique of the occupant from decreasing even if the occupant's posture collapses.
  • FIG. 1 is a diagram illustrating a configuration example of a system in which the physique determination device according to the first embodiment is used.
  • FIG. 2 is a diagram illustrating a configuration example of the physique determination device according to the first embodiment.
  • FIG. 3 is a flowchart for explaining an example of the operation of the physique determination device according to the first embodiment.
  • FIG. 4 is a diagram illustrating a configuration example of the physique determination device in a case where inclination information is used as the situation information in the first embodiment.
  • FIG. 5 is a diagram illustrating a configuration example of the physique determination device in a case where vehicle information is used as the situation information in the first embodiment.
  • FIGS. 6A and 6B are diagrams illustrating an example of the hardware configuration of the physique determination device according to the first embodiment.
  • FIG. 1 is a diagram showing a configuration example of a system in which a physique determination device 100 according to the first embodiment is used.
  • the physique determination device 100 according to the first embodiment is connected to an imaging device 200, a seatbelt control device 300, an airbag control device 400, a left-behind detection device 500, and an output device 600.
  • the physique determination device 100 is mounted on a vehicle.
  • the seatbelt control device 300, the airbag control device 400, the left-behind detection device 500, and the output device 600 are mounted on the vehicle.
  • the imaging device 200 is, for example, a near-infrared camera or a visible light camera, and images the occupant of the vehicle.
  • the imaging device 200 may be shared with an imaging device included in a so-called "Driver Monitoring System (DMS)" that is installed in a vehicle to monitor the condition of a driver in the vehicle.
  • the imaging device 200 is installed to be able to image an area within the vehicle that includes at least an area where the upper body of a vehicle occupant should be present.
  • the range in which the upper body of the occupant in the vehicle should exist is, for example, a range corresponding to the space near the backrest of the seat and the front of the headrest.
  • the imaging device 200 is installed, for example, at the center of the instrument panel of the vehicle so that it can image the occupants of the front seats, that is, the driver in the driver's seat and the occupant in the front passenger seat (hereinafter referred to as the "passenger seat occupant"). Note that this is just an example; the imaging device 200 may be installed so as to image only the driver, or may be installed so as to image an occupant in the rear seat, for example.
  • the physique determination device 100 determines the physique of a vehicle occupant based on a captured image of the occupant taken by the imaging device 200. At this time, the physique determination device 100 determines whether or not the posture of the occupant is in the normal seating state, and determines the physique of the occupant based on a captured image taken with the occupant in the normal seating state.
  • "normal seating" means that the occupant is seated on the seat in a normal state with no posture collapse.
  • a state in which the occupant's face is not facing forward is also considered to be a posture collapse.
  • when the physique determination device 100 determines the physique of the vehicle occupant, it outputs the result of the determination (hereinafter referred to as the "physique determination result") to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500. Furthermore, the physique determination device 100 outputs, to the output device 600, information for outputting a message or the like urging the occupant to sit properly. Details of a configuration example of the physique determination device 100 will be described later using FIG. 2. In the following first embodiment, the occupant of the vehicle is also simply referred to as the "occupant."
  • the seat belt control device 300 controls the seat belt.
  • the seatbelt control device 300 has a seatbelt reminder function that outputs a warning in consideration of the occupant's physique based on the physique determination result output from the physique determination device 100.
  • the airbag control device 400 is, for example, an ECU (Electronic Control Unit) for an airbag system, and controls the airbag based on the physique determination result output from the physique determination device 100.
  • the left-behind detection device 500 detects whether an infant or the like has been left behind in the vehicle. For example, the left-behind detection device 500 determines whether an infant or the like has been left behind in the vehicle based on the physique determination result output from the physique determination device 100.
  • the output device 600 is, for example, a display device or an audio output device provided in a center cluster, a meter panel, or the like.
  • the output device 600 displays a message urging the occupant to sit properly, outputs the message in audio, or outputs a warning sound urging the occupant to sit properly.
  • FIG. 2 is a diagram showing a configuration example of the physique determination device 100 according to the first embodiment.
  • the physique determination device 100 includes an image acquisition unit 101, a skeletal point extraction unit 102, a situation information acquisition unit 103, a normal seating determination unit 104, a feature amount calculation unit 105, a feature amount leveling unit 106, a physique determination unit 107, and a posture warning unit 108.
  • the situation information acquisition unit 103 includes a face orientation detection unit 1031.
  • the image acquisition unit 101 acquires a captured image of a passenger captured by the imaging device 200.
  • the image acquisition unit 101 outputs the acquired captured image to the skeleton point extraction unit 102.
  • the skeletal point extraction unit 102 extracts skeletal points of the occupant, which indicate body parts of the occupant, based on the captured image acquired by the image acquisition unit 101. Specifically, the skeletal point extraction unit 102 extracts skeletal points of the occupant indicating joint points determined for each part of the occupant's body, based on the captured image acquired by the image acquisition unit 101. The joint points determined for each part of the occupant's body include, for example, the nose joint point, the neck joint point, the shoulder joint points, the elbow joint points, the waist joint points, the wrist joint points, the knee joint points, and the ankle joint points. These joint points are predefined.
  • the skeletal point extraction unit 102 may extract the skeletal points of the occupant by a known method, such as an image recognition technique or a technique using a trained model (hereinafter referred to as a "machine learning model").
  • a skeleton point is, for example, a point in a captured image, and is expressed by coordinates in the captured image.
  • the skeletal point extraction unit 102 detects the coordinates of a skeletal point of the occupant and which part of the body of the occupant the skeletal point indicates.
  • the skeletal point extraction unit 102 extracts skeletal points for each occupant.
  • an area corresponding to each seat (hereinafter referred to as a "seat corresponding area") is set in advance for each seat.
  • the seat corresponding area is set in advance according to the installation position of the imaging device 200 and the angle of view.
  • the skeletal point extraction unit 102 extracts skeletal points for each seat-corresponding area. For example, if a certain seat-corresponding area corresponds to the driver's seat, the skeletal point extraction unit 102 treats the skeletal points extracted in that seat-corresponding area as the skeletal points of the driver.
  • likewise, if a seat-corresponding area corresponds to the front passenger seat, the skeletal point extraction unit 102 treats the skeletal points extracted in that seat-corresponding area as the skeletal points of the passenger seat occupant. In this way, the skeletal point extraction unit 102 extracts skeletal points for each occupant by extracting skeletal points from each seat-corresponding area.
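The per-seat assignment described above can be sketched as follows. This is an illustrative sketch, not code from the patent: the rectangular pixel regions, the point format, and all names are hypothetical.

```python
# Hypothetical seat-corresponding areas: preset pixel rectangles
# (x0, y0, x1, y1) chosen according to the imaging device's installation
# position and angle of view.
SEAT_AREAS = {
    "driver_seat": (0, 0, 640, 720),
    "passenger_seat": (640, 0, 1280, 720),
}

def assign_points_to_seats(skeletal_points):
    """Group skeletal points by the seat-corresponding area they fall in.

    skeletal_points: list of (part_name, x, y) tuples in image coordinates.
    Returns a mapping {seat_name: {part_name: (x, y)}}.
    """
    per_seat = {seat: {} for seat in SEAT_AREAS}
    for part, x, y in skeletal_points:
        for seat, (x0, y0, x1, y1) in SEAT_AREAS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                per_seat[seat][part] = (x, y)
                break
    return per_seat

points = [("neck", 300, 200), ("right_shoulder", 260, 250), ("neck", 900, 210)]
grouped = assign_points_to_seats(points)
# points with x < 640 fall in the driver's area, the rest in the passenger's
```

In a real system the areas would be calibrated per vehicle and camera; out-of-frame or overlapping points would also need handling.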
  • the skeletal point extraction unit 102 outputs information regarding the extracted skeletal points (hereinafter referred to as "skeletal point information") to the situation information acquisition unit 103.
  • the skeletal point information includes information in which information indicating the occupant, information indicating the coordinates of the skeletal points of the occupant, and information indicating which part of the occupant's body each skeletal point indicates are associated with one another; it also includes the captured image acquired by the image acquisition unit 101.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • the situation information acquisition unit 103 acquires information regarding the situation of the vehicle or the occupant (hereinafter referred to as "situation information").
  • the situation information is, for example, information regarding the direction of the occupant's face.
  • the face orientation detection unit 1031 detects the passenger's face orientation based on the captured image acquired by the image acquisition unit 101.
  • the captured image acquired by the image acquisition unit 101 is included in the skeleton point information.
  • the face orientation detection unit 1031 may detect the occupant's face orientation by a known method, such as an image recognition technique or a technique using a machine learning model.
  • the direction of the occupant's face is expressed, for example, by an angle. Note that the face orientation detection unit 1031 detects the face orientation for each passenger.
  • the situation information acquisition unit 103 acquires information regarding the passenger's face direction detected by the face direction detection unit 1031 (hereinafter referred to as “face direction information”) as situation information.
  • the face orientation information includes information in which information indicating the occupant and information indicating the face orientation are associated with each other.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • the face direction detection unit 1031 may determine which seat the occupant is seated in based on the seat corresponding area.
  • the face orientation information may also include a captured image acquired by the image acquisition unit 101.
  • the situation information acquisition unit 103 outputs the acquired situation information, here face orientation information, to the normal seating determination unit 104.
  • the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the situation information acquired by the situation information acquisition unit 103. Specifically, here, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the face orientation information. Note that the normal seating determination unit 104 determines for each passenger whether or not the passenger's posture is in a normal seating state. The normal seating determination unit 104 can identify the facial orientation of each occupant from the facial orientation information.
  • the normal seating determination unit 104 determines whether or not the occupant's posture is in the normal seating state using predefined conditions (hereinafter referred to as "normal seating determination conditions").
  • the conditions for determining normal seating include, for example, conditions under which the occupant's posture is considered to be normal seating.
  • the normal seating determination conditions are generated in advance by an administrator or the like and stored in a location that can be referenced by the physique determination device 100.
  • for example, a condition such as <Condition 1> below is defined as a normal seating determination condition.
  • <Condition 1> The occupant's face is facing forward. If the occupant's face orientation is within a predetermined range relative to the front (referred to as the "face orientation determination range"), the occupant's face is assumed to be facing forward. Note that the face orientation determination range in <Condition 1> is set in advance as an appropriate range within which the face can be assumed to be facing forward.
  • based on the face orientation information, the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state when the occupant's face orientation satisfies the normal seating determination conditions. On the other hand, if the occupant's face orientation does not satisfy the normal seating determination conditions, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
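The check of <Condition 1> can be sketched as below. This is an illustrative sketch, not from the patent: the yaw/pitch representation of face orientation and the 20-degree half-width of the face orientation determination range are assumptions.

```python
# Hypothetical half-width (degrees) of the face orientation determination range.
FACE_ORIENTATION_RANGE_DEG = 20.0

def satisfies_condition_1(yaw_deg, pitch_deg):
    """Return True when the occupant's face is regarded as facing forward,
    i.e. the face orientation is within the determination range of the front."""
    return (abs(yaw_deg) <= FACE_ORIENTATION_RANGE_DEG
            and abs(pitch_deg) <= FACE_ORIENTATION_RANGE_DEG)

# a slightly turned face still counts as forward-facing ...
print(satisfies_condition_1(12.0, -5.0))  # True
# ... while looking far to the side does not
print(satisfies_condition_1(45.0, 0.0))   # False
```

Narrowing or widening this range trades determination frequency against determination accuracy, as discussed for the leveling methods below.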
  • the normal seating determination unit 104 outputs the result of determining whether or not the occupant's posture is in the normal seating state (hereinafter referred to as the "seating state determination result") to the feature amount calculation unit 105.
  • the normal seating determination unit 104 outputs the skeletal point information output from the skeletal point extraction unit 102 to the feature amount calculation unit 105 together with the seating state determination result. Further, the normal seating determination unit 104 outputs the seating state determination result to the posture warning unit 108.
  • based on the skeletal point information output from the skeletal point extraction unit 102 and the seating state determination result of the normal seating determination unit 104 as to whether or not the occupant's posture is in the normal seating state, the feature amount calculation unit 105 determines whether or not the skeletal points of the occupant extracted by the skeletal point extraction unit 102 are skeletal points extracted based on an image taken with the occupant in the normal seating state (hereinafter referred to as "normal seating skeletal points").
  • when the seating state determination result indicates that the occupant's posture is in the normal seating state, the feature amount calculation unit 105 determines that the skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points. That is, the feature amount calculation unit 105 treats the skeletal point information output from the skeletal point extraction unit 102 as normal seating skeletal point information. Note that in the first embodiment the feature amount calculation unit 105 acquires the skeletal point information output from the skeletal point extraction unit 102 via the normal seating determination unit 104, but this is only an example. For example, the feature amount calculation unit 105 may acquire the skeletal point information directly from the skeletal point extraction unit 102; in that case, in FIG. 2, an arrow would be drawn from the skeletal point extraction unit 102 toward the feature amount calculation unit 105.
  • when the feature amount calculation unit 105 determines that the skeletal points of the occupant extracted by the skeletal point extraction unit 102 are normal seating skeletal points, the feature amount calculation unit 105 calculates, based on the skeletal point information of the normal seating skeletal points (hereinafter referred to as "normal seating skeletal point information"), a feature amount used for determining the occupant's physique (hereinafter referred to as the "feature amount for physique determination").
  • the feature amount calculation unit 105 calculates a feature amount for physique determination for each occupant.
  • the feature amount calculation unit 105 can identify the skeletal points of each occupant from the normal seating skeletal point information. Note that if the feature amount calculation unit 105 does not determine that the occupant's skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points, the feature amount calculation unit 105 does not calculate the feature amount for physique determination.
  • the feature amount calculation unit 105 calculates, based on the normal seating skeletal point information, the feature amount for physique determination from the length, in the captured image, of a line segment connecting two of the occupant's skeletal points.
  • the feature amount calculation unit 105 calculates the arm length, shoulder width, and neck length of the occupant in the captured image as the feature amount for physique determination based on the normal sitting skeletal point information.
  • the feature amount calculation unit 105 may, for example, set the length of a line segment connecting a skeletal point indicating a shoulder and a skeletal point indicating an elbow as the arm length of the occupant to be used as the feature amount for physique determination.
  • the feature amount calculation unit 105 may, for example, set the length of the line segment connecting the skeletal points indicating both shoulders, the length of the line segment connecting the skeletal point indicating the neck and the skeletal point indicating the right shoulder, or the length of the line segment connecting the skeletal point indicating the neck and the skeletal point indicating the left shoulder, as the shoulder width to be used as the feature amount for physique determination.
  • the feature amount calculation unit 105 may, for example, set the length of the line segment connecting the skeletal point indicating the nose and the skeletal point indicating the neck as the neck length to be used as the feature amount for physique determination.
  • the feature amount calculation unit 105 does not need to calculate all of the arm length, shoulder width, and neck length of the occupant as feature amounts for physique determination; at least one of the occupant's arm length, shoulder width, and neck length may be calculated as the feature amount for physique determination. Further, the feature amount calculation unit 105 may use, as the feature amount for physique determination, the length of a line segment connecting two skeletal points other than those giving the arm length, shoulder width, or neck length.
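The line-segment feature amounts above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the point names and pixel coordinates are hypothetical, and the right side is chosen arbitrarily for the arm length.

```python
import math

def distance(p, q):
    """Length, in the captured image, of the segment connecting two skeletal points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def physique_features(pts):
    """pts maps body-part names to (x, y) image coordinates."""
    return {
        # shoulder width: segment connecting the skeletal points of both shoulders
        "shoulder_width": distance(pts["left_shoulder"], pts["right_shoulder"]),
        # arm length: segment connecting a shoulder point and an elbow point
        "arm_length": distance(pts["right_shoulder"], pts["right_elbow"]),
        # neck length: segment connecting the nose point and the neck point
        "neck_length": distance(pts["nose"], pts["neck"]),
    }

pts = {"left_shoulder": (100, 200), "right_shoulder": (180, 200),
       "right_elbow": (180, 260), "nose": (140, 120), "neck": (140, 180)}
feats = physique_features(pts)
# feats == {"shoulder_width": 80.0, "arm_length": 60.0, "neck_length": 60.0}
```

Because these lengths are measured in image pixels, a downstream classifier would have to account for the camera geometry or be trained on images from the same installation.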
  • the feature amount calculation unit 105 outputs information regarding the calculated feature amount for physique determination (hereinafter referred to as "feature amount information") to the feature amount leveling unit 106.
  • in the feature amount information, information indicating the occupant is associated with the feature amount for physique determination.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • the feature amount leveling unit 106 levels a plurality of feature amounts for physique determination calculated by the feature amount calculation unit 105 over a preset past period (hereinafter referred to as the "leveling period").
  • the leveling period can be set to an appropriate length.
  • the leveling period is set by an administrator or the like, and is stored in a location that can be referenced by the feature amount leveling unit 106.
  • the feature amount leveling unit 106 stores, in chronological order, the feature amount information output from the feature amount calculation unit 105 in association with its acquisition date and time, in a location that the feature amount leveling unit 106 can refer to.
  • the feature amount leveling unit 106 may acquire the plurality of feature amounts for physique determination for the leveling period from the stored feature amount information. Note that when feature amounts for physique determination covering the entire leveling period have not yet been stored, such as immediately after the vehicle starts running, the feature amount leveling unit 106 may, for example, level only the feature amounts for physique determination stored so far.
  • the stored feature amount information is deleted, for example, when the power of the vehicle is turned on.
  • the feature amount leveling unit 106 levels the plurality of feature amounts for physique determination for the leveling period by, for example, averaging them, taking their median value, or taking the value corresponding to a set percentile. That is, the feature amount leveling unit 106 calculates, for example, the average value of the plurality of feature amounts for physique determination for the leveling period, their median value, or the value corresponding to a set percentile of them, as the feature amount for physique determination after leveling (hereinafter referred to as the "leveled feature amount for physique determination").
  • the feature amount leveling unit 106 calculates a leveled feature amount for physique determination for each occupant.
  • the feature amount leveling unit 106 can specify the feature amount for determining the physique of each occupant from the feature amount information.
  • when the feature amount calculation unit 105 calculates a plurality of types of feature amounts for physique determination, such as the occupant's arm length, shoulder width, and neck length, the feature amount leveling unit 106 levels each type of feature amount separately.
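The leveling step can be sketched as below. This is an illustrative sketch, not the patent's implementation; the method names and the example percentile are assumptions.

```python
import statistics

def level_feature(values, method="mean", percentile=75):
    """Level the per-frame feature amounts for one feature type into one value."""
    if method == "mean":
        return statistics.mean(values)
    if method == "median":
        return statistics.median(values)
    if method == "percentile":
        # quantiles(n=100) returns the 1st..99th percentile cut points
        return statistics.quantiles(values, n=100)[percentile - 1]
    raise ValueError(f"unknown leveling method: {method}")

shoulder_widths = [78.0, 80.0, 82.0, 79.0, 81.0]  # one value per captured frame
print(level_feature(shoulder_widths, "mean"))    # 80.0
print(level_feature(shoulder_widths, "median"))  # 80.0
```

The mean converges quickly from few samples but is sensitive to outliers, whereas the median and percentile values are robust to outliers but benefit from more samples, which matches the trade-off discussed next.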
  • by leveling, the physique determination device 100 can reduce the influence of variations in individual feature amounts on the determination of the occupant's physique using the feature amount for physique determination (specifically, the leveled feature amount for physique determination).
  • the physique determination unit 107 performs the determination of the occupant's physique. Details of the physique determination unit 107 will be described later.
  • when the feature amount leveling unit 106 levels the plurality of feature amounts for physique determination by averaging them over the leveling period, an accurate value, in other words, a value close to the feature amount corresponding to the occupant's actual physique, can be obtained as the leveled feature amount for physique determination even if the number of feature amounts is small. Therefore, in this case, the administrator or the like can, for example, narrow the face orientation determination range defined in the normal seating determination conditions that the normal seating determination unit 104 uses to determine whether or not the occupant's posture is in the normal seating state.
  • narrowing the range reduces the frequency with which the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state; however, even if only a small number of feature amounts for physique determination are obtained while the occupant's posture is determined to be in the normal seating state, an accurate leveled feature amount for physique determination can be calculated from that small number of feature amounts. As a result, the physique determination device 100 can prevent a decrease in the accuracy of determining the occupant's physique using the leveled feature amount for physique determination.
  • on the other hand, when the feature amount leveling unit 106 levels the feature amounts by taking the median value of the plurality of feature amounts for physique determination for the leveling period, or by taking the value corresponding to a set percentile, an accurate value, in other words, a value close to the feature amount corresponding to the occupant's actual physique, can be obtained as the leveled feature amount for physique determination provided the number of feature amounts is large. Therefore, in this case, the administrator or the like can, for example, widen the face orientation determination range defined in the normal seating determination conditions used by the normal seating determination unit 104.
  • widening the range makes it easier for the normal seating determination unit 104 to determine that the occupant's posture is in the normal seating state. Although this may reduce the accuracy of the normal seating determination itself, the feature amount leveling unit 106 can subsequently calculate an accurate leveled feature amount for physique determination from the many feature amounts obtained while the occupant's posture was determined to be in the normal seating state. As a result, the physique determination device 100 can prevent a decrease in the accuracy of determining the occupant's physique using the leveled feature amount for physique determination.
  • the feature amount leveling unit 106 outputs information regarding the leveled feature amount for physique determination (hereinafter referred to as "leveled feature amount information") to the physique determination unit 107.
  • in the leveled feature amount information, information indicating the occupant is associated with the leveled feature amount for physique determination for each occupant.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • the physique determination unit 107 determines the physique of the occupant based on the leveled feature amount for physique determination calculated by the feature amount leveling unit 106.
  • the physique determination unit 107 determines the physique of each occupant.
  • the physique determination unit 107 can identify the leveled physique determination feature of each occupant from the leveled feature information.
  • The physique determination unit 107 may determine the physique of the occupant by a known method using, for example, an image recognition technique or a technique using a trained model (hereinafter referred to as a "machine learning model").
  • The physique determination unit 107 determines the physique of the occupant, such as "infant", "small", "normal", or "large", using a known method. Note that this is just an example, and the definitions of the physique classes determined by the physique determination unit 107 can be set as appropriate.
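As one illustration of this classification step, a leveled feature value could be mapped to the physique classes named above with simple thresholds; the feature (a shoulder-width value in pixels) and every threshold here are hypothetical placeholders, and a real system would instead calibrate them or use a trained model as the text notes:

```python
def classify_physique(shoulder_width_px):
    """Map a leveled physique-determination feature value to a physique
    class. The feature choice and thresholds are illustrative only."""
    if shoulder_width_px < 60:
        return "infant"
    if shoulder_width_px < 90:
        return "small"
    if shoulder_width_px < 120:
        return "normal"
    return "large"
```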
  • the physique determination unit 107 outputs the physique determination result to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.
  • information indicating the occupant is associated with information indicating the occupant's physique.
  • the information indicating the occupant is, for example, information indicating the seat where the occupant is seated.
  • If the occupant's posture, as determined by the normal seating determination unit 104, has not been in the normal seating state for a preset period of time (hereinafter referred to as the "posture abnormality determination time"), the posture warning unit 108 outputs information urging the occupant to adopt a normal seating posture (hereinafter referred to as "posture caution information") to the output device 600. Note that the posture warning unit 108 can determine, based on the seating state determination result output from the normal seating determination unit 104, that the normal seating determination unit 104 has determined that the occupant's posture is not the normal seating state.
  • The posture warning unit 108 associates each seating state determination result output from the normal seating determination unit 104 with its acquisition date and time, and stores them in chronological order in a location that the posture warning unit 108 can refer to.
  • the posture warning unit 108 can determine whether or not the occupant's posture is not the normal seating state for the posture abnormality determination period. Note that the seating state determination result stored by the posture attention unit 108 is deleted, for example, when the vehicle is powered on. If the posture warning unit 108 determines that the passenger's posture is not in the normal seating state for the posture abnormality determination period, the posture warning unit 108 outputs posture caution information to the output device 600.
  • Based on the posture caution information, the output device 600, for example, displays a message urging the occupant to sit properly, outputs such a message as audio, or outputs a warning sound prompting the occupant to sit properly.
  • the posture warning unit 108 outputs posture caution information and urges the occupant to sit properly, so that the physique determination device 100 can make the occupant maintain a correct posture. As a result, the physique determining device 100 can easily determine that the occupant's posture is in a normal seating state.
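The posture-warning logic above (a warning only after the non-normal state persists for the posture abnormality determination time) can be sketched as follows; the data layout, a chronological list of (timestamp, result) pairs, follows the stored time series described above, while the function name is an assumption:

```python
def needs_posture_caution(history, abnormality_period_s):
    """history: chronological list of (timestamp_s, is_normal_seating)
    pairs. Returns True when the latest run of non-normal-seating
    determinations has lasted at least abnormality_period_s seconds."""
    run_start = None
    for timestamp, is_normal in history:
        if is_normal:
            run_start = None        # a normal-seating frame breaks the run
        elif run_start is None:
            run_start = timestamp   # a non-normal run begins here
    if run_start is None:
        return False                # currently seated normally (or no data)
    latest_timestamp = history[-1][0]
    return latest_timestamp - run_start >= abnormality_period_s
```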
  • FIG. 3 is a flowchart for explaining an example of the operation of the physique determination device 100 according to the first embodiment.
  • the physique determination device 100 periodically repeats the operation shown in the flowchart of FIG. 3 while the vehicle is running.
  • the physique determining device 100 may repeat the operation shown in the flowchart of FIG. 3 until the vehicle is powered off.
  • the image acquisition unit 101 acquires a captured image of a passenger captured by the imaging device 200 (step ST1).
  • the image acquisition unit 101 outputs the acquired captured image to the skeleton point extraction unit 102.
  • the skeletal point extraction unit 102 extracts the skeletal points of the occupant, which indicate the body parts of the occupant, based on the captured image acquired by the image acquisition unit 101 in step ST1 (step ST2). Skeleton point extraction section 102 outputs skeleton point information to situation information acquisition section 103.
  • the situation information acquisition unit 103 acquires situation information when the skeleton point extraction unit 102 extracts the skeleton points of the occupant in step ST2 (step ST3).
  • the face orientation detection unit 1031 detects the passenger's face orientation based on the captured image acquired by the image acquisition unit 101 in step ST1.
  • the situation information acquisition unit 103 acquires face direction information regarding the passenger's face direction detected by the face direction detection unit 1031 as situation information.
  • the situation information acquisition unit 103 outputs the acquired situation information, here face orientation information, to the normal seating determination unit 104.
  • the normal seating determination unit 104 determines whether the posture of the occupant is normal seating based on the situation information acquired by the situation information acquisition unit 103 in step ST3 (step ST4). Specifically, here, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the face orientation information. The normal seating determination unit 104 outputs the seating state determination result to the feature amount calculation unit 105. At this time, normal seating determination section 104 outputs the skeleton point information output from skeleton point extraction section 102 to feature amount calculation section 105 together with the seating state determination result. Further, the normal seating determination unit 104 outputs the seating state determination result to the posture attention unit 108 .
  • The feature value calculation unit 105 determines whether the skeletal points of the occupant extracted by the skeletal point extraction unit 102 in step ST2 are regular seating skeletal points, based on the skeletal point information output from the skeletal point extraction unit 102 in step ST2 and the seating state determination result obtained by the normal seating determination unit 104 in step ST4. Then, when it determines that the occupant's skeletal points extracted in step ST2 are regular seating skeletal points, the feature value calculation unit 105 calculates the physique determination feature value based on the regular seating skeletal point information of those points (step ST5). The feature value calculation unit 105 outputs the feature value information to the feature value leveling unit 106.
  • When it determines that the occupant's skeletal points extracted in step ST2 are not regular seating skeletal points, the feature value calculation unit 105 does not calculate the physique determination feature value, and the physique determination device 100 ends the processing shown in the flowchart of FIG. 3.
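Step ST5 computes a physique determination feature value from the regular seating skeletal points. As a minimal sketch, one plausible feature is the image-plane distance between the two shoulder points; the part names and the choice of feature are assumptions for illustration, not the patent's definition:

```python
import math

def physique_feature(skeleton):
    """skeleton maps body-part names to (x, y) pixel coordinates taken
    from regular seating skeletal points. Returns the shoulder-to-
    shoulder pixel distance as an example feature value."""
    left_x, left_y = skeleton["left_shoulder"]
    right_x, right_y = skeleton["right_shoulder"]
    return math.hypot(right_x - left_x, right_y - left_y)
```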
  • The feature value leveling unit 106 levels the plurality of physique determination feature values for the past leveling period, in other words, the plurality of physique determination feature values calculated by the feature value calculation unit 105, based on the feature value information output in step ST5 (step ST6).
  • the feature leveling unit 106 outputs the leveled feature information to the physique determining unit 107.
  • the physique determination section 107 determines the physique of the occupant based on the leveled physique determination feature amount leveled by the feature leveling section 106 in step ST6 (step ST7).
  • the physique determination unit 107 outputs the physique determination result to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.
  • When the state of the occupant's posture determined by the normal seating determination unit 104 in step ST4 has not been the normal seating state continuously for the posture abnormality determination period, the posture warning unit 108 outputs the posture caution information to the output device 600 (step ST8).
  • As described above, the physique determination device 100 extracts the skeletal points of the vehicle occupant based on a captured image of the occupant. Furthermore, the physique determination device 100 determines whether the occupant's posture is in the normal seating state based on the situation information, in this case the face orientation information. Based on the skeletal point information and the determination result as to whether the occupant's posture is the normal seating state, the physique determination device 100 determines whether the extracted skeletal points of the occupant are regular seating skeletal points, and when they are determined to be regular seating skeletal points, calculates the physique determination feature value based on the regular seating skeletal point information. Then, the physique determination device 100 determines the occupant's physique based on the calculated physique determination feature value.
  • If the occupant's posture is disrupted, the skeletal points of the occupant extracted from the captured image tend to include errors, or the errors become large.
  • If the extracted skeletal points tend to include errors, or the errors are large, the feature values calculated based on those skeletal points also tend to include errors, or their errors become large.
  • Consequently, the occupant's physique determined based on those feature values is likely to include an error, or the error becomes large.
  • As a result, the accuracy of the physique determination decreases.
  • When the occupant is properly seated, it is assumed that the positions of the skeletal points, which are the feature points of the occupant, fall within a certain range on the captured image. Therefore, when the occupant is properly seated, errors are unlikely to occur in the occupant's skeletal points extracted from the captured image and in the feature values calculated from those skeletal points; as a result, errors in the occupant's physique determined based on the feature values are also less likely to occur. Even if an error occurs, its influence on the physique determination is not large.
  • the occupant's physique determination result based on the captured image is used, for example, in functions such as seatbelt control, airbag control, and abandonment detection.
  • Functions such as seatbelt control, airbag control, and left-behind detection are safety-related functions, and therefore the physique determination results used for these functions are required to have high accuracy. Therefore, a less accurate result of determining the occupant's physique, obtained while the occupant is seated in a crooked position, cannot be used for functions such as seatbelt control, airbag control, or left-behind detection.
  • If the physique determination device provides, for functions such as seatbelt control, airbag control, or left-behind detection, a physique determination result obtained while the occupant is determined to be sitting in a crooked position in the seat, there is a risk of erroneous control or erroneous detection in those functions.
  • In contrast, the physique determination device 100 determines whether the occupant's posture is in the normal seating state and, when the posture is determined to be in the normal seating state, uses the feature value based on the skeletal points extracted from the captured image as the physique determination feature value and determines the occupant's physique based on that physique determination feature value.
  • the physique determination device 100 calculates a physique determination feature amount based on a captured image when the posture state of the occupant is in a normal seating state, and determines the occupant's physique based on the physique determination feature amount. , it is possible to prevent a decrease in the accuracy of determining the occupant's physique.
  • the physique determining device 100 can prevent the accuracy of determining the physique of the occupant from decreasing, even if the occupant's posture collapses while the vehicle is running.
  • The physique determination device 100 provides an occupant physique determination result whose accuracy is prevented from deteriorating, and can therefore provide a physique determination result usable by functions that require high accuracy, such as seatbelt control, airbag control, and left-behind detection.
  • the situation information acquisition unit 103 includes the face orientation detection unit 1031, and the face orientation detection unit 1031 detects the captured image based on the captured image acquired by the image acquisition unit 101. The direction of the passenger's face was detected.
  • The situation information acquisition unit 103 may acquire the occupant's face orientation information from a device provided outside the physique determination device 100, such as an occupant state detection device included in a DMS, via a location that the physique determination device 100 can refer to.
  • the imaging device 200 is shared with the DMS.
  • the situation information acquisition section 103 can be configured without the face orientation detection section 1031.
  • the situation information acquired by the situation information acquisition unit 103 is information regarding the passenger's facial orientation, but this is only an example.
  • the physique determination device 100 may use the situation information acquired by the situation information acquisition unit 103 as skeletal point information, for example.
  • the normal seating determination section 104 may have the function of the situation information acquisition section 103. That is, it is not essential for the physique determination device 100 as shown in FIG. 1 to include the situation information acquisition unit 103.
  • the skeleton point extraction section 102 outputs skeleton point information to the normal seating determination section 104.
  • the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the skeleton point information output from the skeleton point extraction unit 102.
  • a condition such as ⁇ Condition 2> below is defined as the condition for determining normal seating.
  • ⁇ Condition 2> On the captured image, the position of the skeleton point indicating the occupant's elbow is not located within a preset range (hereinafter referred to as "elbow position determination range").
  • the elbow position determination range in ⁇ Condition 2> is set in advance, for example, to a predetermined range including the armrest or a predetermined range near the door of the vehicle.
  • The normal seating determination unit 104 determines that the occupant's posture is the normal seating state when the position of the skeletal point indicating the occupant's elbow satisfies the condition for determining normal seating. On the other hand, when the position of the skeletal point indicating the occupant's elbow does not satisfy the condition for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is not the normal seating state. For example, if the position of the skeletal point indicating the occupant's elbow is located within the elbow position determination range, the normal seating determination unit 104 determines that the occupant's posture is not the normal seating state.
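The elbow check of ⟨Condition 2> can be sketched as a point-in-rectangle test; representing the elbow position determination range as an axis-aligned rectangle is an assumption:

```python
def satisfies_condition2(elbow_point, elbow_range):
    """<Condition 2>: the skeletal point indicating the occupant's elbow
    must NOT lie inside the elbow position determination range.
    elbow_point: (x, y) in image coordinates.
    elbow_range: (x_min, y_min, x_max, y_max), e.g. a region covering
    the armrest or the area near the vehicle door."""
    x, y = elbow_point
    x_min, y_min, x_max, y_max = elbow_range
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    return not inside
```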
  • a flowchart showing the operation of the physique determination device 100 when the situation information is skeletal point information is the same as the flowchart shown in FIG.
  • the normal seating determination unit 104 acquires the skeleton point information output by the skeleton point extraction unit 102 in step ST2 as situation information.
  • the normal seating determination unit 104 determines whether or not the occupant's posture is in a normal seating state based on the skeleton point information acquired in step ST3.
  • As described above, when the situation information is the occupant's skeletal point information, the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state based on the occupant's skeletal point information extracted by the skeletal point extraction unit 102.
  • the physique determination device 100 calculates the physique determination feature amount based on the captured image when the posture state of the occupant is the normal seating state, and calculates the physique determination feature amount of the occupant based on the physique determination feature amount. Therefore, it is possible to prevent a decrease in the accuracy of determining the occupant's physique. That is, the physique determining device 100 can prevent the accuracy of determining the physique of the occupant from decreasing, even if the occupant's posture collapses while the vehicle is running.
  • the physique determination device 100 may obtain, for example, information regarding the inclination of a skeletal line connecting the skeletal points of the occupant (hereinafter referred to as "inclination information") as the situation information.
  • FIG. 4 is a diagram illustrating a configuration example of the physique determination device 100 in the first embodiment when tilt information is used as situation information.
  • The configuration example of the physique determination device 100 shown in FIG. 4 differs from that shown in FIG. 1 in that the situation information acquisition unit 103 includes the inclination detection unit 1032. Further, in the physique determination device 100 shown in FIG. 4, the specific operation of the normal seating determination unit 104 differs from that of the normal seating determination unit 104 in the physique determination device 100 shown in FIG. 1.
  • The specific operations of the image acquisition unit 101, the skeletal point extraction unit 102, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture warning unit 108 are the same as those already explained, so repeated explanation will be omitted.
  • the inclination detection unit 1032 detects the inclination of a skeletal line, which is a line segment connecting a plurality of skeletal points of the occupant, based on the skeletal point information output from the skeletal point extraction unit 102. For example, the inclination detection unit 1032 detects the inclination of a skeletal line of the occupant's shoulders that connects skeletal points indicating both shoulders of the occupant. Note that this is just an example, and the inclination detection unit 1032 can detect the inclination of an appropriate skeletal line, such as the inclination of a skeletal line indicating a torso (for example, a skeletal line indicating sitting height).
  • the fact that the occupant's posture is disrupted is likely to be reflected in the inclination of the skeletal line of the occupant's shoulders. Therefore, when the inclination detection unit 1032 detects the inclination of the shoulder skeleton line of the occupant, it is possible to reduce the erroneous determination by the normal seating determination unit 104 as to whether or not the occupant's posture is in the normal seating state. can.
  • the situation information acquisition unit 103 acquires the tilt information as situation information.
  • the inclination information includes information in which information indicating the occupant is associated with information indicating the inclination of the skeleton line.
  • the tilt information may also include a captured image acquired by the image acquisition unit 101.
  • the situation information acquisition unit 103 outputs the acquired situation information, here tilt information, to the normal seating determination unit 104.
  • a condition such as ⁇ Condition 3> below is defined as the condition for determining normal seating.
  • ⁇ Condition 3> The inclination of the occupant's skeletal line must be within a preset range (hereinafter referred to as the "inclination determination range"). Note that the range for determining inclination in ⁇ Condition 3> is set in advance as a predetermined range based on the inclination of the skeletal line of the occupant, which is assumed to be facing forward, for example.
  • the normal seating determination unit 104 determines that the posture of the occupant is normal seating if the inclination of the skeletal line of the occupant satisfies the conditions for determining normal seating. On the other hand, based on the inclination information, if the inclination of the occupant's skeleton line does not satisfy the condition for determining normal seating, the normal seating determination unit 104 determines that the posture of the occupant is not the normal seating state.
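The shoulder-line inclination and the ⟨Condition 3> check can be sketched as follows; the ±10-degree band standing in for the inclination determination range is a placeholder value, and modeling the range as symmetric about the horizontal is an assumption:

```python
import math

def shoulder_inclination_deg(left_shoulder, right_shoulder):
    """Inclination, in degrees from horizontal, of the skeletal line
    connecting the two shoulder skeletal points ((x, y) in pixels)."""
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    return math.degrees(math.atan2(dy, dx))

def satisfies_condition3(left_shoulder, right_shoulder, max_abs_deg=10.0):
    """<Condition 3>: the inclination of the occupant's skeletal line
    must fall within the inclination determination range."""
    return abs(shoulder_inclination_deg(left_shoulder, right_shoulder)) <= max_abs_deg
```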
  • a flowchart showing the operation of the physique determination device 100 when the configuration example of the physique determination device 100 is as shown in FIG. 4 is the same as the flowchart shown in FIG. However, if the configuration example of the physique determination device 100 is as shown in FIG. 4, the situation information acquisition unit 103 acquires the inclination information as the situation information in step ST3. Then, the situation information acquisition unit 103 outputs the acquired situation information, here the tilt information, to the normal seating determination unit 104. In step ST4, the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the inclination information.
  • the situation information is tilt information
  • the physique determination device 100 includes the tilt detection unit 1032 that detects the tilt of the occupant's skeletal line based on the information regarding the occupant's skeletal points extracted by the skeletal point extraction unit 102, and The seating determination unit 104 may determine whether the occupant is in a normal seating position based on the inclination of the occupant's skeletal line detected by the inclination detection unit 1032.
  • the physique determination device 100 calculates the physique determination feature amount based on the captured image when the posture state of the occupant is the normal seating state, and calculates the physique determination feature amount of the occupant based on the physique determination feature amount. Therefore, it is possible to prevent a decrease in the accuracy of determining the occupant's physique. That is, the physique determining device 100 can prevent the accuracy of determining the physique of the occupant from decreasing, even if, for example, the occupant's posture collapses while the vehicle is running.
  • the situation information may be not only information about the situation of the occupant, but also information about the situation of the vehicle, for example. Note that in this case as well, even if a situation occurs in which only the occupant's face is not facing the front direction, this is not considered to be a posture collapse.
  • the physique determination device 100 may acquire information regarding the situation of the vehicle (hereinafter referred to as "vehicle information") as the situation information.
  • Vehicle information includes, for example, information regarding the steering angle of the vehicle, the vehicle speed, the opening/closing status of the vehicle doors, the ignition status of the vehicle, or the operation status of in-vehicle devices (e.g., a car navigation device or an interior lamp).
  • FIG. 5 is a diagram showing a configuration example of the physique determination device 100 in the first embodiment when vehicle information is used as situation information.
  • The configuration example of the physique determination device 100 shown in FIG. 5 differs from that shown in FIG. 1 in that the situation information acquisition unit 103 includes the vehicle information acquisition unit 1033.
  • Further, in the physique determination device 100 shown in FIG. 5, the specific operation of the normal seating determination unit 104 differs from that of the normal seating determination unit 104 in the physique determination device 100 shown in FIG. 1.
  • The specific operations of the image acquisition unit 101, the skeletal point extraction unit 102, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture warning unit 108 are the same as those already explained, so repeated explanation will be omitted.
  • the vehicle information acquisition unit 1033 acquires vehicle information from various devices installed in the vehicle. For example, the vehicle information acquisition unit 1033 may acquire information regarding the steering angle status from the steering angle sensor. Further, for example, the vehicle information acquisition unit 1033 may acquire the vehicle speed from a vehicle speed sensor. Further, for example, the vehicle information acquisition unit 1033 may acquire information regarding the opening/closing status of the door from a sensor provided on the door. Further, for example, the vehicle information acquisition unit 1033 may acquire information regarding the ignition status from the ignition sensor. Further, for example, the vehicle information acquisition unit 1033 may acquire information regarding the operation status of the in-vehicle device from an in-vehicle device such as a car navigation device.
  • the situation information acquisition unit 103 acquires the vehicle information acquired by the vehicle information acquisition unit 1033 as situation information.
  • the situation information acquisition unit 103 outputs the acquired situation information, here vehicle information, to the normal seating determination unit 104.
  • The normal seating determination unit 104 determines whether the occupant's posture is the normal seating state based on the vehicle information acquired by the vehicle information acquisition unit 1033. For example, in this case, conditions such as the following ⟨Condition 4> to ⟨Condition 8> are defined as the conditions for determining normal seating.
  • ⁇ Condition 4> The vehicle steering angle must be within a preset range (hereinafter referred to as the "steering angle determination range").
  • ⟨Condition 5> The vehicle speed exceeds a preset vehicle speed (hereinafter referred to as the "vehicle speed determination range").
  • ⟨Condition 6> The vehicle doors are not open.
  • ⟨Condition 7> The elapsed time after the ignition was turned on must be within a preset time (hereinafter referred to as the "start determination time").
  • ⟨Condition 8> The in-vehicle device is not being operated.
  • The steering angle determination range in ⟨Condition 4> is set, for example, to a range of steering angles that cannot occur without bending the arm.
  • The vehicle speed determination range in ⟨Condition 5> is set, for example, to a vehicle speed at which the vehicle is assumed to be traveling on an expressway.
  • Regarding ⟨Condition 6>, for example, if a door of the vehicle is open, it is assumed that the occupant is getting in or out of the vehicle and is not properly seated.
  • When the vehicle information satisfies the conditions for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is the normal seating state. On the other hand, if the vehicle information does not satisfy the conditions for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is not the normal seating state.
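⟨Condition 4> to ⟨Condition 8> can be evaluated together as below; the dictionary keys and every threshold value are illustrative stand-ins for the preset ranges the text leaves unspecified:

```python
def is_normal_seating_by_vehicle_info(info,
                                      steering_range_deg=(-5.0, 5.0),
                                      vehicle_speed_kmh=80.0,
                                      start_determination_s=30.0):
    """Check <Condition 4> through <Condition 8> against a dict of
    vehicle information; returns True only when all conditions hold."""
    cond4 = steering_range_deg[0] <= info["steering_angle_deg"] <= steering_range_deg[1]
    cond5 = info["speed_kmh"] > vehicle_speed_kmh        # e.g. expressway travel
    cond6 = not info["door_open"]
    cond7 = info["time_since_ignition_s"] <= start_determination_s
    cond8 = not info["in_vehicle_device_operated"]
    return all((cond4, cond5, cond6, cond7, cond8))
```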
  • a flowchart showing the operation of the physique determination device 100 when the configuration example of the physique determination device 100 is as shown in FIG. 5 is the same as the flowchart shown in FIG. 3.
  • In step ST3, the situation information acquisition unit 103 acquires the vehicle information acquired by the vehicle information acquisition unit 1033 as the situation information.
  • the situation information acquisition unit 103 outputs the acquired situation information, here vehicle information, to the normal seating determination unit 104.
  • the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the vehicle information acquired by the vehicle information acquisition unit 1033 in step ST3.
  • the situation information is vehicle information
  • the physique determination device 100 includes a vehicle information acquisition unit 1033 that acquires vehicle information
  • The normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state using the vehicle information acquired by the vehicle information acquisition unit 1033. In this case as well, the physique determination device 100 calculates the physique determination feature value based on the captured image when the occupant's posture is in the normal seating state and determines the occupant's physique based on that feature value, so it is possible to prevent a decrease in the accuracy of determining the occupant's physique. That is, the physique determination device 100 can prevent the accuracy of determining the occupant's physique from decreasing even if the occupant's posture collapses while the vehicle is running.
  • the physique determination device 100 may acquire not only face orientation information but also skeletal point information, tilt information, or vehicle information as situation information.
  • devices other than the physique determination device 100 often have the function of detecting the direction of the occupant's face based on the captured image captured by the vehicle.
  • the occupant state detection device included in the DMS may have a function of detecting the direction of the occupant's face.
  • By using the face orientation information as the situation information, the physique determination device 100 can acquire the face orientation information from a device external to the physique determination device 100, and therefore does not need to have a function of detecting the face orientation itself.
  • the physique determination device 100 does not need to perform processing for detecting the occupant's face orientation, it is possible to reduce the processing load associated with determining the occupant's physique.
  • the physique determination device 100 does not need to add a function for acquiring situation information even if the skeletal point information is used as situation information.
  • detecting the position of the eyes or mouth based on the captured image has higher detection accuracy than detecting the position of the skeletal point based on the captured image. For example, determining whether the occupant is facing forward based on the direction of the occupant's face has higher determination accuracy than determining whether the occupant is facing forward based on the occupant's skeletal points.
  • therefore, the physique determination device 100 can determine whether the occupant's posture is in the normal seating state with higher accuracy when the face orientation information is used as the situation information than when the skeletal point information is used as the situation information.
  • the physique determination device 100 may acquire two or more of the face orientation information, the skeletal point information, the tilt information, and the vehicle information as the situation information. That is, in the physique determination device 100, the situation information acquisition unit 103 may include, for example, two or more of the face orientation detection unit 1031, the tilt detection unit 1032, and the vehicle information acquisition unit 1033.
  • the normal seating determination unit 104 determines whether the posture of the occupant is the normal seating state based on the situation information and according to the conditions for determining normal seating.
  • alternatively, the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state based on the situation information and a machine learning model (hereinafter referred to as the "normal seating determination model").
  • the normal seating determination model is a machine learning model that receives the situation information as input and outputs information indicating whether or not the occupant's posture is in the normal seating state.
  • the normal seating determination model is generated in advance by an administrator or the like and stored in a location that can be referenced by the physique determination device 100.
  • the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state by using both the normal seating determination conditions and the normal seating determination model, based on the situation information. For example, the normal seating determination unit 104 determines, based on the situation information, whether the occupant's posture is in the normal seating state according to the normal seating determination conditions (first determination). Further, the normal seating determination unit 104 determines, based on the situation information, whether the occupant's posture is in the normal seating state using the normal seating determination model (second determination).
  • the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state when both the first determination and the second determination determine that the posture is in the normal seating state, and determines that the occupant's posture is not in the normal seating state when either the first determination or the second determination determines that it is not. Alternatively, for example, whether the occupant's posture is in the normal seating state may be determined by weighting the seating state determination result of the first determination and the seating state determination result of the second determination.
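The weighted combination of the first (condition-based) determination and the second (model-based) determination described above can be sketched as follows. This is an illustrative example only: the face-angle threshold, the weights, and the model interface are assumptions and are not specified in the disclosure.

```python
# Illustrative sketch: combining a condition-based check (first determination)
# with a machine-learning-model score (second determination) by weighting.
# The face-angle threshold, weights, and model interface are assumptions.

def first_determination(face_yaw_deg, threshold_deg=20.0):
    """Condition-based check: the occupant is treated as normally seated
    when the face is within a threshold angle of the forward direction."""
    return abs(face_yaw_deg) <= threshold_deg

def second_determination(model, situation_info):
    """Model-based check: the normal seating determination model takes the
    situation information as input and outputs the probability that the
    occupant's posture is in the normal seating state."""
    return model.predict_proba(situation_info)

def is_normal_seating(face_yaw_deg, model, situation_info,
                      w_rule=0.5, w_model=0.5, decision_threshold=0.5):
    """Weight the two seating state determination results and decide."""
    rule_score = 1.0 if first_determination(face_yaw_deg) else 0.0
    model_score = second_determination(model, situation_info)
    return w_rule * rule_score + w_model * model_score >= decision_threshold
```

For a binary model output, raising `decision_threshold` above the larger of the two weights makes a positive result require both determinations to agree, which corresponds to the stricter behavior described above.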
  • in the above description, the physique determination device 100 includes the feature amount leveling unit 106, but this is only an example.
  • the physique determination device 100 may be configured without the feature amount leveling unit 106.
  • the physique determination unit 107 may determine the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit 105.
  • in this case, in the operation described using the flowchart of FIG. 3, the physique determination device 100 can omit the process of step ST6.
  • that is, the physique determination unit 107 may determine the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit 105 in step ST5.
  • in the above description, the physique determination device 100 includes the posture attention unit 108, but this is only an example.
  • the physique determination device 100 may be configured without the posture attention unit 108. Further, the physique determination device 100 does not need to be connected to the output device 600. In this case, in the operation of the physique determination device 100 described using the flowchart of FIG. 3, the physique determination device 100 can omit the process of step ST8.
  • the physique determination device 100 is connected to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500, but this is only an example.
  • the physique determination device 100 may be connected to only one of the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500, or may be connected to a device other than the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500 that performs various processes using the physique determination result.
  • the physique determination device 100 may be connected to a display device mounted on a vehicle. The display device performs display according to the physique determination result output from the physique determination device 100, for example. For example, if an infant is among the occupants, the display device displays an icon indicating that a child is in the vehicle.
  • a plurality of imaging devices 200 may be installed in a vehicle.
  • the plurality of imaging devices 200 may be installed so as to be able to image each occupant seated in each seat in the vehicle.
  • in this case, the physique determination device 100 acquires captured images from the plurality of imaging devices 200 and determines the physique of each occupant.
  • FIGS. 6A and 6B are diagrams illustrating an example of the hardware configuration of the physique determination device 100 according to the first embodiment.
  • the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are realized by a processing circuit 1001.
  • the physique determination device 100 includes a processing circuit 1001 for performing control to determine the physique of the vehicle occupant based on a captured image of the vehicle occupant.
  • the processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in memory as shown in FIG. 6B.
  • when the processing circuit 1001 is dedicated hardware, the processing circuit 1001 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • when the processing circuit is the processor 1004, the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in memory 1005.
  • the processor 1004 reads out and executes the program stored in the memory 1005, thereby realizing the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108.
  • the physique determination device 100 includes a memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1 to ST8 in FIG. 3 described above.
  • it can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108.
  • the memory 1005 is, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • for example, the function of the image acquisition unit 101 can be realized by the processing circuit 1001 as dedicated hardware, and the functions of the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 can be realized by the processor 1004 reading out and executing the program stored in the memory 1005.
  • the physique determination device 100 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the imaging device 200, the seatbelt control device 300, the airbag control device 400, the left-behind detection device 500, and the output device 600.
  • in the above, it is assumed that the physique determination device 100 is an in-vehicle device mounted on a vehicle, and that the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are included in the in-vehicle device. The configuration is not limited to this: some of the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may be mounted on the in-vehicle device of the vehicle while the others are provided in a server, or the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may all be provided in the server.
  • as described above, the physique determination device 100 according to the first embodiment includes: the image acquisition unit 101 that acquires a captured image of an occupant of a vehicle; the skeletal point extraction unit 102 that extracts, based on the captured image acquired by the image acquisition unit 101, skeletal points of the occupant indicating the body parts of the occupant; the normal seating determination unit 104 that determines, based on situation information regarding the situation of the vehicle or the occupant, whether the occupant's posture is in the normal seating state; the feature amount calculation unit 105 that determines, based on the information regarding the occupant's skeletal points extracted by the skeletal point extraction unit 102 and the seating state determination result of the normal seating determination unit 104, whether the extracted skeletal points are normal seating skeletal points extracted based on an image captured with the occupant in the normal seating state, and, when determining that they are normal seating skeletal points, calculates the physique determination feature amount based on the information regarding the normal seating skeletal points; and the physique determination unit 107 that determines the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit 105.
  • since the physique determination device 100 calculates the physique determination feature amount based on a captured image taken when the occupant's posture is in the normal seating state, and determines the occupant's physique based on that feature amount, it is possible to prevent a decrease in the accuracy of determining the occupant's physique. That is, even if the occupant's posture collapses, the physique determination device 100 can prevent the accuracy of determining the occupant's physique from decreasing.
  • any component of the embodiments can be modified or any component of the embodiments can be omitted.
  • a physique determination device calculates a physique determination feature amount based on a captured image when the posture state of an occupant is in a normally seated state, and determines the physique of the occupant based on the physique determination feature amount. Therefore, it is possible to prevent a decrease in the accuracy of determining the occupant's physique.
  • 100 physique determination device, 101 image acquisition unit, 102 skeletal point extraction unit, 103 situation information acquisition unit, 1031 face orientation detection unit, 1032 tilt detection unit, 1033 vehicle information acquisition unit, 104 normal seating determination unit, 105 feature amount calculation unit, 106 feature amount leveling unit, 107 physique determination unit, 108 posture attention unit, 200 imaging device, 300 seatbelt control device, 400 airbag control device, 500 left-behind detection device, 600 output device, 1001 processing circuit, 1002 input interface device, 1003 output interface device, 1004 processor, 1005 memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Seats For Vehicles (AREA)
  • Image Processing (AREA)

Abstract

The present invention comprises: a skeletal point extraction unit (102) that extracts skeletal points of a vehicle occupant on the basis of a captured image of the vehicle occupant; a normal sitting determination unit (104) that determines, on the basis of situation information relating to the situation of the vehicle or the vehicle occupant, whether the posture of the vehicle occupant is in a normal sitting state; a feature quantity calculation unit (105) that determines whether or not the skeletal points of the vehicle occupant that were extracted by the skeletal point extraction unit (102) are normal sitting skeletal points extracted on the basis of a captured image of the vehicle occupant in a normal sitting state, and, if the skeletal points of the vehicle occupant that were extracted by the skeletal point extraction unit (102) are determined to be the normal sitting skeletal points, calculates a feature quantity for physique determination on the basis of information relating to the normal sitting skeletal points; and a physique determination unit (107) that determines the physique of the vehicle occupant on the basis of the feature quantity for physique determination.

Description

Physique determination device and physique determination method
The present disclosure relates to a physique determination device and a physique determination method for a vehicle occupant.
Conventionally, there is known a technique for determining the posture of a vehicle occupant based on a captured image of the occupant and determining the occupant's physique from the determined posture.
For example, Patent Document 1 discloses an indoor monitoring device that estimates the posture of an occupant based on a physique estimation index including the coordinates of a plurality of feature points of the occupant extracted by deep learning or the like from a captured image of the vehicle interior, and estimates the occupant's physique based on the estimated posture.
JP 2020-104680 A
When an occupant is seated with a collapsed posture, countless patterns of posture collapse are conceivable. It is therefore difficult to cover all of these patterns, and compared with the case where the occupant is seated without posture collapse, the occupant's posture determined based on the captured image tends to include errors, or those errors become larger. As a result, the accuracy of physique determination decreases.
Since conventional techniques such as the one disclosed in Patent Document 1 do not take this into consideration, there is a problem that the accuracy of determining the occupant's physique may decrease when the occupant's posture collapses.
The present disclosure has been made to solve the above problem, and an object of the present disclosure is to provide a physique determination device that prevents a decrease in the accuracy of determining an occupant's physique even if the occupant's posture collapses.
A physique determination device according to the present disclosure includes: an image acquisition unit that acquires a captured image of an occupant of a vehicle; a skeletal point extraction unit that extracts, based on the captured image acquired by the image acquisition unit, skeletal points of the occupant indicating the body parts of the occupant; a normal seating determination unit that determines, based on situation information regarding the situation of the vehicle or the occupant, whether the occupant's posture is in a normal seating state; a feature amount calculation unit that determines, based on information regarding the occupant's skeletal points extracted by the skeletal point extraction unit and the seating state determination result of the normal seating determination unit, whether the extracted skeletal points are normal seating skeletal points extracted based on an image captured with the occupant in the normal seating state, and, when determining that they are normal seating skeletal points, calculates a physique determination feature amount based on information regarding the normal seating skeletal points; and a physique determination unit that determines the occupant's physique based on the physique determination feature amount calculated by the feature amount calculation unit.
According to the present disclosure, the physique determination device can prevent a decrease in the accuracy of determining an occupant's physique even if the occupant's posture collapses.
FIG. 1 is a diagram showing a configuration example of a system in which the physique determination device according to the first embodiment is used.
FIG. 2 is a diagram showing a configuration example of the physique determination device according to the first embodiment.
FIG. 3 is a flowchart for explaining an example of the operation of the physique determination device according to the first embodiment.
FIG. 4 is a diagram showing a configuration example of the physique determination device in the first embodiment when tilt information is used as the situation information.
FIG. 5 is a diagram showing a configuration example of the physique determination device in the first embodiment when vehicle information is used as the situation information.
FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the physique determination device according to the first embodiment.
Embodiments of the present disclosure will be described in detail below with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram showing a configuration example of a system in which the physique determination device 100 according to the first embodiment is used.
In the system shown in FIG. 1, the physique determination device 100 according to the first embodiment is connected to an imaging device 200, a seatbelt control device 300, an airbag control device 400, a left-behind detection device 500, and an output device 600.
In the first embodiment, it is assumed that the physique determination device 100 is mounted on a vehicle.
The seatbelt control device 300, the airbag control device 400, the left-behind detection device 500, and the output device 600 are also mounted on the vehicle.
The imaging device 200 is, for example, a near-infrared camera or a visible light camera, and images an occupant of the vehicle. The imaging device 200 may be shared with, for example, an imaging device of a so-called "Driver Monitoring System (DMS)" mounted on the vehicle to monitor the state of the driver in the vehicle.
The imaging device 200 is installed so as to be able to image a range within the vehicle that includes at least the range where the upper body of a vehicle occupant should be present. The range where the upper body of an occupant should be present is, for example, a range corresponding to the space near the seat backrest and in front of the headrest.
For example, the imaging device 200 is installed at the center of the instrument panel of the vehicle so as to be able to image the front seats including the driver's seat and the front passenger seat, in other words, so as to be able to image the driver and the occupant of the front passenger seat (hereinafter referred to as the "front passenger"). Note that this is merely an example; the imaging device 200 may be installed so as to be able to image only the driver, or so as to be able to image an occupant in the rear seat.
The physique determination device 100 according to the first embodiment determines the physique of a vehicle occupant based on a captured image of the occupant taken by the imaging device 200.
At that time, the physique determination device 100 determines whether the occupant's posture is in the normal seating state, and determines the occupant's physique based on a captured image taken with the occupant in the normal seating state.
In the first embodiment, "normal seating" means that the occupant is seated on the seat in a normal state without posture collapse. In the first embodiment, a state in which the occupant's face is not facing forward is also regarded as posture collapse. When the occupant's face is not facing forward, it is estimated that there is a high possibility that the occupant's whole body is not facing forward.
When the physique determination device 100 determines the physique of an occupant, it outputs the result of determining the occupant's physique (hereinafter referred to as the "physique determination result") to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500. The physique determination device 100 also outputs, to the output device 600, information for outputting a message or the like urging the occupant to sit in the normal seating position.
Details of a configuration example of the physique determination device 100 will be described later with reference to FIG. 2.
In the following first embodiment, a vehicle occupant is also simply referred to as an "occupant."
The seatbelt control device 300 controls the seatbelt. For example, the seatbelt control device 300 has a seatbelt reminder function that outputs a warning in consideration of the occupant's physique based on the physique determination result output from the physique determination device 100.
The airbag control device 400 is, for example, an ECU (Electronic Control Unit) for an airbag system, and controls the airbag based on the physique determination result output from the physique determination device 100.
The left-behind detection device 500 detects that an infant or the like has been left behind in the vehicle. For example, the left-behind detection device 500 determines whether an infant or the like has been left behind in the vehicle based on the physique determination result output from the physique determination device 100.
The output device 600 is, for example, a display device or an audio output device provided in the center cluster, the meter panel, or the like. For example, the output device 600 displays a message urging the occupant to sit in the normal seating position, outputs the message by voice, or outputs a warning sound urging the occupant to sit in the normal seating position.
A configuration example of the physique determination device 100 according to the first embodiment will now be described.
FIG. 2 is a diagram showing a configuration example of the physique determination device 100 according to the first embodiment.
The physique determination device 100 includes an image acquisition unit 101, a skeletal point extraction unit 102, a situation information acquisition unit 103, a normal seating determination unit 104, a feature amount calculation unit 105, a feature amount leveling unit 106, a physique determination unit 107, and a posture attention unit 108. The situation information acquisition unit 103 includes a face orientation detection unit 1031.
The image acquisition unit 101 acquires a captured image of an occupant captured by the imaging device 200.
The image acquisition unit 101 outputs the acquired captured image to the skeletal point extraction unit 102.
The skeletal point extraction unit 102 extracts skeletal points of the occupant, which indicate the body parts of the occupant, based on the captured image acquired by the image acquisition unit 101. Specifically, the skeletal point extraction unit 102 extracts skeletal points of the occupant indicating joint points determined for each body part of the occupant, based on the captured image acquired by the image acquisition unit 101. The joint points determined for each body part of the occupant are, for example, the joint points of the nose, neck, shoulders, elbows, waist, wrists, knees, and ankles. These joint points are defined in advance.
The skeletal point extraction unit 102 may extract the skeletal points of the occupant by a known method using a known technique, such as an image recognition technique or a technique using a trained model (hereinafter referred to as a "machine learning model").
A skeletal point is, for example, a point in the captured image and is expressed by coordinates in the captured image. The skeletal point extraction unit 102 detects the coordinates of each skeletal point of the occupant and which body part of the occupant the skeletal point indicates.
Note that the skeletal point extraction unit 102 extracts skeletal points for each occupant.
For example, in the captured image, an area corresponding to each seat (hereinafter referred to as a "seat corresponding area") is set in advance for each seat. The seat corresponding areas are set in advance according to the installation position and the angle of view of the imaging device 200.
For example, the skeletal point extraction unit 102 extracts skeletal points for each seat corresponding area. For example, if a certain seat corresponding area corresponds to the driver's seat, the skeletal point extraction unit 102 regards the skeletal points extracted in that area as the driver's skeletal points. Similarly, if a certain seat corresponding area corresponds to the front passenger seat, the skeletal point extraction unit 102 regards the skeletal points extracted in that area as the front passenger's skeletal points. In this way, the skeletal point extraction unit 102 extracts the skeletal points of each occupant by, for example, extracting skeletal points from each seat corresponding area.
The skeletal point extraction unit 102 outputs information regarding the extracted skeletal points (hereinafter referred to as "skeletal point information") to the situation information acquisition unit 103.
The skeletal point information includes, for each occupant, information indicating the occupant, information indicating the coordinates of the occupant's skeletal points, and information indicating which part of the occupant's body each skeletal point indicates, associated with one another; it also includes the captured image acquired by the image acquisition unit 101.
In the first embodiment, the information indicating the occupant is, for example, information indicating the seat in which the occupant is seated.
When the skeletal point extraction unit 102 extracts the skeletal points of the occupant, in other words, when the skeletal point information is output from the skeletal point extraction unit 102, the situation information acquisition unit 103 acquires information regarding the situation of the occupant (hereinafter referred to as "situation information"). The situation information is, for example, information regarding the direction of the occupant's face.
The face orientation detection unit 1031 detects the occupant's face orientation based on the captured image acquired by the image acquisition unit 101. The captured image acquired by the image acquisition unit 101 is included in the skeletal point information.
The face orientation detection unit 1031 may detect the occupant's face orientation by a known method using a known technique, such as an image recognition technique or a technique using a machine learning model. The occupant's face orientation is expressed, for example, as an angle.
Note that the face orientation detection unit 1031 detects the face orientation for each occupant.
The situation information acquisition unit 103 acquires information regarding the occupant's face orientation detected by the face orientation detection unit 1031 (hereinafter referred to as "face orientation information") as the situation information. The face orientation information includes information in which information indicating the occupant and information indicating the face orientation are associated with each other. The information indicating the occupant is, for example, information indicating the seat in which the occupant is seated. The face orientation detection unit 1031 may determine, for example, from the seat-corresponding area which seat the occupant is seated in. In addition to these, the face orientation information may include the captured image acquired by the image acquisition unit 101.
The situation information acquisition unit 103 outputs the acquired situation information, here the face orientation information, to the normal seating determination unit 104.
The normal seating determination unit 104 determines, based on the situation information acquired by the situation information acquisition unit 103, whether the occupant's posture is in a normal seating state.
Specifically, here, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the face orientation information.
Note that the normal seating determination unit 104 makes this determination for each occupant. The normal seating determination unit 104 can identify the face orientation of each occupant from the face orientation information.
An example of how the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state will now be described.
For example, based on the situation information, here the face orientation information, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state in accordance with a preset condition for making that determination (hereinafter referred to as the "normal seating determination condition").
The normal seating determination condition defines, for example, a condition under which the occupant's posture is regarded as being in the normal seating state. The normal seating determination condition is generated in advance by an administrator or the like and stored in a location that the physique determination device 100 can reference.
The normal seating determination condition defines, for example, a condition such as <Condition 1> below.

<Condition 1>
The occupant's face is oriented in the frontal direction.
If the occupant's face orientation is within a predetermined range relative to the front (referred to as the "face orientation determination range"), the occupant's face is regarded as being oriented in the frontal direction.

Note that the face orientation determination range in <Condition 1> is set in advance to an appropriate range within which the face is regarded as facing the frontal direction.
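A minimal sketch of the <Condition 1> check follows. The patent only states that the face orientation is expressed as an angle and compared against a preset range; the use of separate yaw/pitch angles and the specific degree values below are assumptions made for illustration.

```python
# Hedged sketch: the occupant is regarded as facing the frontal direction
# when the detected face angles fall inside a preset face orientation
# determination range. The range values are illustrative assumptions.
FACE_RANGE_DEG = {"yaw": (-20.0, 20.0), "pitch": (-15.0, 15.0)}

def is_facing_front(yaw_deg, pitch_deg, face_range=FACE_RANGE_DEG):
    """Return True if the face orientation satisfies <Condition 1>."""
    yaw_ok = face_range["yaw"][0] <= yaw_deg <= face_range["yaw"][1]
    pitch_ok = face_range["pitch"][0] <= pitch_deg <= face_range["pitch"][1]
    return yaw_ok and pitch_ok
```

Narrowing or widening `FACE_RANGE_DEG` corresponds to the trade-off between determination frequency and accuracy discussed later in connection with the leveling method.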
Based on the face orientation information, if the occupant's face orientation satisfies the normal seating determination condition, the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state.
On the other hand, if the occupant's face orientation does not satisfy the normal seating determination condition, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
The normal seating determination unit 104 outputs the result of the determination as to whether the occupant's posture is in the normal seating state (hereinafter referred to as the "seating state determination result") to the feature amount calculation unit 105. At this time, the normal seating determination unit 104 outputs the skeletal point information output from the skeletal point extraction unit 102 to the feature amount calculation unit 105 together with the seating state determination result.
The normal seating determination unit 104 also outputs the seating state determination result to the posture warning unit 108.
Based on the skeletal point information output from the skeletal point extraction unit 102 and the seating state determination result from the normal seating determination unit 104 as to whether the occupant's posture is in the normal seating state, the feature amount calculation unit 105 determines whether the occupant's skeletal points extracted by the skeletal point extraction unit 102 are skeletal points extracted from a captured image taken while the occupant was in the normal seating state (hereinafter referred to as "normal seating skeletal points").
For example, when the seating state determination result output from the normal seating determination unit 104 indicates that the occupant's posture is in the normal seating state, the feature amount calculation unit 105 determines that the skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points. That is, the feature amount calculation unit 105 determines that the skeletal point information output from the skeletal point extraction unit 102 is normal seating skeletal point information.
Note that in the first embodiment, the feature amount calculation unit 105 acquires the skeletal point information output from the skeletal point extraction unit 102 via the normal seating determination unit 104, but this is only an example. For example, the feature amount calculation unit 105 may acquire the skeletal point information directly from the skeletal point extraction unit 102. In that case, in FIG. 1, the arrow from the skeletal point extraction unit 102 would be drawn to the feature amount calculation unit 105.
Then, when the feature amount calculation unit 105 determines that the occupant's skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points, it calculates, based on the skeletal point information of those normal seating skeletal points (hereinafter referred to as "normal seating skeletal point information"), a feature amount used for determining the occupant's physique (hereinafter referred to as a "physique determination feature amount").
The feature amount calculation unit 105 calculates a physique determination feature amount for each occupant. The feature amount calculation unit 105 can identify the skeletal points of each occupant from the normal seating skeletal point information.
Note that if the feature amount calculation unit 105 does not determine that the occupant's skeletal points extracted by the skeletal point extraction unit 102 are normal seating skeletal points, it does not calculate the physique determination feature amount.
For example, based on the normal seating skeletal point information, the feature amount calculation unit 105 calculates the physique determination feature amount from the length, in the captured image, of a line segment connecting two of the skeletal points in the captured image.
Specifically, for example, based on the normal seating skeletal point information, the feature amount calculation unit 105 calculates the occupant's arm length, shoulder width, and neck length in the captured image as physique determination feature amounts.
For example, the feature amount calculation unit 105 may take the length of the line segment connecting the skeletal point indicating a shoulder and the skeletal point indicating the elbow as the occupant's arm length used as a physique determination feature amount. Likewise, it may take, as the shoulder width used as a physique determination feature amount, the length of the line segment connecting the skeletal points indicating both shoulders, the length of the line segment connecting the skeletal point indicating the neck and the skeletal point indicating the right shoulder, or the length of the line segment connecting the skeletal point indicating the neck and the skeletal point indicating the left shoulder. It may also take, as the neck length used as a physique determination feature amount, the length of the line segment connecting the skeletal point indicating the nose and the skeletal point indicating the neck.
Note that this is only an example; the feature amount calculation unit 105 need not calculate all of the occupant's arm length, shoulder width, and neck length as physique determination feature amounts, and may calculate at least one of the arm length, shoulder width, and neck length. The feature amount calculation unit 105 may also use, as a physique determination feature amount, the length of a line segment connecting two skeletal points other than those defining the occupant's arm length, shoulder width, and neck length.
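The feature amounts above reduce to Euclidean distances between pairs of skeletal points in image coordinates. A minimal sketch, in which the skeletal-point names and the coordinate format are assumptions made for illustration:

```python
import math

# Hedged sketch: physique determination feature amounts computed as the
# pixel lengths of line segments between named skeletal points. The point
# names ("nose", "neck", ...) are illustrative, not from the patent.
def segment_length(points, part_a, part_b):
    """Length in the image of the segment between two skeletal points."""
    (xa, ya), (xb, yb) = points[part_a], points[part_b]
    return math.hypot(xb - xa, yb - ya)

def physique_features(points):
    return {
        "arm_length": segment_length(points, "right_shoulder", "right_elbow"),
        "shoulder_width": segment_length(points, "left_shoulder", "right_shoulder"),
        "neck_length": segment_length(points, "nose", "neck"),
    }

pts = {"nose": (100, 50), "neck": (100, 90),
       "left_shoulder": (70, 100), "right_shoulder": (130, 100),
       "right_elbow": (160, 140)}
feats = physique_features(pts)
```

Because the lengths are measured in the captured image, they are meaningful for physique determination only when the occupant is in the normal seating state, which is exactly why the normal-seating check precedes this step.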
The feature amount calculation unit 105 outputs information regarding the calculated physique determination feature amounts (hereinafter referred to as "feature amount information") to the feature amount leveling unit 106.
In the feature amount information, for each occupant, information indicating the occupant is associated with the physique determination feature amounts. The information indicating the occupant is, for example, information indicating the seat in which the occupant is seated.
The feature amount leveling unit 106 levels the plurality of physique determination feature amounts for a preset past period (hereinafter referred to as the "leveling period"), in other words, the plurality of physique determination feature amounts calculated by the feature amount calculation unit 105 going back over the leveling period.
The leveling period can be of any appropriate length. The leveling period is set by an administrator or the like and stored in a location that the feature amount leveling unit 106 can reference.
For example, the feature amount leveling unit 106 stores the feature amount information output from the feature amount calculation unit 105 in time series, in association with the date and time at which the feature amount information was acquired, in a location that the feature amount leveling unit 106 can reference. The feature amount leveling unit 106 may then acquire the plurality of physique determination feature amounts for the leveling period from the stored feature amount information. Note that when physique determination feature amounts for the full leveling period have not yet been stored, for example immediately after the vehicle starts running, the feature amount leveling unit 106 may treat whatever physique determination feature amounts have been stored as those for the leveling period.
The feature amount information stored by the feature amount calculation unit 105 is deleted, for example, when the power of the vehicle is turned on.
The feature amount leveling unit 106 levels the plurality of physique determination feature amounts for the leveling period, for example, by averaging them, taking their median, or taking the value corresponding to a set percentile.
That is, the feature amount leveling unit 106 calculates, as the leveled physique determination feature amount (hereinafter referred to as the "leveled physique determination feature amount"), for example, the mean of the plurality of physique determination feature amounts for the leveling period, their median, or the value of those feature amounts corresponding to the set percentile.
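The three leveling methods named above can be sketched directly with the Python standard library; the function and parameter names here are illustrative assumptions, not the patented implementation.

```python
import statistics

# Hedged sketch of the leveling step: combine the physique determination
# feature values collected over the leveling period into one value by
# mean, median, or a set percentile.
def level_features(values, method="mean", percentile=50):
    if method == "mean":
        return statistics.fmean(values)
    if method == "median":
        return statistics.median(values)
    if method == "percentile":
        # quantiles(..., n=100) yields the 1st..99th percentile cut points
        return statistics.quantiles(values, n=100)[percentile - 1]
    raise ValueError(f"unknown leveling method: {method}")

window = [51.0, 49.0, 50.0, 80.0, 50.0]  # one outlier frame in the period
```

With the sample window, the outlier frame (80.0) pulls the mean away from the occupant's typical value, while the median suppresses it, which mirrors the robustness trade-off discussed below.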
Note that the feature amount leveling unit 106 calculates the leveled physique determination feature amount for each occupant. The feature amount leveling unit 106 can identify the physique determination feature amounts of each occupant from the feature amount information.
In addition, when the feature amount calculation unit 105 calculates plural types of physique determination feature amounts, such as the occupant's arm length, shoulder width, and neck length, the feature amount leveling unit 106 levels each type of physique determination feature amount separately.
By having the feature amount leveling unit 106 level the physique determination feature amounts, even if, for example, a physique determination feature amount deviating from the feature amount corresponding to the occupant's true physique is temporarily calculated, the physique determination device 100 can reduce the influence of that feature amount on the determination of the occupant's physique using the physique determination feature amount (specifically, the leveled physique determination feature amount). Note that the determination of the occupant's physique is performed by the physique determination unit 107. Details of the physique determination unit 107 will be described later.
For example, when the feature amount leveling unit 106 levels the plurality of physique determination feature amounts for the leveling period by averaging them, it can obtain an accurate value, in other words, a value closer to the feature amount corresponding to the occupant's true physique, as the leveled physique determination feature amount even when the number of those feature amounts is small. Therefore, in this case, for example, an administrator or the like can narrow the face orientation determination range defined in the normal seating determination condition that the normal seating determination unit 104 uses to determine whether the occupant's posture is in the normal seating state. Narrowing the face orientation determination range is expected to lower the frequency with which the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state; even so, the feature amount leveling unit 106 can calculate an accurate leveled physique determination feature amount from the resulting small number of physique determination feature amounts obtained when the occupant's posture was determined to be in the normal seating state.
As a result, the physique determination device 100 can prevent a decrease in the accuracy of determining the occupant's physique using the leveled physique determination feature amount.
Also, for example, when the feature amount leveling unit 106 levels the plurality of physique determination feature amounts for the leveling period by taking their median or the value corresponding to a set percentile, it can obtain an accurate value, in other words, a value closer to the feature amount corresponding to the occupant's true physique, as the leveled physique determination feature amount from a large number of physique determination feature amounts. Therefore, in this case, for example, an administrator or the like can widen the face orientation determination range defined in the normal seating determination condition that the normal seating determination unit 104 uses, so that the normal seating determination unit 104 more readily determines that the occupant's posture is in the normal seating state.
Making it easier for the normal seating determination unit 104 to determine that the occupant's posture is in the normal seating state is expected to lower the accuracy of determining the occupant's normal seating state; even so, the feature amount leveling unit 106 can calculate an accurate leveled physique determination feature amount from the many physique determination feature amounts obtained when the occupant's posture was determined to be in the normal seating state. As a result, the physique determination device 100 can prevent a decrease in the accuracy of determining the occupant's physique using the leveled physique determination feature amount.
The feature amount leveling unit 106 outputs information regarding the leveled physique determination feature amount (hereinafter referred to as "leveled feature amount information") to the physique determination unit 107.
In the leveled feature amount information, for each occupant, information indicating the occupant is associated with the leveled physique determination feature amount. The information indicating the occupant is, for example, information indicating the seat in which the occupant is seated.
The physique determination unit 107 determines the occupant's physique based on the leveled physique determination feature amount leveled by the feature amount leveling unit 106.
Note that the physique determination unit 107 determines the physique of each occupant. The physique determination unit 107 can identify the leveled physique determination feature amount of each occupant from the leveled feature amount information.
The physique determination unit 107 may determine the occupant's physique by a known method using a known technique, such as an image recognition technique or a technique using a machine learning model.
The physique determination unit 107 determines, by a known method, the occupant's physique as, for example, one of "infant", "small", "standard", and "large". Note that this is only an example, and the definitions of the physique classes determined by the physique determination unit 107 can be set as appropriate.
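One simple way to realize such a classification, sketched here purely for illustration, is thresholding on a leveled feature. The patent leaves the method open (image recognition, a machine learning model, etc.); the thresholds and the use of shoulder width below are assumptions of this sketch.

```python
# Hedged sketch: threshold-based physique classification from the leveled
# shoulder-width feature. Class names follow the text; the pixel
# thresholds are illustrative assumptions, not from the patent.
THRESHOLDS = [(45.0, "infant"), (60.0, "small"), (75.0, "standard")]

def classify_physique(leveled_shoulder_width):
    """Return the first class whose upper bound exceeds the feature."""
    for upper, label in THRESHOLDS:
        if leveled_shoulder_width < upper:
            return label
    return "large"
```

In practice the thresholds would depend on the camera geometry, which is why the seat-corresponding areas and leveling are set per installation.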
The physique determination unit 107 outputs the physique determination result to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.
In the physique determination result, for each occupant, information indicating the occupant is associated with information indicating the occupant's physique. The information indicating the occupant is, for example, information indicating the seat in which the occupant is seated.
When the state in which the occupant's posture is not in the normal seating state, as determined by the normal seating determination unit 104, has continued for a preset time (hereinafter referred to as the "posture abnormality determination time"), the posture warning unit 108 outputs information urging the occupant to assume the normal seating posture (hereinafter referred to as "posture caution information") to the output device 600.
Note that, from the seating state determination result output from the normal seating determination unit 104, the posture warning unit 108 can determine that the normal seating determination unit 104 has determined that the occupant's posture is not in the normal seating state.
For example, the posture warning unit 108 stores the seating state determination results output from the normal seating determination unit 104 in time series, in association with the date and time at which each seating state determination result was acquired, in a location that the posture warning unit 108 can reference. From the stored seating state determination results, the posture warning unit 108 can determine whether the state in which the occupant's posture is not in the normal seating state has continued for the posture abnormality determination time. Note that the seating state determination results stored by the posture warning unit 108 are deleted, for example, when the power of the vehicle is turned on.
Then, when the posture warning unit 108 determines that the state in which the occupant's posture is not in the normal seating state has continued for the posture abnormality determination time, it outputs the posture caution information to the output device 600.
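The persistence check above can be sketched over the stored time-series of seating state determination results. The duration value and the history format below are assumptions made for illustration; the patent only specifies that a preset posture abnormality determination time is used.

```python
# Hedged sketch of the posture-warning timing: caution information is
# emitted only when the "not normally seated" state has persisted for the
# posture abnormality determination time. Timestamps are in seconds.
ABNORMAL_DURATION_S = 5.0  # illustrative posture abnormality determination time

def should_warn(history, now, duration=ABNORMAL_DURATION_S):
    """history: list of (timestamp, is_normal_seating) in time order."""
    window = [ok for t, ok in history if now - t <= duration]
    # warn only if the window is non-empty and no result was normal seating
    return bool(window) and not any(window)

hist = [(0.0, True), (1.0, False), (2.5, False), (4.0, False), (6.0, False)]
```

A single normal-seating result inside the window resets the condition, so momentary posture changes do not trigger the caution output.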
By outputting the posture caution information, the posture warning unit 108 causes the output device 600 to, for example, display a message urging the occupant to sit in the normal posture, output that message as audio, or output a warning sound urging the occupant to sit in the normal posture.
By having the posture warning unit 108 output the posture caution information to urge the occupant to sit properly, the physique determination device 100 can cause the occupant to correct his or her posture. As a result, the physique determination device 100 makes it more likely that the occupant's posture will be determined to be in the normal seating state.
The operation of the physique determination device 100 according to the first embodiment will now be described.
FIG. 3 is a flowchart for explaining an example of the operation of the physique determination device 100 according to the first embodiment.
For example, the physique determination device 100 periodically repeats the operation shown in the flowchart of FIG. 3 while the vehicle is running.
Alternatively, for example, the physique determination device 100 may repeat the operation shown in the flowchart of FIG. 3 from when the power of the vehicle is turned on until the power of the vehicle is turned off.
The image acquisition unit 101 acquires a captured image of the occupant taken by the imaging device 200 (step ST1).
The image acquisition unit 101 outputs the acquired captured image to the skeletal point extraction unit 102.
The skeletal point extraction unit 102 extracts the occupant's skeletal points, which indicate parts of the occupant's body, based on the captured image acquired by the image acquisition unit 101 in step ST1 (step ST2).
The skeletal point extraction unit 102 outputs the skeletal point information to the situation information acquisition unit 103.
When the skeletal point extraction unit 102 extracts the occupant's skeletal points in step ST2, the situation information acquisition unit 103 acquires the situation information (step ST3).
In step ST3, the face orientation detection unit 1031 detects the occupant's face orientation based on the captured image acquired by the image acquisition unit 101 in step ST1. The situation information acquisition unit 103 acquires the face orientation information regarding the occupant's face orientation detected by the face orientation detection unit 1031 as the situation information.
The situation information acquisition unit 103 outputs the acquired situation information, here the face orientation information, to the normal seating determination unit 104.
The normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the situation information acquired by the situation information acquisition unit 103 in step ST3 (step ST4). Specifically, here, the normal seating determination unit 104 makes this determination based on the face orientation information.
The normal seating determination unit 104 outputs the seating state determination result to the feature amount calculation unit 105. At this time, the normal seating determination unit 104 also outputs the skeleton point information output from the skeleton point extraction unit 102 to the feature amount calculation unit 105 together with the seating state determination result.
The normal seating determination unit 104 further outputs the seating state determination result to the posture attention unit 108.
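The face-orientation test in step ST4 can be sketched as follows. This is a minimal illustration only: the patent defines the actual condition for determining normal seating elsewhere, and the yaw/pitch thresholds and the function name `is_normal_seating_by_face` are assumptions introduced here.

```python
def is_normal_seating_by_face(yaw_deg: float, pitch_deg: float,
                              yaw_limit: float = 20.0,
                              pitch_limit: float = 15.0) -> bool:
    """Return True if the occupant's face is regarded as facing forward.

    The limits are illustrative; the patent only states that a range
    regarded as 'facing forward' is preset.
    """
    return abs(yaw_deg) <= yaw_limit and abs(pitch_deg) <= pitch_limit
```

The determination result (True/False) corresponds to the seating state determination result passed to the feature amount calculation unit 105 and the posture attention unit 108.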
Based on the skeleton point information output from the skeleton point extraction unit 102 in step ST2 and on the seating state determination result obtained by the normal seating determination unit 104 in step ST4, that is, whether the occupant's posture is in the normal seating state, the feature amount calculation unit 105 determines whether the skeletal points of the occupant extracted in step ST2 are normal seating skeletal points.
Then, when the feature amount calculation unit 105 determines that the occupant's skeletal points extracted by the skeleton point extraction unit 102 in step ST2 are normal seating skeletal points, it calculates the feature amount for physique determination based on the normal seating skeletal point information of those skeletal points (step ST5).
The feature amount calculation unit 105 outputs the feature amount information to the feature amount leveling unit 106.
If the feature amount calculation unit 105 does not determine that the occupant's skeletal points extracted by the skeleton point extraction unit 102 in step ST2 are normal seating skeletal points, it does not calculate the feature amount for physique determination, and the physique determination device 100 ends the processing shown in the flowchart of FIG. 3.
On the basis of the normal seating skeletal point information output from the normal seating determination unit 104, the feature amount leveling unit 106 levels the plurality of physique determination feature amounts for the past leveling period, in other words, the plurality of physique determination feature amounts calculated by the feature amount calculation unit 105 going back over the leveling period (step ST6).
The feature amount leveling unit 106 outputs the leveled feature amount information to the physique determination unit 107.
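The leveling in step ST6 can be illustrated as a sliding-window average over the collected feature amounts. This is a sketch under assumptions: the patent does not fix the window length or the averaging method in this passage, and the class name `FeatureLeveler` is introduced here for illustration.

```python
from collections import deque


class FeatureLeveler:
    """Illustrative stand-in for the feature amount leveling unit 106.

    Keeps the physique determination feature amounts from the leveling
    period (here, a fixed-length window) and returns their mean.
    """

    def __init__(self, window: int = 30):
        # deque with maxlen drops the oldest value automatically,
        # mimicking 'going back over the leveling period'.
        self._buf = deque(maxlen=window)

    def add(self, feature: float) -> None:
        self._buf.append(feature)

    def leveled(self) -> float:
        if not self._buf:
            raise ValueError("no feature values collected yet")
        return sum(self._buf) / len(self._buf)
```

A simple mean is only one possible leveling method; a median or weighted average would fit the description equally well.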
The physique determination unit 107 determines the physique of the occupant based on the leveled physique determination feature amount obtained by the feature amount leveling unit 106 in step ST6 (step ST7).
The physique determination unit 107 outputs the physique determination result to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.
If the state in which the occupant's posture is not the normal seating state, as determined by the normal seating determination unit 104 in step ST4, has continued for the posture abnormality determination time, the posture attention unit 108 outputs posture caution information to the output device 600 (step ST8).
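Steps ST1 to ST8 above can be summarized in a single per-frame routine. The sketch below is illustrative only: every helper callable is a hypothetical stand-in for the corresponding unit, and the posture-warning path is simplified (the patent issues the warning only after the abnormal posture has persisted for the posture abnormality determination time).

```python
def process_frame(capture_image, extract_skeleton, get_situation,
                  is_normal_seating, calc_feature, leveler, judge_physique,
                  warn_posture):
    """One pass of the flowchart of FIG. 3 (hypothetical stand-ins)."""
    image = capture_image()                     # ST1: image acquisition unit 101
    skeleton = extract_skeleton(image)          # ST2: skeleton point extraction unit 102
    situation = get_situation(image, skeleton)  # ST3: situation information acquisition unit 103
    if not is_normal_seating(situation):        # ST4: normal seating determination unit 104
        warn_posture()                          # ST8 path (persistence check omitted here)
        return None                             # no physique feature this frame
    feature = calc_feature(skeleton)            # ST5: feature amount calculation unit 105
    leveler.add(feature)                        # ST6: feature amount leveling unit 106
    return judge_physique(leveler.leveled())    # ST7: physique determination unit 107
```

The returned physique determination result would then be supplied to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500.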
In this way, the physique determination device 100 extracts the skeletal points of the vehicle occupant based on a captured image of the occupant. The physique determination device 100 also determines whether the occupant's posture is in the normal seating state based on the situation information, here the face orientation information. Based on the skeleton point information and this determination result, the physique determination device 100 determines whether the extracted skeletal points of the occupant are normal seating skeletal points, and when it determines that they are, it calculates the feature amount for physique determination based on the normal seating skeletal point information. The physique determination device 100 then determines the occupant's physique based on the calculated physique determination feature amount.
As described above, when an occupant is seated with a collapsed posture, there are countless possible patterns of posture collapse. It is therefore difficult to cover all of them, and compared with the case where the occupant is seated normally, the occupant's posture determined based on the captured image is more likely to contain an error, or that error becomes larger. For example, the occupant's skeletal points extracted from the captured image are more likely to contain errors, or those errors become larger. In that case, the feature amounts calculated based on the skeletal points are also more likely to contain errors, or those errors become larger, and so, in turn, is the occupant's posture determined based on the feature amounts. As a result, the accuracy of the physique determination decreases.
Note that when the occupant is seated without a collapsed posture, that is, when the occupant is seated normally, the positions of the skeletal points, which are the feature points of the occupant on the captured image, are assumed to fall within a certain range. Therefore, when the occupant is seated normally, errors are unlikely to occur in the skeletal points extracted from the captured image or in the feature amounts calculated from them, and as a result errors in the posture determined based on the feature amounts are also unlikely. Even if an error does occur, its influence on the physique determination is small.
The occupant's physique determination result based on the captured image is used, for example, in functions such as seatbelt control, airbag control, and left-behind detection. Since these are safety-related functions, the physique determination results used by them are required to have high accuracy. Therefore, a low-accuracy physique determination result obtained while the occupant is seated with a collapsed posture cannot be used for seatbelt control, airbag control, or left-behind detection. If the physique determination device were to provide such a result to these functions, it could cause erroneous control or erroneous detection.
In contrast, as described above, the physique determination device 100 according to the first embodiment determines whether the occupant's posture is in the normal seating state, uses the feature amount based on the skeletal points extracted from a captured image taken while the occupant is in the normal seating state as the feature amount for physique determination, and determines the occupant's physique based on that feature amount. Because the physique determination device 100 calculates the physique determination feature amount from a captured image taken while the occupant is seated normally, it can prevent a decrease in the accuracy of the physique determination. That is, even if the occupant's posture collapses while the vehicle is running, the physique determination device 100 can prevent the accuracy of that occupant's physique determination from decreasing.
Furthermore, since the physique determination device 100 provides an occupant physique determination result whose accuracy is maintained in this way, it can supply a usable physique determination result even to functions such as seatbelt control, airbag control, and left-behind detection, for which high accuracy is required.
Note that in the first embodiment described above, the situation information acquisition unit 103 of the physique determination device 100 includes the face orientation detection unit 1031, and the face orientation detection unit 1031 detects the occupant's face orientation based on the captured image acquired by the image acquisition unit 101. However, this is only an example. For example, the situation information acquisition unit 103 may acquire the occupant's face orientation information from a device provided outside the physique determination device 100 at a location the physique determination device 100 can refer to, such as an occupant state detection device included in a DMS. In this case, it is assumed that the imaging device 200 is shared with the DMS.
In this case, the situation information acquisition unit 103 of the physique determination device 100 can be configured without the face orientation detection unit 1031.
Furthermore, in the first embodiment described above, the situation information acquired by the situation information acquisition unit 103 is information regarding the occupant's face orientation, but this is only an example. The physique determination device 100 may instead use, for example, the skeleton point information as the situation information.
In this case, the normal seating determination unit 104 may have the function of the situation information acquisition unit 103. That is, it is not essential for the physique determination device 100 shown in FIG. 1 to include the situation information acquisition unit 103.
In this case, the skeleton point extraction unit 102 outputs the skeleton point information to the normal seating determination unit 104.
The normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the skeleton point information output from the skeleton point extraction unit 102.
For example, in this case, a condition such as <Condition 2> below is defined as the condition for determining normal seating.

<Condition 2>
On the captured image, the position of the skeleton point indicating the occupant's elbow is not located within a preset range (hereinafter referred to as the "elbow position determination range").

Note that the elbow position determination range in <Condition 2> is set in advance, for example, to a predetermined range including the armrest or a predetermined range near the door of the vehicle.
Based on the skeleton point information, the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state when the position of the skeletal point indicating the occupant's elbow satisfies the condition for determining normal seating.
Conversely, when the position of the skeletal point indicating the occupant's elbow does not satisfy the condition for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
For example, if the position of the skeletal point indicating the occupant's elbow is located within the elbow position determination range, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
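The elbow check of <Condition 2> amounts to a point-in-rectangle test on the captured image. A minimal sketch, assuming a rectangular elbow position determination range with illustrative coordinates (the patent only states that a range covering, e.g., the armrest or the area near the door is preset):

```python
def satisfies_condition2(elbow_xy,
                         forbidden_box=((400, 250), (520, 400))):
    """<Condition 2>: the elbow skeleton point must NOT lie inside the
    elbow position determination range.

    forbidden_box = ((x_min, y_min), (x_max, y_max)) in image pixels;
    the coordinates here are illustrative assumptions.
    """
    (x_min, y_min), (x_max, y_max) = forbidden_box
    x, y = elbow_xy
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    return not inside  # normal seating requires the elbow to be outside
```

An elbow resting on the armrest (inside the box) fails the condition, so the posture is judged not to be the normal seating state.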
The operation of the physique determination device 100 when the situation information is the skeleton point information follows the flowchart shown in FIG. 3.
In step ST3, however, the normal seating determination unit 104 acquires, as situation information, the skeleton point information output by the skeleton point extraction unit 102 in step ST2.
Then, in step ST4, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the skeleton point information acquired in step ST3.
The other specific operations of the physique determination device 100 in steps ST1 to ST2 and steps ST5 to ST8 are as already described, and redundant explanations are therefore omitted.
In this way, the situation information may be the occupant's skeleton point information, and the normal seating determination unit 104 of the physique determination device 100 may determine whether the occupant's posture is in the normal seating state based on the occupant's skeleton point information extracted by the skeleton point extraction unit 102. In this case as well, the physique determination device 100 calculates the physique determination feature amount based on a captured image taken while the occupant is in the normal seating state and determines the occupant's physique based on that feature amount, so it can prevent a decrease in the accuracy of the physique determination. That is, even if the occupant's posture collapses while the vehicle is running, the physique determination device 100 can prevent the accuracy of that occupant's physique determination from decreasing.
Furthermore, in the first embodiment described above, the physique determination device 100 may acquire, as situation information, information regarding the inclination of a skeletal line connecting the occupant's skeletal points (hereinafter referred to as "inclination information"). In this case, even if only the occupant's face is not facing forward, this is not regarded as a posture collapse.
Here, FIG. 4 is a diagram illustrating a configuration example of the physique determination device 100 in the first embodiment when the inclination information is used as situation information.
The configuration example of the physique determination device 100 shown in FIG. 4 differs from that shown in FIG. 1 in that the situation information acquisition unit 103 includes an inclination detection unit 1032 instead of the face orientation detection unit 1031.
In the physique determination device 100 shown in FIG. 4, the specific operation of the normal seating determination unit 104 also differs from that in the physique determination device 100 shown in FIG. 1. The specific operations of the image acquisition unit 101, the skeleton point extraction unit 102, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are as already described, and redundant explanations are therefore omitted.
The inclination detection unit 1032 detects the inclination of a skeletal line, which is a line segment connecting a plurality of skeletal points of the occupant, based on the skeleton point information output from the skeleton point extraction unit 102.
For example, the inclination detection unit 1032 detects the inclination of the occupant's shoulder skeletal line, which connects the skeletal points indicating both shoulders of the occupant. This is only an example, and the inclination detection unit 1032 can detect the inclination of any appropriate skeletal line, such as a skeletal line indicating the torso (for example, one indicating sitting height). However, a collapse of the occupant's posture tends to appear in the inclination of the shoulder skeletal line, so by detecting that inclination, the inclination detection unit 1032 can reduce erroneous determinations by the normal seating determination unit 104 as to whether the occupant's posture is in the normal seating state.
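The inclination of the shoulder skeletal line can be computed as the angle of the segment joining the two shoulder skeleton points. A minimal sketch, assuming each skeleton point is given as (x, y) image coordinates (the point format and function name are assumptions introduced here):

```python
import math


def shoulder_inclination_deg(left_shoulder, right_shoulder):
    """Angle of the shoulder skeletal line relative to horizontal, in degrees.

    Inputs are (x, y) positions of the two shoulder skeleton points on
    the captured image; 0 degrees corresponds to level shoulders.
    """
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    # atan2 keeps the sign of the tilt (leaning left vs. right)
    return math.degrees(math.atan2(dy, dx))
```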
The situation information acquisition unit 103 acquires the inclination information as situation information. The inclination information includes information in which information identifying the occupant is associated with information indicating the inclination of the skeletal line. In addition, the inclination information may include the captured image acquired by the image acquisition unit 101.
The situation information acquisition unit 103 outputs the acquired situation information, here the inclination information, to the normal seating determination unit 104.
When the configuration of the physique determination device 100 is as shown in FIG. 4, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the inclination information regarding the inclination of the occupant's skeletal line detected by the inclination detection unit 1032.
For example, in this case, a condition such as <Condition 3> below is defined as the condition for determining normal seating.

<Condition 3>
The inclination of the occupant's skeletal line must be within a preset range (hereinafter referred to as the "inclination determination range").

Note that the inclination determination range in <Condition 3> is set in advance to, for example, a predetermined range based on the inclination of the skeletal line of an occupant regarded as facing forward.
Based on the inclination information, the normal seating determination unit 104 determines that the occupant's posture is in the normal seating state if the inclination of the occupant's skeletal line satisfies the condition for determining normal seating.
Conversely, based on the inclination information, if the inclination of the occupant's skeletal line does not satisfy the condition for determining normal seating, the normal seating determination unit 104 determines that the occupant's posture is not in the normal seating state.
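The <Condition 3> test then reduces to checking that the detected inclination falls within the inclination determination range. A minimal sketch, assuming the range is expressed as a tolerance around the forward-facing reference inclination (both values are illustrative; the patent only states that such a range is preset):

```python
def satisfies_condition3(inclination_deg: float,
                         reference_deg: float = 0.0,
                         tolerance_deg: float = 10.0) -> bool:
    """<Condition 3>: the skeletal-line inclination lies within the
    inclination determination range around the forward-facing reference."""
    return abs(inclination_deg - reference_deg) <= tolerance_deg
```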
The operation of the physique determination device 100 when its configuration is as shown in FIG. 4 follows the flowchart shown in FIG. 3.
In this case, however, in step ST3 the situation information acquisition unit 103 acquires the inclination information as situation information.
The situation information acquisition unit 103 then outputs the acquired situation information, here the inclination information, to the normal seating determination unit 104.
In step ST4, the normal seating determination unit 104 determines whether the occupant's posture is in the normal seating state based on the inclination information.
The other specific operations of the physique determination device 100 in steps ST1 to ST2 and steps ST5 to ST8 are as already described, and redundant explanations are therefore omitted.
In this way, the situation information may be the inclination information. The physique determination device 100 may include the inclination detection unit 1032, which detects the inclination of the occupant's skeletal line based on the information regarding the occupant's skeletal points extracted by the skeleton point extraction unit 102, and the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state based on the inclination detected by the inclination detection unit 1032. In this case as well, the physique determination device 100 calculates the physique determination feature amount based on a captured image taken while the occupant is in the normal seating state and determines the occupant's physique based on that feature amount, so it can prevent a decrease in the accuracy of the physique determination. That is, even if, for example, the occupant's posture collapses while the vehicle is running, the physique determination device 100 can prevent the accuracy of that occupant's physique determination from decreasing.
Furthermore, in the first embodiment described above, the situation information is not limited to information about the occupant's situation and may be, for example, information about the vehicle's situation. In this case as well, even if only the occupant's face is not facing forward, this is not regarded as a posture collapse.
For example, the physique determination device 100 may acquire information regarding the vehicle's situation (hereinafter referred to as "vehicle information") as situation information.
The vehicle information includes, for example, information regarding the steering angle of the vehicle, the vehicle speed, the open/closed state of the vehicle doors, the ignition state of the vehicle, or the operation state of an in-vehicle device (for example, a car navigation device or an interior lamp).
Here, FIG. 5 is a diagram illustrating a configuration example of the physique determination device 100 in the first embodiment when the vehicle information is used as situation information.
The configuration example of the physique determination device 100 shown in FIG. 5 differs from that shown in FIG. 1 in that the situation information acquisition unit 103 includes a vehicle information acquisition unit 1033 instead of the face orientation detection unit 1031.
In the physique determination device 100 shown in FIG. 5, the specific operation of the normal seating determination unit 104 also differs from that in the physique determination device 100 shown in FIG. 1.
The specific operations of the image acquisition unit 101, the skeleton point extraction unit 102, the feature amount calculation unit 105, the feature amount leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are as already described, and redundant explanations are therefore omitted.
The vehicle information acquisition unit 1033 acquires the vehicle information from various devices mounted on the vehicle. For example, the vehicle information acquisition unit 1033 may acquire information regarding the steering angle from a steering angle sensor, the vehicle speed from a vehicle speed sensor, information regarding the open/closed state of the doors from sensors provided on the doors, information regarding the ignition state from an ignition sensor, and information regarding the operation state of an in-vehicle device from the in-vehicle device itself, such as a car navigation device.
The situation information acquisition unit 103 acquires the vehicle information acquired by the vehicle information acquisition unit 1033 as situation information.
The situation information acquisition unit 103 outputs the acquired situation information, here the vehicle information, to the normal seating determination unit 104.
When the physique determination device 100 is configured as shown in FIG. 5, the normal seating determination unit 104 determines, based on the vehicle information acquired by the vehicle information acquisition unit 1033, whether the occupant's posture is in the normal seating state.
For example, in this case, conditions such as the following <Condition 4> to <Condition 8> are defined as the conditions for determining normal seating.

<Condition 4>
The steering angle of the vehicle is within a preset range (hereinafter referred to as the "steering angle determination range").

<Condition 5>
The vehicle speed exceeds a preset range (hereinafter referred to as "vehicle speed determination range").

<Condition 6>
The door of the vehicle is not open.

<Condition 7>
The elapsed time after the ignition is turned on is within a preset time (hereinafter referred to as the "start determination time").

<Condition 8>
The in-vehicle device is not being operated.

Note that the range for steering angle determination in <Condition 4> is set to, for example, a range of steering angles that cannot occur without bending the arm.
Furthermore, the vehicle speed determination range in <Condition 5> is set to, for example, a vehicle speed assuming that the vehicle is traveling on an expressway.
Regarding <Condition 6>, for example, if the door of the vehicle is open, it is assumed that the occupant is getting in or out of the vehicle, and that the occupant is not properly seated.
If the vehicle information satisfies the condition for determining normal seating, the normal seating determination unit 104 determines that the posture of the occupant is normal seating.
On the other hand, if the vehicle information does not satisfy the conditions for determining normal seating, the normal seating determination unit 104 determines that the posture of the occupant is not the normal seating state.
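To make the rule-based check concrete, the evaluation of <Condition 4> to <Condition 8> against the vehicle information can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the `VehicleInfo` structure, the threshold values, and the way the conditions are combined (the text leaves open whether every condition or any single condition must hold) are all assumptions.

```python
from dataclasses import dataclass

# Hypothetical threshold values; the embodiment only states that these
# ranges and times are preset (steering angle determination range,
# vehicle speed determination range, start determination time).
STEERING_ANGLE_RANGE_DEG = (-10.0, 10.0)
VEHICLE_SPEED_THRESHOLD_KMH = 80.0
START_DETERMINATION_TIME_SEC = 30.0


@dataclass
class VehicleInfo:
    steering_angle_deg: float
    speed_kmh: float
    door_open: bool
    sec_since_ignition_on: float
    device_operated: bool


def is_normal_seating(v: VehicleInfo, require_all: bool = False) -> bool:
    """Evaluate <Condition 4> to <Condition 8> on the vehicle information."""
    conditions = [
        STEERING_ANGLE_RANGE_DEG[0] <= v.steering_angle_deg <= STEERING_ANGLE_RANGE_DEG[1],  # <Condition 4>
        v.speed_kmh > VEHICLE_SPEED_THRESHOLD_KMH,                 # <Condition 5>: speed exceeds the preset range
        not v.door_open,                                           # <Condition 6>
        v.sec_since_ignition_on <= START_DETERMINATION_TIME_SEC,   # <Condition 7>
        not v.device_operated,                                     # <Condition 8>
    ]
    return all(conditions) if require_all else any(conditions)
```

With `require_all=False` the function mirrors the "or"-joined wording of claim 6; with `require_all=True` it requires every condition to hold, a stricter reading of "the vehicle information satisfies the normal seating determination conditions".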
A flowchart showing the operation of the physique determination device 100 when the configuration example of the physique determination device 100 is as shown in FIG. 5 is the same as the flowchart shown in FIG. 3.
However, when the physique determination device 100 is configured as shown in FIG. 5, in step ST3 the situation information acquisition unit 103 acquires the vehicle information acquired by the vehicle information acquisition unit 1033 as the situation information.
Then, the situation information acquisition unit 103 outputs the acquired situation information, here vehicle information, to the normal seating determination unit 104.
In step ST4, the normal seating determination unit 104 determines whether the occupant's posture is in a normal seating state based on the vehicle information acquired by the vehicle information acquisition unit 1033 in step ST3.
The other specific operations of the physique determination device 100 in steps ST1 to ST2 and steps ST5 to ST8 are the same as the operations already described, so redundant explanation is omitted.
In this way, the situation information may be vehicle information: the physique determination device 100 includes the vehicle information acquisition unit 1033 that acquires vehicle information, and the normal seating determination unit 104 may determine, based on the vehicle information acquired by the vehicle information acquisition unit 1033, whether the occupant's posture is in the normal seating state. In this case as well, the physique determination device 100 calculates the physique determination feature value based on a captured image taken when the occupant's posture is in the normal seating state and determines the occupant's physique based on that feature value, so a decrease in the accuracy of the occupant's physique determination can be prevented. That is, even if the occupant's posture collapses while the vehicle is traveling, the physique determination device 100 can prevent a decrease in the accuracy of the physique determination for that occupant.
In this way, the physique determination device 100 may acquire not only face orientation information but also skeletal point information, tilt information, or vehicle information as situation information.
However, devices other than the physique determination device 100 often also have a function of detecting the occupant's face orientation based on an image captured in the vehicle. As described above, for example, an occupant state detection device included in a DMS may have a function of detecting the occupant's face orientation. By treating face orientation information as the situation information, the physique determination device 100 can acquire that information from a device external to the physique determination device 100 and does not need to have a face orientation detection function of its own. Furthermore, since the physique determination device 100 does not need to perform processing for detecting the occupant's face orientation, the processing load of the occupant's physique determination can be reduced.
Note that the physique determination device 100 does not need to add a function for acquiring situation information even if the skeletal point information is used as situation information. However, in general, detecting the position of the eyes or mouth based on the captured image has higher detection accuracy than detecting the position of the skeletal point based on the captured image. For example, determining whether the occupant is facing forward based on the direction of the occupant's face has higher determination accuracy than determining whether the occupant is facing forward based on the occupant's skeletal points. Note that, generally, the direction of a person's face is detected using the position of the eyes or mouth.
Therefore, the physique determination device 100 can determine whether the occupant's posture is in the normal seating state more accurately when face orientation information is used as the situation information than when skeletal point information is used as the situation information.
Furthermore, in the first embodiment described above, the physique determination device 100 may acquire two or more of face orientation information, skeletal point information, tilt information, and vehicle information as the situation information.
That is, in the physique determination device 100, the situation information acquisition section 103 may include, for example, two or more of the face direction detection section 1031, the tilt detection section 1032, and the vehicle information acquisition section 1033.
Further, in the first embodiment described above, the normal seating determination unit 104 of the physique determination device 100 determined, based on the situation information and in accordance with the normal seating determination conditions, whether the occupant's posture is in the normal seating state; however, this is only an example.
For example, the normal seating determination unit 104 may determine whether the occupant's posture is in the normal seating state based on the situation information and a machine learning model (hereinafter referred to as the "normal seating determination model").
For example, the normal seating determination model is a machine learning model that receives situation information as input and outputs information indicating whether the occupant's posture is in the normal seating state.
The normal seating determination model is generated in advance by an administrator or the like and stored in a location that the physique determination device 100 can reference.
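As an illustration of such a model's interface, the sketch below stands in for the normal seating determination model with a hand-weighted logistic scorer over binary situation features. The feature names, weights, and model family are all hypothetical — the embodiment specifies only that the model takes situation information as input and outputs whether the posture is the normal seating state:

```python
import math

# Illustrative hand-picked weights; a real normal seating determination
# model would be trained offline by an administrator and stored where
# the physique determination device can reference it.
WEIGHTS = {"face_forward": 2.0, "trunk_upright": 1.5, "door_closed": 0.5}
BIAS = -2.0


def model_predict(features: dict) -> bool:
    """Return True when the stand-in model judges the occupant to be
    normally seated; `features` maps feature names to 0/1 indicators
    derived from the situation information."""
    score = BIAS + sum(w * features.get(name, 0) for name, w in WEIGHTS.items())
    probability = 1.0 / (1.0 + math.exp(-score))  # logistic link
    return probability >= 0.5
```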
Further, in the first embodiment described above, the normal seating determination unit 104 may determine whether the occupant is in the normal seating state by using both the normal seating determination conditions and the normal seating determination model, based on the situation information.
For example, the normal seating determination unit 104 makes a determination (first determination) of whether the occupant's posture is in the normal seating state based on the situation information and in accordance with the normal seating determination conditions. The normal seating determination unit 104 also makes a determination (second determination) of whether the occupant's posture is in the normal seating state based on the situation information and the normal seating determination model. The normal seating determination unit 104 then determines that the occupant's posture is in the normal seating state when both the first determination and the second determination indicate the normal seating state, and determines that the occupant's posture is not in the normal seating state when either the first determination or the second determination indicates that it is not.
Alternatively, for example, whether the occupant's posture is in the normal seating state may be determined by weighting the seating state determination result of the first determination and the seating state determination result of the second determination.
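One way to realize this weighting is a weighted vote over the two boolean results compared against a threshold. A minimal sketch — the weight values and the threshold are assumptions, since the embodiment leaves the weighting scheme open:

```python
def combine_determinations(first_result: bool, second_result: bool,
                           w_first: float = 0.6, w_second: float = 0.4,
                           threshold: float = 0.5) -> bool:
    """Weighted combination of the condition-based (first) and
    model-based (second) seating state determination results;
    returns the final normal-seating decision."""
    score = w_first * float(first_result) + w_second * float(second_result)
    return score >= threshold
```

Raising `threshold` above `max(w_first, w_second)` makes the vote equivalent to the AND combination described above, in which both determinations must indicate the normal seating state.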
Further, in the first embodiment described above, the physique determination device 100 includes the feature value leveling unit 106, but this is only an example. The physique determination device 100 may be configured without the feature value leveling unit 106.
In this case, the physique determination unit 107 may determine the physique of the occupant based on the physique determination feature calculated by the feature calculation unit 105.
Furthermore, in the operation of the physique determining device 100 described using the flowchart of FIG. 3, the physique determining device 100 can omit the process of step ST6. In step ST7, the physique determination unit 107 may determine the physique of the occupant based on the physique determination feature calculated by the feature calculation unit 105 in step ST5.
Further, in the first embodiment described above, the physique determination device 100 was provided with the posture attention section 108, but this is only an example. The physique determination device 100 may be configured without the posture attention section 108. Further, the physique determination device 100 does not need to be connected to the output device 600.
In this case, in the operation of the physique determining device 100 described using the flowchart of FIG. 3, the physique determining device 100 can omit the process of step ST8.
Further, in the first embodiment described above, the physique determination device 100 is connected to the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500, but this is only an example.
The physique determination device 100 may be connected to only one of the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500, or may be connected to a device other than the seatbelt control device 300, the airbag control device 400, and the left-behind detection device 500 that performs various processes using the physique determination result.
For example, the physique determination device 100 may be connected to a display device mounted on a vehicle. The display device performs display according to the physique determination result output from the physique determination device 100, for example. For example, if an infant is among the occupants, the display device displays an icon indicating that a child is in the vehicle.
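The display behavior can be sketched as a small dispatch on the physique determination result. The class labels and icon identifier are hypothetical — the embodiment gives only the child-on-board icon as an example:

```python
def display_action(physique_class: str) -> str:
    """Map a physique determination result to a display action for the
    in-vehicle display device; labels here are illustrative only."""
    if physique_class in ("infant", "child"):
        return "show_child_on_board_icon"
    return "no_icon"
```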
Further, in the first embodiment described above, it is assumed that there is one imaging device 200, but this is only an example. In the first embodiment, a plurality of imaging devices 200 may be installed in a vehicle. For example, a plurality of imaging devices 200 may be installed so as to be able to image each passenger seated in each seat in the vehicle.
In this case, the physique determination device 100 acquires captured images from the plurality of imaging devices 200 and determines the physique of the occupant.
FIGS. 6A and 6B are diagrams illustrating an example of the hardware configuration of the physique determination device 100 according to the first embodiment.
In the first embodiment, the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are realized by a processing circuit 1001. That is, the physique determination device 100 includes the processing circuit 1001 for performing control to determine the physique of a vehicle occupant based on a captured image of the occupant.
The processing circuit 1001 may be dedicated hardware as shown in FIG. 6A, or may be a processor 1004 that executes a program stored in memory as shown in FIG. 6B.
When the processing circuit 1001 is dedicated hardware, the processing circuit 1001 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
When the processing circuit is the processor 1004, the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are realized by software, firmware, or a combination of software and firmware. The software or firmware is written as a program and stored in the memory 1005. The processor 1004 reads out and executes the program stored in the memory 1005, thereby executing the functions of these units. That is, the physique determination device 100 includes the memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1 to ST8 of FIG. 3 described above. The program stored in the memory 1005 can also be said to cause a computer to execute the processing procedures or methods of these units. Here, the memory 1005 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
Note that some of the functions of the image acquisition unit 101, the skeletal point extraction unit 102, the situation information acquisition unit 103, the face orientation detection unit 1031, the tilt detection unit 1032, the vehicle information acquisition unit 1033, the normal seating determination unit 104, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may be realized by dedicated hardware, and the rest by software or firmware. For example, the function of the image acquisition unit 101 can be realized by the processing circuit 1001 as dedicated hardware, while the functions of the remaining units can be realized by the processor 1004 reading out and executing the program stored in the memory 1005.
The physique determination device 100 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the imaging device 200, the seatbelt control device 300, the airbag control device 400, the left-behind detection device 500, and the output device 600.
In the above first embodiment, the physique determination device 100 is an in-vehicle device mounted on a vehicle, and the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture attention unit 108 are included in the in-vehicle device.
The present invention is not limited to this: some of the skeletal point extraction unit 102, the situation information acquisition unit 103, the normal seating determination unit 104, the feature value calculation unit 105, the feature value leveling unit 106, the physique determination unit 107, and the posture attention unit 108 may be installed in the in-vehicle device of the vehicle, while the others are provided in a server connected to the in-vehicle device via a network, so that the in-vehicle device and the server together constitute a physique determination system.
Alternatively, all of these units may be provided in the server.
As described above, according to the first embodiment, the physique determination device 100 includes: the image acquisition unit 101 that acquires a captured image of a vehicle occupant; the skeletal point extraction unit 102 that extracts skeletal points of the occupant, indicating body parts of the occupant, based on the captured image acquired by the image acquisition unit 101; the normal seating determination unit 104 that determines, based on situation information regarding the state of the vehicle or the occupant, whether the occupant's posture is in the normal seating state; the feature value calculation unit 105 that determines, based on the information regarding the occupant's skeletal points extracted by the skeletal point extraction unit 102 and the seating state determination result of the normal seating determination unit 104 as to whether the occupant's posture is in the normal seating state, whether the extracted skeletal points are normal seating skeletal points extracted from a captured image taken while the occupant was in the normal seating state, and that, when determining that they are, calculates the physique determination feature value based on the information regarding the normal seating skeletal points; and the physique determination unit 107 that determines the occupant's physique based on the physique determination feature value calculated by the feature value calculation unit 105. The physique determination device 100 calculates the physique determination feature value based on a captured image taken when the occupant's posture is in the normal seating state and determines the occupant's physique based on that feature value, so it can prevent a decrease in the accuracy of the occupant's physique determination. That is, even if the occupant's posture collapses, the physique determination device 100 can prevent a decrease in the accuracy of the physique determination for that occupant.
Note that, in the present disclosure, any component of the embodiment may be modified, or any component of the embodiment may be omitted.
The physique determination device according to the present disclosure calculates a physique determination feature value based on a captured image taken when the occupant's posture is in the normal seating state and determines the occupant's physique based on that feature value, and can therefore prevent a decrease in the accuracy of the occupant's physique determination.
100 physique determination device, 101 image acquisition unit, 102 skeletal point extraction unit, 103 situation information acquisition unit, 1031 face orientation detection unit, 1032 tilt detection unit, 1033 vehicle information acquisition unit, 104 normal seating determination unit, 105 feature value calculation unit, 106 feature value leveling unit, 107 physique determination unit, 108 posture attention unit, 200 imaging device, 300 seatbelt control device, 400 airbag control device, 500 left-behind detection device, 600 output device, 1001 processing circuit, 1002 input interface device, 1003 output interface device, 1004 processor, 1005 memory.

Claims (12)

1.  A physique determination device comprising:
     an image acquisition unit that acquires a captured image of an occupant of a vehicle;
     a skeletal point extraction unit that extracts skeletal points of the occupant, indicating body parts of the occupant, based on the captured image acquired by the image acquisition unit;
     a normal seating determination unit that determines whether a posture state of the occupant is a normal seating state based on situation information regarding a state of the vehicle or the occupant;
     a feature value calculation unit that determines, based on information regarding the skeletal points of the occupant extracted by the skeletal point extraction unit and a seating state determination result of the normal seating determination unit as to whether the posture state of the occupant is the normal seating state, whether the skeletal points of the occupant extracted by the skeletal point extraction unit are normal seating skeletal points extracted based on the captured image taken while the occupant was in the normal seating state, and that, when determining that the skeletal points of the occupant extracted by the skeletal point extraction unit are the normal seating skeletal points, calculates a physique determination feature value based on information regarding the normal seating skeletal points; and
     a physique determination unit that determines a physique of the occupant based on the physique determination feature value calculated by the feature value calculation unit.
2.  The physique determination device according to claim 1, wherein
     the situation information is information regarding a face orientation of the occupant detected based on the captured image, and
     the normal seating determination unit determines that the posture state of the occupant is the normal seating state when the face orientation of the occupant is a forward direction.
3.  The physique determination device according to claim 2, further comprising a face orientation detection unit that detects the face orientation of the occupant based on the captured image acquired by the image acquisition unit, wherein
     the normal seating determination unit determines that the posture state of the occupant is the normal seating state when the face orientation of the occupant detected by the face orientation detection unit is the forward direction.
  4.  The physique determination device according to any one of claims 1 to 3, wherein
     the situation information is information regarding the inclination of a skeletal line connecting the skeletal points of the occupant,
     the device further comprising an inclination detection unit that detects the inclination of the skeletal line of the occupant based on the information regarding the skeletal points of the occupant extracted by the skeletal point extraction unit,
     wherein the normal seating determination unit determines that the posture of the occupant is in the normal seating state when the inclination of the skeletal line of the occupant detected by the inclination detection unit is within an inclination determination range.
  5.  The physique determination device according to claim 4, wherein the skeletal line of the occupant is a line connecting the skeletal points indicating both shoulders of the occupant.
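The inclination check of claims 4 and 5 — measuring the tilt of the line connecting the two shoulder skeletal points and testing it against an inclination determination range — can be sketched as below. The ±10° threshold is an illustrative assumption; the patent leaves the range unspecified.

```python
import math


def is_normal_seating_by_inclination(left_shoulder: tuple[float, float],
                                     right_shoulder: tuple[float, float],
                                     max_abs_deg: float = 10.0) -> bool:
    """True when the shoulder line (the skeletal line of claim 5) is within
    +/- max_abs_deg of horizontal, i.e. inside the inclination
    determination range of claim 4."""
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    inclination_deg = math.degrees(math.atan2(dy, dx))
    return abs(inclination_deg) <= max_abs_deg
```

An occupant leaning sideways tilts the shoulder line, so frames taken in that posture would be excluded from physique determination.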
  6.  The physique determination device according to any one of claims 1 to 5, wherein
     the situation information is vehicle information regarding the situation of the vehicle, including a steering angle of the vehicle's steering wheel, a vehicle speed, an open/closed state of a door of the vehicle, an ignition state of the vehicle, or an operation state of an in-vehicle device,
     the device further comprising a vehicle information acquisition unit that acquires the vehicle information,
     wherein the normal seating determination unit determines, based on the vehicle information acquired by the vehicle information acquisition unit, that the posture of the occupant is in the normal seating state when the steering angle of the vehicle is within a steering angle determination range, when the vehicle speed exceeds a vehicle speed determination range, when the door of the vehicle is not open, when the elapsed time after the ignition is turned on is within a start determination time, or when the in-vehicle device is not being operated.
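The vehicle-information conditions of claim 6 can be sketched as an OR over the listed signals. The field names and threshold values here are assumptions for illustration only; the claim does not fix them.

```python
from dataclasses import dataclass


@dataclass
class VehicleInfo:
    steering_angle_deg: float
    speed_kmh: float
    door_open: bool
    seconds_since_ignition_on: float
    in_vehicle_device_operated: bool


def is_normal_seating(info: VehicleInfo,
                      steering_range_deg: float = 15.0,
                      speed_threshold_kmh: float = 30.0,
                      start_window_s: float = 10.0) -> bool:
    """Claim-6 style determination: any one of the listed vehicle
    conditions suffices to treat the posture as normal seating."""
    return (abs(info.steering_angle_deg) <= steering_range_deg
            or info.speed_kmh > speed_threshold_kmh
            or not info.door_open
            or info.seconds_since_ignition_on <= start_window_s
            or not info.in_vehicle_device_operated)
```

Intuitively, each condition is a proxy for "the driver is likely facing forward with both hands free of distracting tasks".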
  7.  The physique determination device according to any one of claims 1 to 6, wherein
     the situation information is the information regarding the skeletal points of the occupant extracted by the skeletal point extraction unit, and
     the normal seating determination unit determines that the occupant is not in the normal seating state when the position of the skeletal point indicating an elbow of the occupant extracted by the skeletal point extraction unit is within an elbow position determination range.
  8.  The physique determination device according to claim 1, wherein the normal seating determination unit determines whether or not the posture of the occupant is in the normal seating state based on the situation information and a machine learning model that takes the situation information as input and outputs information indicating whether or not the posture of the occupant is in the normal seating state.
  9.  The physique determination device according to any one of claims 1 to 8, further comprising
     a feature amount leveling unit that levels a plurality of the feature amounts for physique determination calculated by the feature amount calculation unit going back over a leveling period,
     wherein the physique determination unit determines the physique of the occupant based on the leveled feature amount for physique determination obtained by the feature amount leveling unit.
  10.  The physique determination device according to claim 9, wherein the feature amount leveling unit sets the average value, the median value, or a value corresponding to a set percentile of the plurality of feature amounts for physique determination as the leveled feature amount for physique determination.
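The leveling of claims 9 and 10 — reducing a window of feature amounts collected over the leveling period to one value by mean, median, or a set percentile — can be sketched with the standard library. The nearest-rank percentile rule and the default arguments are illustrative assumptions.

```python
import statistics


def level_features(features: list[float], method: str = "median",
                   percentile: float = 50.0) -> float:
    """Level a window of physique-determination feature amounts
    (claims 9-10) into a single leveled feature amount."""
    if method == "mean":
        return statistics.mean(features)
    if method == "median":
        return statistics.median(features)
    if method == "percentile":
        # Nearest-rank percentile over the sorted window (one possible
        # reading of "a value corresponding to a set percentile").
        ordered = sorted(features)
        rank = max(0, min(len(ordered) - 1,
                          round(percentile / 100.0 * (len(ordered) - 1))))
        return ordered[rank]
    raise ValueError(f"unknown leveling method: {method}")
```

Leveling suppresses frame-to-frame noise in the extracted skeletal points, so a single mis-detected frame does not flip the physique class.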
  11.  The physique determination device according to any one of claims 1 to 10, further comprising a posture warning unit that outputs posture warning information prompting the occupant to assume the normal seating posture when the state in which the posture of the occupant is determined by the normal seating determination unit not to be the normal seating state has continued for a posture abnormality determination time.
  12.  A physique determination method comprising:
     a step in which an image acquisition unit acquires a captured image in which an occupant of a vehicle is captured;
     a step in which a skeletal point extraction unit extracts, based on the captured image acquired by the image acquisition unit, skeletal points of the occupant indicating body parts of the occupant;
     a step in which a normal seating determination unit determines, based on situation information regarding the situation of the vehicle or the occupant, whether or not the posture of the occupant is in a normal seating state;
     a step in which a feature amount calculation unit determines, based on the information regarding the skeletal points of the occupant extracted by the skeletal point extraction unit and the seating state determination result by the normal seating determination unit as to whether or not the posture of the occupant is in the normal seating state, whether or not the skeletal points of the occupant extracted by the skeletal point extraction unit are normal seating skeletal points extracted from a captured image of the occupant in the normal seating state, and calculates, when the skeletal points are determined to be the normal seating skeletal points, a feature amount for physique determination based on information regarding the normal seating skeletal points; and
     a step in which a physique determination unit determines the physique of the occupant based on the feature amount for physique determination calculated by the feature amount calculation unit.
PCT/JP2022/030703 2022-08-12 2022-08-12 Physique determination device and physique determination method WO2024034109A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/030703 WO2024034109A1 (en) 2022-08-12 2022-08-12 Physique determination device and physique determination method

Publications (1)

Publication Number Publication Date
WO2024034109A1 true WO2024034109A1 (en) 2024-02-15

Family

ID=89851231

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/030703 WO2024034109A1 (en) 2022-08-12 2022-08-12 Physique determination device and physique determination method

Country Status (1)

Country Link
WO (1) WO2024034109A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10264751A (en) * 1997-03-21 1998-10-06 Autoliv Japan Kk Airbag device for passenger seat
JPH10315906A (en) * 1997-05-21 1998-12-02 Zexel Corp Occupant recognition method, occupant recognition device, air bag control method and air bag device
JP2008002838A (en) * 2006-06-20 2008-01-10 Takata Corp System for detecting vehicle occupant, actuator control system, and vehicle
JP2013112220A (en) * 2011-11-29 2013-06-10 Toyota Motor Corp Vehicle occupant restraining device
JP2021066276A (en) * 2019-10-18 2021-04-30 株式会社デンソー Device for determining physique of occupant
JP2021081836A (en) * 2019-11-15 2021-05-27 アイシン精機株式会社 Physical constitution estimation device and posture estimation device

Similar Documents

Publication Publication Date Title
US10369926B2 (en) Driver state sensing system, driver state sensing method, and vehicle including the same
US9822576B2 (en) Method for operating an activatable locking device for a door and/or a window, securing device for a vehicle, vehicle
CN106956619A (en) The system and method monitored for seat
JP6187155B2 (en) Gaze target estimation device
JP2021037216A (en) Eye closing determination device
JP2019189101A (en) Occupant information determination device
CN111601736B (en) Passenger detection device, passenger detection system, and passenger detection method
JP7134364B2 (en) physique determination device and physique determination method
JP5498183B2 (en) Behavior detection device
JP4858516B2 (en) Face orientation detection device
WO2024034109A1 (en) Physique determination device and physique determination method
JP2018117740A (en) Biological information detection apparatus
JP7267467B2 (en) ATTENTION DIRECTION DETERMINATION DEVICE AND ATTENTION DIRECTION DETERMINATION METHOD
JP7374373B2 (en) Physique determination device and physique determination method
US20240233405A9 (en) Occupant detection device, occupant detection system, and occupant detection method
CN115050089A (en) Correction device, correction method, drive recorder, vehicle, and storage medium
JP7320188B2 (en) Driver Abnormal Posture Detector
WO2023223442A1 (en) Physique determination device and physique determination method
JP7479575B2 (en) Physique estimation device and physique estimation method
CN112829645A (en) Self-adaptive adjustment method and system for vehicle
JP2020052869A (en) Identification device, identification method, and identification program
JP2005143895A (en) Device for judging psychological state of driver
WO2023084738A1 (en) Physique determination device and physique determination method
WO2024079779A1 (en) Passenger state determination device, passenger state determination system, passenger state determination method and program
JP7275409B2 (en) Occupant state determination device and occupant state determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22955024

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2024540207

Country of ref document: JP

Kind code of ref document: A