WO2022209286A1 - Determination device - Google Patents

Determination device

Info

Publication number
WO2022209286A1
Authority
WO
WIPO (PCT)
Prior art keywords
determination
ground
foot
ankle
contact
Prior art date
Application number
PCT/JP2022/004470
Other languages
French (fr)
Japanese (ja)
Inventor
宏紀 寺島
克幸 永井
Original Assignee
Necソリューションイノベータ株式会社 (NEC Solution Innovators, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necソリューションイノベータ株式会社 (NEC Solution Innovators, Ltd.)
Priority to JP2023510575A priority Critical patent/JPWO2022209286A1/ja
Priority to CN202280025925.4A priority patent/CN117119958A/en
Publication of WO2022209286A1 publication Critical patent/WO2022209286A1/en

Links

Images

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb

Definitions

  • the present invention relates to a determination device, a determination method, and a recording medium.
  • Patent Document 1, for example, describes human gait analysis.
  • Patent Document 1 describes a gait analysis device having a data acquisition unit that acquires two types of image data from a depth sensor, a skeleton information creation unit that creates skeleton information based on the image data acquired by the data acquisition unit, a correction processing unit that corrects the skeleton information created by the skeleton information creation unit, and an analysis processing unit that analyzes the user's gait using the corrected skeleton information.
  • in such a technique, the ground contact timing is set to the frame at which the displacement of the Y value of the thoracolumbar region, treated as a three-dimensional position, changes from a negative value to a positive value.
  • in a method that uses a pressure gauge, on the other hand, the measurable section is limited to the size of the pressure gauge. It is therefore impossible to grasp the contact timing without limiting the measurable section.
  • an object of the present invention is to provide a determination device, a determination method, and a recording medium that solve the problem that it is difficult to grasp the timing at which a foot touches the ground based on image data.
  • the determination device, which is one aspect of the present disclosure, has a determination unit that acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the acquired position of the part.
  • in the determination method, which is another aspect of the present disclosure, the information processing device acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the acquired position of the part.
  • a recording medium that is another aspect of the present disclosure is a computer-readable recording medium storing a program for causing an information processing device to realize processing that acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the acquired position of the part.
  • FIG. 1 is a diagram showing a configuration example of a determination system according to a first embodiment of the present disclosure.
  • FIG. 2 is a diagram for explaining an example of image data acquired by the determination system.
  • FIG. 3 is a block diagram showing a configuration example of the determination device shown in FIG. 1.
  • FIG. 4 is a diagram showing an example of skeleton information.
  • FIG. 5 is a diagram showing an example of ground contact determination information.
  • FIG. 6 is a diagram for explaining an example of determination by the ground contact determination unit.
  • FIG. 7 is a diagram for explaining an example of determination by the ground contact timing determination unit.
  • FIG. 8 is a flow chart showing an example of processing by the ground contact determination unit.
  • FIG. 9 is a flow chart showing an example of processing by the ground contact timing determination unit.
  • FIG. 10 is a diagram showing a hardware configuration example of a determination device according to a second embodiment of the present disclosure.
  • FIG. 11 is a block diagram showing a configuration example of the determination device.
  • FIG. 1 is a diagram showing a configuration example of a determination system 100.
  • FIG. 2 is a diagram for explaining an example of image data acquired by the determination system 100.
  • FIG. 3 is a block diagram showing a configuration example of the determination device 300.
  • FIG. 4 is a diagram showing an example of the skeleton information 323.
  • FIG. 5 is a diagram showing an example of the contact determination information 324.
  • FIG. 6 is a diagram for explaining an example of determination by the contact determination unit 333.
  • FIG. 7 is a diagram for explaining an example of determination by the touchdown timing determining section 334.
  • FIG. 8 is a flow chart showing an example of processing by the contact determination unit 333.
  • FIG. 9 is a flow chart showing an example of processing by the contact timing determination section 334.
  • a determination system 100 that determines the timing at which the foot of a walking person touches the ground based on image data acquired using an imaging device such as the smartphone 200 will be described.
  • the determination system 100 acquires time-series image data showing how a person walks from the back side to the front side of the screen.
  • the determination system 100 recognizes the position of the skeleton from each of the plurality of image data. Then, the determination system 100 determines the timing at which the foot touches the ground based on the recognized change in position.
  • the landing timing determined by the determination system 100 can be used for gait analysis, for example, to compute indices such as the angle of each joint at the moment the foot touches the floor, whether the toe is raised from the floor, pitch, stride, and step cycle.
  • the touchdown timing determined by the determination system 100 may also be used for purposes other than the above examples.
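As a hedged illustration of such downstream use, the sketch below derives step times and cadence from a sequence of foot-contact timestamps. The function name, the input format (timestamps in seconds), and the steps-per-minute convention are assumptions made for this example and are not taken from the publication.

```python
# Hypothetical sketch: deriving simple gait indices (step time, cadence)
# from a sequence of touchdown timestamps such as those produced by the
# determination system. Names and conventions are illustrative.

def gait_indices(contact_times_s):
    """contact_times_s: sorted timestamps (seconds) of successive footfalls."""
    if len(contact_times_s) < 2:
        return {"step_times": [], "cadence_spm": 0.0}
    # time between consecutive ground contacts (one step each)
    step_times = [b - a for a, b in zip(contact_times_s, contact_times_s[1:])]
    mean_step = sum(step_times) / len(step_times)
    return {
        "step_times": step_times,
        "cadence_spm": 60.0 / mean_step if mean_step > 0 else 0.0,  # steps/minute
    }
```

For instance, contacts at 0.0 s, 0.5 s, and 1.0 s yield two step times of 0.5 s and a cadence of 120 steps per minute.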
  • FIG. 1 shows a configuration example of the determination system 100.
  • the determination system 100 has, for example, a smart phone 200 and a determination device 300 .
  • the smartphone 200 and the determination device 300 are connected, for example, wirelessly or by wire so that they can communicate with each other.
  • the smart phone 200 functions as an imaging device that captures how a person walks.
  • the smartphone 200 may be a smartphone having general functions such as a camera function for acquiring image data, a touch panel for displaying a screen, and various sensors such as a GPS sensor and an acceleration sensor.
  • the smartphone 200 shoots a person walking in the front-back direction, such as from the back of the screen to the front.
  • smartphone 200 acquires time-series image data showing how a person walks in the front-to-back direction.
  • the smartphone 200 also transmits the acquired image data to the determination device 300 .
  • the smartphone 200 may associate information indicating the date and time when the image data was acquired with the image data, and transmit the associated data to the determination device 300 .
  • the determination device 300 is an information processing device that determines the timing at which the foot of a walking person touches the ground based on the image data acquired by the smartphone 200 .
  • the determination device 300 is a server device or the like.
  • the determination device 300 may be a single information processing device, or may be implemented on a cloud, for example.
  • FIG. 3 shows a configuration example of the determination device 300.
  • the determination device 300 has, for example, a communication I/F section 310, a storage section 320, and an arithmetic processing section 330 as main components.
  • the communication I/F unit 310 consists of a data communication circuit, and performs data communication with the smartphone 200, external devices, and the like that are connected via a communication line.
  • the storage unit 320 is a storage device such as a hard disk or memory.
  • the storage unit 320 stores processing information and programs 326 necessary for various processes in the arithmetic processing unit 330 .
  • the program 326 realizes various processing units by being read and executed by the arithmetic processing unit 330 .
  • the program 326 is read in advance from an external device or recording medium via a data input/output function such as the communication I/F unit 310 and stored in the storage unit 320 .
  • Main information stored in the storage unit 320 includes, for example, a learned model 321, image information 322, skeleton information 323, contact determination information 324, contact timing information 325, and the like.
  • the trained model 321 is a trained model used when the skeleton recognition unit 332 performs skeleton recognition.
  • the trained model 321 is, for example, generated in advance on an external device by machine learning that uses teacher data such as image data annotated with skeletal coordinates, and is then acquired from the external device or the like and stored in the storage unit 320. The trained model 321 may be generated using known methods, and may be updated by re-learning processing that uses additional teacher data.
  • the image information 322 includes time-series image data acquired by the camera of the smartphone 200 .
  • the image information 322 is generated and updated when the image acquisition unit 331 acquires image data.
  • identification information for identifying image data is associated with image data.
  • the image information 322 may include information other than the above examples, such as information indicating the date and time when the smartphone 200 acquired the image data.
  • the skeleton information 323 includes information indicating the coordinates (that is, the positions) of each part of the person recognized by the skeleton recognition unit 332.
  • the skeleton information 323 includes, for example, information indicating the coordinates of each part of each piece of time-series image data.
  • the skeleton information 323 is generated/updated as a result of processing by the skeleton recognition unit 332 .
  • FIG. 4 shows an example of the skeleton information 323.
  • identification information is associated with position information of each part.
  • the identification information is, for example, information indicating image data used when recognizing skeleton information.
  • the identification information may correspond to identification information for identifying image data in the image information 322 .
  • the position information of each part includes information indicating the coordinates of each part in the image data, such as the position of the pelvis.
  • the part included in the position information of each part corresponds to the learned model 321 .
  • in FIG. 4, the pelvis, the center of the spine, ..., the right knee, the left knee, ..., the right ankle, the left ankle, ... are exemplified as parts.
  • the position information of each part can include about 30 parts, such as the right shoulder, ..., the left elbow, ....
  • the position information of each part includes at least information indicating the coordinates of the right ankle and the left ankle.
  • the parts included in the position information of each part may be other than those illustrated in FIG. 4 and the like.
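A minimal sketch of the skeleton information described above, assuming a plain mapping from each frame's identification information to per-part image coordinates. The part names, coordinate values, and helper function are illustrative; the publication only requires that at least the right and left ankle coordinates be present.

```python
# Illustrative layout of the skeleton information 323: each time-series
# frame is keyed by its identification information and maps part names
# to (x, y) coordinates in the image data. All values are hypothetical.

skeleton_info = {
    "124": {"pelvis": (210, 130), "right_ankle": (198, 402), "left_ankle": (225, 380)},
    "125": {"pelvis": (211, 131), "right_ankle": (198, 403), "left_ankle": (226, 388)},
}

def ankle_y(frame_id, side, info=skeleton_info):
    """Return the vertical (Y) image coordinate of the given ankle."""
    return info[frame_id][f"{side}_ankle"][1]
```

A consumer such as the ground contact determination described later would read only the ankle Y coordinates from each frame record.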
  • the ground contact determination information 324 includes information indicating the determination result of the ground contact determination unit 333 .
  • the ground contact determination information 324 is generated and updated as a result of determination by the ground contact determination section 333 .
  • FIG. 5 shows an example of the contact determination information 324.
  • the identification information is, for example, information indicating image data for which grounding determination has been performed.
  • the identification information may correspond to identification information for identifying image data in the image information 322 .
  • the ground contact determination result indicates the result of the determination by the ground contact determination unit 333.
  • the ground contact determination result includes, for example, "left foot", indicating a determination that the left foot is on the ground; "right foot", indicating a determination that the right foot is on the ground; and "-", indicating a determination to exclude the frame from the ground contact determination.
  • the contact determination result may indicate other than the above examples.
  • the touchdown timing information 325 includes information indicating the determination result of the touchdown timing determination unit 334 .
  • the touchdown timing information 325 is generated and updated as a result of determination by the touchdown timing determination section 334 .
  • the ground contact timing information 325 includes information indicating the timing at which the foot touches the ground.
  • the information indicating the timing at which the foot touched the ground is, for example, at least one of identification information indicating the image data for which ground contact was determined, information indicating the acquisition date and time of that image data, and the like.
  • the contact timing information 325 may be, for example, information in which the foot that touched the ground is associated with the timing at which that foot touched the ground.
  • the arithmetic processing unit 330 has an arithmetic device such as a CPU and its peripheral circuits.
  • the arithmetic processing unit 330 reads the program 326 from the storage unit 320 and executes it, so that the hardware and the program 326 work together to realize various processing units.
  • Main processing units realized by the arithmetic processing unit 330 include, for example, an image acquisition unit 331, a skeleton recognition unit 332, a contact determination unit 333, a contact timing determination unit 334, and an output unit 335.
  • the image acquisition unit 331 acquires image data acquired by the smartphone 200 from the smartphone 200 via the communication I/F unit 310 .
  • the image acquisition unit 331 acquires time-series image data from the smartphone 200 .
  • the image acquisition unit 331 also stores the acquired image data in the storage unit 320 as the image information 322 .
  • the skeleton recognition unit 332 uses the trained model 321 to recognize the skeleton of the person whose walking posture is to be measured in the image data. For example, the skeleton recognition unit 332 recognizes the skeleton of each piece of time-series image data.
  • the skeleton recognition unit 332 may be configured to recognize the skeleton of image data extracted based on a predetermined condition from time-series image data. For example, the skeleton recognition unit 332 recognizes each part such as the upper part of the spine, the right shoulder, the left shoulder, the right elbow, the left elbow, the right wrist, the left wrist, the right hand, the left hand, and so on.
  • the skeleton recognition unit 332 also calculates the coordinates of the recognized parts in the image data. Then, the skeleton recognition unit 332 associates the recognition and calculation results with identification information identifying the image data and stores them in the storage unit 320 as the skeleton information 323.
  • the parts recognized by the skeleton recognition unit 332 correspond to the trained model 321 (teaching data used when learning the trained model 321). Therefore, the skeleton recognition unit 332 may recognize parts other than those exemplified above according to the trained model 321 .
  • the grounding determination unit 333 determines which foot is in contact with the ground based on the skeleton of the person indicated by the skeleton information 323. For example, the contact determination unit 333 determines which of the walking person's right and left feet is on the ground based on the degree of change in the coordinates of the right and left ankles included in the skeleton information 323. Specifically, the contact determination unit 333 refers to the skeleton information 323 to obtain information indicating the coordinates of each ankle, and when the degree of change in an ankle's coordinates satisfies a predetermined condition, determines that the foot having that ankle is in contact with the ground. Further, the contact determination unit 333 stores information indicating the determination result in the storage unit 320 as the contact determination information 324.
  • FIG. 6 shows an example of changes in the Y coordinates (coordinates in the vertical direction of the image data) of the right and left ankles.
  • the X axis indicates values corresponding to time
  • the Y axis indicates vertical coordinates (Y coordinates) in image data.
  • in FIG. 6, the larger the Y-axis value, the lower the right or left ankle appears in the image data (that is, the closer the right or left ankle is to the front of the screen).
  • the ground contact determination unit 333 determines which foot is grounded based on the degree of change in the Y coordinates of the right ankle and the left ankle. In other words, the ground contact determination unit 333 determines that the foot having a given ankle is in contact with the ground when the Y coordinate of that ankle can be evaluated as constant. Specifically, for example, the contact determination unit 333 compares the Y coordinate of the ankle in the image data to be determined with the Y coordinate of the ankle in the image data preceding the determination target in the time series.
  • when the displacement from the preceding image data falls within a predetermined range, the contact determination unit 333 regards the Y coordinate as constant and determines that the foot having the ankle with the constant Y coordinate is in contact with the ground. Note that the contact determination unit 333 may judge that the Y coordinate is constant by a method other than the above example. Further, the range of displacement within which the ground contact determination unit 333 regards the Y coordinate as constant may be set arbitrarily.
  • the contact determination unit 333 may be configured to exclude a frame from the contact determination when it determines, based on the Y coordinate of the ankle, that the leg other than the leg judged to be grounded has started to move. For example, in general walking, the left foot begins to move after the right foot touches the ground; after the Y coordinate of the right ankle becomes constant, the contact determination unit 333 may therefore decide to exclude frames from the contact determination from the timing at which the Y coordinate of the left ankle starts to increase.
  • for example, when the displacement of the Y coordinate of the left ankle from the preceding image data in the time series exceeds the displacement of the Y coordinate of the right ankle from that image data, the contact determination unit 333 can determine that the left foot has started to move and exclude the frame from the ground contact determination.
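The ground contact rules described above (a foot is judged grounded while its ankle's Y coordinate can be regarded as constant; frames are excluded once the other leg starts to move; and the ankle with the larger Y coordinate may be preferred) can be sketched as a small state machine over per-frame ankle Y coordinates. This is a simplified reading, not the publication's implementation: the tolerance `EPS`, the input layout, and the rule for re-adopting a grounded foot after exclusion are assumptions, and the publication explicitly leaves the "constant" displacement range arbitrary.

```python
# Sketch of the per-frame ground contact determination: label each frame
# "right foot", "left foot", or "-" (excluded) from the ankle Y series.

EPS = 2.0  # assumed pixel tolerance for a "constant" Y coordinate

def classify_sequence(frames):
    """frames: list of {'right': y, 'left': y} ankle Y coordinates.
    Returns one label per frame (the first frame has no predecessor)."""
    labels = ["-"]
    grounded = None   # foot currently judged to be on the ground
    swinging = None   # foot whose motion caused an exclusion
    other = {"right": "left", "left": "right"}
    for prev, curr in zip(frames, frames[1:]):
        disp = {s: abs(curr[s] - prev[s]) for s in ("right", "left")}
        # exclusion: the free leg has started to move while one foot is grounded
        if grounded and disp[other[grounded]] > max(disp[grounded], EPS):
            swinging, grounded = other[grounded], None
        # (re)adopt a grounded foot once no leg is clearly swinging,
        # preferring the ankle with the larger Y (nearer the screen front)
        if grounded is None and (swinging is None or disp[swinging] <= EPS):
            candidates = [s for s in ("right", "left") if disp[s] <= EPS]
            if candidates:
                grounded = max(candidates, key=lambda s: curr[s])
                swinging = None
        labels.append(grounded + " foot" if grounded else "-")
    return labels
```

On a synthetic sequence where the left ankle stays put while the right ankle swings forward and lands, this yields a FIG. 7-like pattern: left foot grounded, excluded frames during the swing, then right foot grounded.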
  • the contact determination unit 333 may be configured to determine whether or not the foot is contacting the ground by focusing on only the ankle having the larger Y coordinate between the right ankle and the left ankle.
  • the contact determination unit 333 may be configured to refer to the coordinates of the knee in addition to the coordinates of the ankle when determining the contact.
  • the contact determination unit 333 may be configured to make a decision to exclude from the contact decision when the coordinate change of the knee satisfies a predetermined condition.
  • for the leg whose ground contact is being determined, the ground contact determination unit 333 may be configured to exclude the frame from the ground contact determination when the displacement of the knee's Y coordinate from the preceding image data in the time series is 0 or less.
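The knee-based exclusion above reduces to a one-line check per frame. The sketch below is an illustrative reading of that condition (it corresponds to step S101 in the later flow chart), with hypothetical argument names.

```python
# Illustrative check of the knee condition: for the leg under
# determination, the frame is excluded from the ground contact
# determination when the knee's Y displacement from the previous image
# data is 0 or less (the knee is not descending in the image).

def knee_allows_contact(prev_knee_y, curr_knee_y):
    """Return True only when the knee's Y displacement exceeds 0."""
    return (curr_knee_y - prev_knee_y) > 0
```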
  • the contact determination unit 333 makes contact determination based on the degree of change in the Y coordinate of the ankle.
  • the contact determination unit 333 may be configured to perform contact determination based on the degree of change in the skeleton other than the above example, such as using the Y coordinate of the toe portion.
  • the ground contact timing determination unit 334 determines the timing at which the foot touches the ground based on the ground contact state of the foot determined based on the skeleton of the person indicated by the skeleton information 323 . For example, the contact timing determination unit 334 determines the timing at which the foot contacts the ground based on the change in the contacting foot indicated by the contact determination information 324 . Then, the contact timing determination section 334 stores the decision result as the contact timing information 325 in the storage section 320 .
  • the ground contact timing determination unit 334 determines that the foot has touched the ground at the timing when the grounded foot indicated by the ground contact determination result is switched to another foot. For example, referring to FIG. 7, the "left foot” is grounded in the identification information "124" and “125", and is excluded from the grounding determination in the identification information "126" and "127". After that, it is determined that the "right foot” is on the ground with the identification information "128". That is, in the case of FIG. 7, the foot on the ground is switched from the left foot to the right foot at the timing of the identification information "128". Therefore, the ground contact timing determination unit 334 determines that the right foot has grounded at the timing of the identification information "128".
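The switch rule illustrated by FIG. 7 can be sketched as a scan over the time-series contact determination results, skipping excluded frames; the pair-list input format is an assumption made for this example.

```python
# Sketch of the touchdown-timing rule: the foot is judged to have touched
# the ground at each frame where the grounded foot indicated by the
# contact determination result switches to the other foot. Frames
# excluded from the determination ("-") are skipped.

def touchdown_timings(results):
    """results: list of (identification_info, contact_result) pairs."""
    timings = []
    last_foot = None
    for frame_id, result in results:
        if result == "-":
            continue  # excluded from the ground contact determination
        if last_foot is not None and result != last_foot:
            timings.append(frame_id)  # the grounded foot changed at this frame
        last_foot = result
    return timings
```

Applied to the FIG. 7 sequence ("left foot" at identification information "124" and "125", excluded at "126" and "127", "right foot" at "128"), this reports a touchdown at "128".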
  • the output unit 335 outputs contact timing information 325, contact determination information 324, and the like.
  • the output unit 335 can output at least one of the above information to an external device, the smart phone 200, or the like.
  • the contact determination unit 333 compares the Y coordinate of the knee in the image data to be determined with the Y coordinate of the knee in the preceding image data in the time series (step S101). If the displacement of the knee's Y coordinate from the previous image data exceeds 0 (step S101, Yes), the contact determination unit 333 checks whether or not the displacement of the ankle satisfies the condition (step S102).
  • if the displacement of the ankle satisfies the condition (step S102, Yes), the ground contact determination unit 333 determines that the foot satisfying the condition is in contact with the ground (step S103). On the other hand, if the displacement of the knee's Y coordinate is 0 or less (step S101, No), or if the displacement of the ankle does not satisfy the condition (step S102, No), the ground contact determination unit 333 decides that the foot is not in contact with the ground.
  • the above is an operation example of the contact determination unit 333 .
  • for example, when checking whether the displacement of the ankle satisfies the condition, the contact determination unit 333 can judge that the condition is satisfied when it is not determined that the leg other than the leg judged to be grounded has started to move, or when the coordinate change of the knee satisfies the condition. Further, the contact determination unit 333 may be configured to identify the ankle of interest based on the Y coordinate of the ankle and to perform the above check only on the identified ankle.
  • the contact timing determination unit 334 refers to the contact determination information 324 and checks whether or not the grounded foot indicated by the contact determination result has changed (step S201).
  • if the grounded foot indicated by the determination result has changed (step S201, Yes), the contact timing determination unit 334 determines that the timing of the change is the contact timing (step S202). On the other hand, if the grounded foot indicated by the determination result has not changed (step S201, No), the ground contact timing determination unit 334 does not make the above determination.
  • the above is a processing example of the contact timing determination unit 334 .
  • the determination device 300 has the skeleton recognition section 332 , the contact determination section 333 and the contact timing determination section 334 .
  • the contact timing determination unit 334 can determine the timing at which the foot touches the ground based on the result of the grounded-foot determination made by the contact determination unit 333 using the coordinates of the skeleton recognized by the skeleton recognition unit 332. That is, with the above configuration, the timing at which the foot touches the ground can be grasped based on the image data.
  • the determination device 300 determines the contact timing as a result of focusing on the part near the foot that is actually in contact with the ground. Therefore, the determination device 300 can perform determination closer to the actual condition than the method using the displacement of the thoracolumbar region.
  • determination device 300 may be configured to determine the contact timing based on the result of skeleton recognition by an external device that recognizes the skeleton. When configured in this manner, determination device 300 does not need to have the function of skeleton recognition section 332 .
  • next, a second embodiment of the present disclosure will be described. The second embodiment outlines the configuration of the determination device 400, which is an information processing device.
  • FIG. 10 shows a hardware configuration example of the determination device 400 .
  • the determination device 400 has the following hardware configuration as an example.
  • CPU (Central Processing Unit) 401 (arithmetic unit)
  • ROM (Read Only Memory) 402
  • RAM (Random Access Memory) 403
  • program group 404 loaded into the RAM 403
  • storage device 405 that stores the program group 404
  • drive device 406 that reads from and writes to a recording medium 410 outside the information processing device
  • communication interface 407 that connects to a communication network 411 outside the information processing device
  • input/output interface 408 that inputs and outputs data
  • bus 409 that connects the components
  • the determination device 400 can realize the function of the determination unit 421 shown in FIG. 11 by the CPU 401 acquiring and executing the program group 404.
  • the program group 404 is stored in the storage device 405 or the ROM 402 in advance, for example, and is loaded into the RAM 403 or the like and executed by the CPU 401 as necessary.
  • the program group 404 may be supplied to the CPU 401 via the communication network 411 or stored in the recording medium 410 in advance, and the drive device 406 may read the program and supply it to the CPU 401 .
  • the hardware configuration of the determination device 400 is not limited to the case described above.
  • the determination device 400 may be composed of part of the configuration described above, such as not having the drive device 406 .
  • the determination unit 421 acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the acquired position of the part.
  • the determination device 400 has the determination unit 421 .
  • the determination unit 421 can determine the timing at which the foot touches the ground based on the degree of change in the position of the part. This makes it possible to grasp the timing at which the foot touches the ground based on the image data.
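As a self-contained sketch of such a determination unit, the class below reports touchdown timings from the time series of one part's vertical position, judging a touchdown where the position transitions from changing to approximately constant. This deliberately collapses the two-stage logic of the first embodiment into a single simplified criterion, and the tolerance `eps` is an assumed parameter, not a value from the publication.

```python
# Minimal sketch of a determination unit in the spirit of the second
# embodiment: given per-frame positions of a predetermined part (here,
# one ankle's Y coordinate), report frame indices where the degree of
# change indicates a touchdown (motion followed by near-constancy).

class DeterminationUnit:
    def __init__(self, eps=2.0):
        self.eps = eps  # assumed tolerance for a "constant" coordinate

    def determine(self, part_y):
        """Return frame indices where the part goes from moving to constant."""
        timings = []
        moving = False
        for i in range(1, len(part_y)):
            if abs(part_y[i] - part_y[i - 1]) > self.eps:
                moving = True
            elif moving:  # was moving, now constant: judged as a touchdown
                timings.append(i)
                moving = False
        return timings
```

For a swing-then-land ankle trace, the unit reports the index of the first frame at which the coordinate settles.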
  • An information processing device such as the determination device 400 described above can be realized by installing a predetermined program in the information processing device.
  • a program according to another aspect of the present invention is a program for causing an information processing device to realize processing that acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the acquired position of the part.
  • the information processing device acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the position of the part.
  • (Appendix 1) A determination device having a determination unit that acquires, from each piece of time-series image data, information indicating the position where a predetermined part exists, and determines the timing at which the foot touches the ground based on the degree of change in the acquired position of the part.
  • (Appendix 2) The determination device according to appendix 1, further having a judgment unit that judges which foot is in contact with the ground based on the degree of change in the position where the part exists, wherein the determination unit determines the timing at which the foot touches the ground based on the judgment result of the judgment unit.
  • (Appendix 3) The determination device according to appendix 2, wherein the determination unit determines that the foot has touched the ground at the timing at which, based on the judgment result of the judgment unit, the grounded foot is judged to have changed.
  • (Appendix 4) The determination device according to appendix 2 or appendix 3, wherein the information indicating the position where the part exists includes information indicating the position where the ankle exists, obtained by recognizing the skeleton of the person in the image data, and the judgment unit judges which foot is in contact with the ground based on how the position of the ankle changes.
  • The determination device according to any one of appendices 4 to 7, wherein the information indicating the position where the part exists includes information indicating the position where the knee exists, obtained by recognizing the skeleton of the person in the image data, and the judgment unit judges which foot is in contact with the ground based on the degree of change in the position of the ankle and the degree of change in the position of the knee.
  • A determination method in which an information processing device acquires information, obtained from each piece of time-series image data, indicating the position where a predetermined body part exists, and determines the timing at which a foot touches the ground based on how the acquired position of the part changes.
  • The programs described in each of the above embodiments and appendices are stored in a storage device or recorded on a computer-readable recording medium.
  • The recording medium is, for example, a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.


Abstract

A determination device 400 has a determination unit 421 that acquires information, obtained from each piece of time-series image data, indicating the position where a predetermined body part exists, and determines the timing at which a foot touches the ground on the basis of how the acquired position of the part changes.

Description

Determination device
The present invention relates to a determination device, a determination method, and a recording medium.
It is known to analyze a user's gait based on data.
For example, Patent Document 1 describes human gait analysis. Patent Document 1 describes a gait analysis device having a data acquisition unit that acquires two types of image data from a depth sensor, a skeleton information creation unit that creates skeleton information based on the image data acquired by the data acquisition unit, a correction processing unit that corrects the skeleton information created by the skeleton information creation unit, and an analysis processing unit that analyzes the user's gait using the corrected skeleton information.
Patent Document 1: WO 2017/170832
When performing various kinds of gait analysis, it is necessary to grasp the timing at which the foot touches the ground. However, when analysis is performed on image data without using a depth sensor such as the one described in Patent Document 1, information in the depth direction cannot be obtained, so it is difficult to grasp the ground-contact timing. Specifically, Patent Document 1, for example, takes as the ground-contact timing the frame at which the displacement of the Y value of the thoracolumbar region, captured as a three-dimensional position, changes from a negative value to a positive value. However, when analysis is performed on image data captured from a single direction without a depth sensor or the like, depth information is unavailable and the skeleton cannot be captured three-dimensionally. As a result, in image data of a person photographed from the side, for example, the arms and shoulders closer to the camera than the thoracolumbar region get in the way and make the thoracolumbar region difficult to recognize; and in image data of a person photographed from the front, a person walking from the back of the frame toward the front changes apparent size on the screen, so a method using the displacement of the thoracolumbar region as described in Patent Document 1 cannot be used as it is.
As a way of grasping ground-contact timing there is also, for example, a method of judging it with pressure gauges laid on the floor. With such a method, however, the measurable section is limited by the size of the pressure gauge, so ground-contact timing cannot be grasped without restricting the measurable section.
Therefore, an object of the present invention is to provide a determination device, a determination method, and a recording medium that solve the problem that it is difficult to grasp, based on image data, the timing at which a foot touches the ground.
To achieve this object, a determination device according to one aspect of the present disclosure is configured to have a determination unit that acquires information, obtained from each piece of time-series image data, indicating the position where a predetermined body part exists, and determines the timing at which a foot touches the ground based on how the acquired position of the part changes.
A determination method according to another aspect of the present disclosure is configured such that an information processing device acquires information, obtained from each piece of time-series image data, indicating the position where a predetermined body part exists, and determines the timing at which a foot touches the ground based on how the acquired position of the part changes.
A recording medium according to another aspect of the present disclosure is a computer-readable recording medium recording a program for causing an information processing device to acquire information, obtained from each piece of time-series image data, indicating the position where a predetermined body part exists, and to determine the timing at which a foot touches the ground based on how the acquired position of the part changes.
According to each of the configurations described above, it becomes possible to grasp, based on image data, the timing at which a foot touches the ground.
FIG. 1 is a diagram showing a configuration example of a determination system according to a first embodiment of the present disclosure. FIG. 2 is a diagram for explaining an example of image data acquired by the determination system. FIG. 3 is a block diagram showing a configuration example of the determination device shown in FIG. 1. FIG. 4 is a diagram showing an example of skeleton information. FIG. 5 is a diagram showing an example of ground-contact judgment information. FIG. 6 is a diagram for explaining an example of judgment by a ground-contact judgment unit. FIG. 7 is a diagram for explaining an example of determination by a ground-contact timing determination unit. FIG. 8 is a flowchart showing an example of processing by the ground-contact judgment unit. FIG. 9 is a flowchart showing an example of processing by the ground-contact timing determination unit. FIG. 10 is a diagram showing a hardware configuration example of a determination device according to a second embodiment of the present disclosure. FIG. 11 is a block diagram showing a configuration example of the determination device.
[First Embodiment]
A first embodiment of the present disclosure will be described with reference to FIGS. 1 to 9. FIG. 1 is a diagram showing a configuration example of a determination system 100. FIG. 2 is a diagram for explaining an example of image data acquired by the determination system 100. FIG. 3 is a block diagram showing a configuration example of a determination device 300. FIG. 4 is a diagram showing an example of skeleton information 323. FIG. 5 is a diagram showing an example of ground-contact judgment information 324. FIG. 6 is a diagram for explaining an example of judgment by a ground-contact judgment unit 333. FIG. 7 is a diagram for explaining an example of determination by a ground-contact timing determination unit 334. FIG. 8 is a flowchart showing an example of processing by the ground-contact judgment unit 333. FIG. 9 is a flowchart showing an example of processing by the ground-contact timing determination unit 334.
The first embodiment of the present disclosure describes a determination system 100 that determines, based on image data acquired with an imaging device such as a smartphone 200, the timing at which the foot of a walking person touches the ground. The determination system 100 acquires time-series image data showing a person walking from the back of the frame toward the front. As described later, the determination system 100 recognizes the positions of the skeleton in each of the plural pieces of image data, and determines the timing at which a foot touches the ground based on how the recognized positions change.
The ground-contact timing determined by the determination system 100 can be used in various kinds of gait analysis, for example for indices such as the angle of each joint at the moment the foot touches the floor and whether the toes are lifted off the floor, as well as for pitch, stride, and gait cycle. The ground-contact timing determined by the determination system 100 may also be used for purposes other than those exemplified above.
FIG. 1 shows a configuration example of the determination system 100. Referring to FIG. 1, the determination system 100 has, for example, a smartphone 200 and a determination device 300. As shown in FIG. 1, the smartphone 200 and the determination device 300 are connected, for example wirelessly or by wire, so that they can communicate with each other.
The smartphone 200 functions as an imaging device that captures a person walking. The smartphone 200 may be an ordinary smartphone with general functions such as a camera function for acquiring image data, a touch panel for screen display, and various sensors such as a GPS sensor and an acceleration sensor.
In this embodiment, as shown in FIG. 2, the smartphone 200 captures a person walking in the near-far direction, for example from the back of the frame toward the front. In other words, the smartphone 200 acquires time-series image data showing the person walking in the near-far direction. The smartphone 200 transmits the acquired image data to the determination device 300. The smartphone 200 may associate the image data with information such as the date and time of acquisition and transmit the associated data to the determination device 300.
The determination device 300 is an information processing device that determines, based on the image data acquired by the smartphone 200, the timing at which the foot of a walking person touches the ground. The determination device 300 is, for example, a server device. The determination device 300 may be a single information processing device, or may be implemented, for example, on a cloud.
FIG. 3 shows a configuration example of the determination device 300. Referring to FIG. 3, the determination device 300 has, as its main components, for example, a communication I/F unit 310, a storage unit 320, and an arithmetic processing unit 330.
The communication I/F unit 310 consists of a data communication circuit. The communication I/F unit 310 performs data communication with the smartphone 200 and with external devices connected via a communication line.
The storage unit 320 is a storage device such as a hard disk or a memory. The storage unit 320 stores the processing information and the program 326 necessary for the various processes in the arithmetic processing unit 330. The program 326 implements the various processing units by being read and executed by the arithmetic processing unit 330. The program 326 is read in advance from an external device or a recording medium via a data input/output function such as the communication I/F unit 310, and stored in the storage unit 320. The main information stored in the storage unit 320 includes, for example, a trained model 321, image information 322, skeleton information 323, ground-contact judgment information 324, and ground-contact timing information 325.
The trained model 321 is a trained model used when the skeleton recognition unit 332 performs skeleton recognition. The trained model 321 is generated in advance, for example on an external device, by machine learning using teacher data such as image data annotated with skeleton coordinates, and is acquired from the external device or the like via the communication I/F unit 310 and stored in the storage unit 320. The trained model 321 may be generated using a known method, and may be updated, for example, by retraining with additional teacher data.
The image information 322 includes the time-series image data acquired by the camera of the smartphone 200. The image information 322 is generated and updated, for example, when the image acquisition unit 331 acquires image data.
For example, in the image information 322, identification information for identifying each piece of image data is associated with that image data. The image information 322 may include information other than that exemplified above, such as information indicating the date and time when the smartphone 200 acquired the image data.
The skeleton information 323 includes information indicating the coordinates (that is, the positions) of each part of the person recognized by the skeleton recognition unit 332. The skeleton information 323 includes, for example, information indicating the coordinates of each part for each piece of time-series image data. The skeleton information 323 is generated and updated, for example, as a result of processing by the skeleton recognition unit 332.
FIG. 4 shows an example of the skeleton information 323. Referring to FIG. 4, in the skeleton information 323, for example, identification information is associated with position information of each part. Here, the identification information is, for example, information indicating the image data used when recognizing the skeleton information, and may correspond to the identification information for identifying image data in the image information 322. The position information of each part includes information indicating the coordinates of each part in the image data, such as the position of the pelvis.
The parts included in the position information depend on the trained model 321. For example, FIG. 4 shows the pelvis, the center of the spine, ..., the right knee, the left knee, ..., the right ankle, the left ankle, .... The position information of each part can include around 30 parts, such as the right shoulder, ..., the left elbow, ... (parts other than those exemplified are also possible). In this embodiment, the position information of each part includes at least information indicating the coordinates of the right ankle and the left ankle. Parts other than those illustrated in FIG. 4 and the like may also be included.
The ground-contact judgment information 324 includes information indicating the results of judgment by the ground-contact judgment unit 333, and is generated and updated, for example, as a result of that judgment.
FIG. 5 shows an example of the ground-contact judgment information 324. As shown in FIG. 5, in the ground-contact judgment information 324, for example, identification information is associated with a ground-contact judgment result. Here, the identification information is, for example, information indicating the image data on which the ground-contact judgment was performed, and may correspond to the identification information for identifying image data in the image information 322. The ground-contact judgment result indicates the outcome of the judgment: for example, "left foot" indicating a judgment that the left foot is on the ground, "right foot" indicating a judgment that the right foot is on the ground, or "-" indicating a judgment that the frame is excluded from ground-contact judgment. The judgment result may indicate outcomes other than those exemplified above.
The ground-contact timing information 325 includes information indicating the results of determination by the ground-contact timing determination unit 334, and is generated and updated, for example, as a result of that determination.
For example, the ground-contact timing information 325 includes information indicating the timing at which a foot touched the ground. Here, the information indicating that timing is, for example, at least one of identification information indicating the image data for which ground contact was determined and information indicating the acquisition date and time of that image data. The ground-contact timing information 325 may also, for example, associate information indicating which foot touched the ground with information indicating the timing at which it did so.
The arithmetic processing unit 330 has an arithmetic device such as a CPU and its peripheral circuits. The arithmetic processing unit 330 reads the program 326 from the storage unit 320 and executes it, whereby the hardware and the program 326 cooperate to implement the various processing units. The main processing units implemented by the arithmetic processing unit 330 include, for example, an image acquisition unit 331, a skeleton recognition unit 332, a ground-contact judgment unit 333, a ground-contact timing determination unit 334, and an output unit 335.
The image acquisition unit 331 acquires, via the communication I/F unit 310, the image data that the smartphone 200 has captured. For example, the image acquisition unit 331 acquires time-series image data from the smartphone 200, and stores the acquired image data in the storage unit 320 as the image information 322.
The skeleton recognition unit 332 uses the trained model 321 to recognize, in the image data, the skeleton of the person whose walking posture is to be measured. For example, the skeleton recognition unit 332 recognizes the skeleton in each piece of time-series image data; it may also be configured to recognize the skeleton only in image data extracted from the time series according to a predetermined condition. For example, the skeleton recognition unit 332 recognizes parts such as the upper spine, right shoulder, left shoulder, right elbow, left elbow, right wrist, left wrist, right hand, left hand, and so on, and calculates the coordinates of each recognized part in the image data. The skeleton recognition unit 332 then stores the recognition and calculation results in the storage unit 320 as the skeleton information 323, associated with identification information for identifying the image data.
The parts recognized by the skeleton recognition unit 332 depend on the trained model 321 (that is, on the teacher data used when training it). The skeleton recognition unit 332 may therefore recognize parts other than those exemplified above, according to the trained model 321.
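As an illustration of how the recognition step might be driven over the time series, the loop below assumes an entirely hypothetical `estimate_pose` model interface (the patent does not specify any model API; a stub stands in for the trained model 321):

```python
def recognize_skeletons(frames, estimate_pose):
    """Run skeleton recognition over time-series image data.

    frames: iterable of (frame_id, image) pairs, in time order.
    estimate_pose: trained-model callable mapping an image to a dict of
        part name -> (x, y) coordinates in that image.
    Returns skeleton information: frame_id -> joint-coordinate dict.
    """
    skeleton_info = {}
    for frame_id, image in frames:
        skeleton_info[frame_id] = estimate_pose(image)
    return skeleton_info

# stub standing in for the trained model; real images are omitted here
stub = lambda image: {"right_ankle": (296, 610), "left_ankle": (334, 602)}
info = recognize_skeletons([("f0", None), ("f1", None)], stub)
assert set(info) == {"f0", "f1"}
```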
The ground-contact judgment unit 333 judges which foot is on the ground based on the skeleton of the person indicated by the skeleton information 323. For example, the ground-contact judgment unit 333 judges whether the walking person's right foot or left foot is on the ground based on how the coordinates of the right ankle and left ankle included in the skeleton information 323 change. For example, the ground-contact judgment unit 333 obtains information indicating the ankle coordinates by referring to the skeleton information 323, and when the change in an ankle's coordinates satisfies a predetermined condition, judges that the foot with that ankle is on the ground. The ground-contact judgment unit 333 stores information indicating the judgment results in the storage unit 320 as the ground-contact judgment information 324.
FIG. 6 shows an example of changes in the Y coordinates (the vertical coordinates of the image data) of the right ankle and left ankle. In FIG. 6, the X axis shows values corresponding to time, and the Y axis shows the vertical coordinate (Y coordinate) in the image data. In FIG. 6, the larger the Y value, the lower the right or left ankle is located in the image data (that is, the closer the ankle is to the front of the frame).
In general, when a person walks, the following motions alternate: one of the right and left feet stays on the ground while the other advances; both feet are on the ground; the other foot stays on the ground while the first advances; and both feet are on the ground again. In this embodiment, therefore, the following states are expected to repeat: the right ankle's Y coordinate is constant while the left ankle's Y coordinate increases; both ankles' Y coordinates are constant; the left ankle's Y coordinate is constant while the right ankle's Y coordinate increases; and both ankles' Y coordinates are constant.
The ground-contact judgment unit 333 therefore judges which foot is on the ground based on how the Y coordinates of the right ankle and left ankle change. In other words, when an ankle's Y coordinate can be evaluated as constant, the ground-contact judgment unit 333 judges that the foot with that ankle is on the ground. Specifically, for example, the ground-contact judgment unit 333 compares the ankle's Y coordinate in the image data being judged with its Y coordinate in the immediately preceding image data in the time series. When the displacement (amount of change) from the preceding image data is at most a predetermined value, the ground-contact judgment unit 333 treats the Y coordinate as constant and judges that the foot with that constant Y coordinate is on the ground. The ground-contact judgment unit 333 may judge that the Y coordinate is constant by a method other than the one exemplified above, and the range of displacement treated as constant may be set arbitrarily.
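A minimal sketch of this frame-to-frame constancy check, assuming ankle Y coordinates in pixels and an arbitrarily chosen displacement threshold (the threshold value, names, and handling of the ambiguous case are all illustrative, not taken from the patent):

```python
Y_EPS = 2.0  # max per-frame Y displacement (pixels) still treated as "constant"

def grounded_foot(prev, curr, eps=Y_EPS):
    """Judge which foot is on the ground in the current frame by checking
    whose ankle Y coordinate stayed (nearly) constant since the previous frame.

    prev / curr: dicts mapping "right_ankle" / "left_ankle" to Y coordinates.
    Returns "right", "left", or None when the check is ambiguous
    (both ankles still, e.g. double support, or both moving).
    """
    right_still = abs(curr["right_ankle"] - prev["right_ankle"]) <= eps
    left_still = abs(curr["left_ankle"] - prev["left_ankle"]) <= eps
    if right_still and not left_still:
        return "right"
    if left_still and not right_still:
        return "left"
    return None

# right ankle stays put while the left ankle advances (Y grows toward the viewer)
assert grounded_foot({"right_ankle": 610.0, "left_ankle": 560.0},
                     {"right_ankle": 610.5, "left_ankle": 575.0}) == "right"
```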
The ground-contact judgment unit 333 may also be configured to exclude a frame from ground-contact judgment when, based on the ankle Y coordinates, the foot other than the one judged to be on the ground is judged to have started moving. For example, in walking, the left foot generally starts moving after the right foot touches the ground. The ground-contact judgment unit 333 may decide to exclude a frame from judgment at the timing when, after the right ankle's Y coordinate has become constant, the left ankle's Y coordinate starts to increase. Specifically, for example, after the right ankle's Y coordinate has become constant, when the displacement of the left ankle's Y coordinate from the preceding image data exceeds that of the right ankle's Y coordinate, the ground-contact judgment unit 333 can judge that the left foot has started moving and exclude the frame from ground-contact judgment.
In general, when a person walks, the newly landing foot comes down ahead of the foot already on the ground. In this embodiment, that means the ankle Y coordinate of the newly landing foot is larger than that of the foot already on the ground. The ground-contact judgment unit 333 may therefore be configured to judge ground contact by focusing only on whichever of the right and left ankles has the larger Y coordinate.
 The contact determination unit 333 may also be configured to refer to the knee coordinates in addition to the ankle coordinates when making a contact determination. For example, it may exclude a frame from contact determination when the change in the knee coordinates satisfies a predetermined condition. Specifically, for the foot being examined, the contact determination unit 333 may exclude the frame from contact determination when the displacement of the knee Y coordinate from the immediately preceding image data in the time series is 0 or less.
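The knee-based exclusion condition reduces to a single comparison; as a minimal sketch (names and values hypothetical):

```python
def excluded_by_knee(prev_knee_y, curr_knee_y):
    # Exclude the frame from contact determination when the knee's Y
    # displacement from the previous frame is 0 or less, i.e. the knee
    # is not advancing in the progress direction.
    return (curr_knee_y - prev_knee_y) <= 0
```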
 As described above, the contact determination unit 333 makes its contact determination based on how the ankle Y coordinate changes. The unit may, however, be configured to base the determination on changes in parts of the skeleton other than those illustrated above, for example the Y coordinate of the toes.
 The contact timing determination unit 334 determines the timing at which a foot touches the ground, based on the foot contact state judged from the person's skeleton indicated by the skeleton information 323. For example, the contact timing determination unit 334 determines the timing of ground contact from changes in the grounded foot indicated by the contact determination information 324, and stores the result in the storage unit 320 as the contact timing information 325.
 For example, the contact timing determination unit 334 determines that a foot has touched the ground at the moment the grounded foot indicated by the contact determination results switches to the other foot. Referring to Fig. 7, the left foot is grounded for identification information "124" and "125", frames "126" and "127" are excluded from contact determination, and the right foot is judged grounded at "128". In other words, in Fig. 7 the grounded foot switches from left to right at identification information "128", so the contact timing determination unit 334 determines that the right foot touched the ground at that timing.
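The switch-detection logic above can be sketched over a sequence of per-frame judgments; the data layout (frame-id/foot pairs, `None` for excluded frames) is an assumption for illustration, and the example sequence mirrors the Fig. 7 description:

```python
def contact_timings(frames):
    """frames: time-ordered list of (frame_id, foot) pairs, where foot is
    'left', 'right', or None for frames excluded from contact determination.
    Returns the (frame_id, foot) pairs at which the grounded foot switches,
    i.e. the judged ground-contact timings."""
    timings = []
    last = None
    for frame_id, foot in frames:
        if foot is None:
            continue  # excluded frames do not change the grounded foot
        if last is not None and foot != last:
            timings.append((frame_id, foot))
        last = foot
    return timings
```

Fed the Fig. 7 sequence (left at 124-125, excluded at 126-127, right at 128), this yields a single contact timing at frame 128 for the right foot.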
 The output unit 335 outputs the contact timing information 325, the contact determination information 324, and the like. For example, the output unit 335 can output at least one of these pieces of information to an external device, the smartphone 200, or the like.
 The above is a configuration example of the determination device 300. Next, the operation of the determination device 300 will be described with reference to Figs. 8 and 9.
 First, an operation example of the contact determination unit 333 will be described with reference to Fig. 8. Referring to Fig. 8, the contact determination unit 333 compares the knee Y coordinate in the image data under examination with the knee Y coordinate in the immediately preceding image data in the time series (step S101). If the displacement from the knee Y coordinate in the preceding image data exceeds 0 (step S101: Yes), the contact determination unit 333 checks whether the ankle displacement satisfies the condition (step S102).
 If the ankle displacement satisfies the condition (step S102: Yes), the contact determination unit 333 judges that the foot satisfying the condition is grounded (step S103). Conversely, if the knee Y-coordinate displacement is 0 or less (step S101: No), or if the ankle displacement does not satisfy the condition (step S102: No), the contact determination unit 333 judges that the foot is not grounded.
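The two-step check of steps S101-S103 can be sketched for a single foot as follows; the dictionary keys, the tolerance used to judge the ankle "constant", and the sample values are assumptions, not values from the disclosure:

```python
def judge_grounded(prev, curr, ankle_tol=0.01):
    """Minimal sketch of the Fig. 8 flow for one foot.

    prev, curr: consecutive frames as dicts with assumed keys
    'knee_y' and 'ankle_y'. Returns True when the foot is judged grounded.
    """
    knee_disp = curr["knee_y"] - prev["knee_y"]
    if knee_disp <= 0:
        return False  # S101: No -> not grounded
    # S102: the ankle Y coordinate must be evaluable as constant,
    # here approximated by a displacement within a small tolerance.
    return abs(curr["ankle_y"] - prev["ankle_y"]) <= ankle_tol  # S103
```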
 The above is an operation example of the contact determination unit 333. The contact determination unit 333 can judge that the ankle displacement satisfies the condition when, for example, the ankle Y coordinate can be evaluated as constant from its degree of change, and the foot other than the one judged grounded is not determined to have started moving, or the change in the knee coordinates satisfies the condition. The contact determination unit 333 may also first identify the ankle of interest from the ankle Y coordinates and then perform the checks described above only on that ankle.
 Next, an operation example of the contact timing determination unit 334 will be described with reference to Fig. 9. Referring to Fig. 9, the contact timing determination unit 334 refers to the contact timing information 325 and checks whether the grounded foot indicated by the contact determination results has changed (step S201).
 If the foot has changed (step S201: Yes), the contact timing determination unit 334 determines that the moment of the change is the ground-contact timing (step S202). If the grounded foot indicated by the determination results has not changed (step S201: No), the contact timing determination unit 334 does not make this determination.
 The above is a processing example of the contact timing determination unit 334.
 As described above, the determination device 300 includes the skeleton recognition unit 332, the contact determination unit 333, and the contact timing determination unit 334. With this configuration, the contact timing determination unit 334 can determine the timing at which a foot touches the ground from the contact determination unit 333's judgment of the grounded foot, which in turn uses the skeleton coordinates recognized by the skeleton recognition unit 332. In other words, this configuration makes it possible to identify from image data the timing at which a foot touches the ground.
 Furthermore, because the determination device 300 determines the contact timing by focusing on parts close to the foot actually in contact with the ground, it can produce determinations closer to reality than a method using the displacement of the thoracolumbar region.
 Note that the determination device 300 may be configured to determine the contact timing based on skeleton recognition results from an external device that performs skeleton recognition. In that case, the determination device 300 need not itself have the function of the skeleton recognition unit 332.
[Second embodiment]
 Next, a second embodiment of the present disclosure will be described with reference to Figs. 10 and 11. The second embodiment outlines the configuration of a determination device 400, which is an information processing device.
 Fig. 10 shows an example hardware configuration of the determination device 400. Referring to Fig. 10, the determination device 400 has, as an example, the following hardware configuration:
 - CPU (Central Processing Unit) 401 (arithmetic unit)
 - ROM (Read Only Memory) 402 (storage device)
 - RAM (Random Access Memory) 403 (storage device)
 - Program group 404 loaded into the RAM 403
 - Storage device 405 storing the program group 404
 - Drive device 406 that reads and writes the recording medium 410 outside the information processing device
 - Communication interface 407 that connects to the communication network 411 outside the information processing device
 - Input/output interface 408 for inputting and outputting data
 - Bus 409 connecting the components
 The determination device 400 can realize the function of the determination unit 421 shown in Fig. 11 by having the CPU 401 acquire and execute the program group 404. The program group 404 is, for example, stored in advance in the storage device 405 or the ROM 402, and the CPU 401 loads it into the RAM 403 and executes it as necessary. The program group 404 may also be supplied to the CPU 401 via the communication network 411, or stored in advance on the recording medium 410 and read out and supplied to the CPU 401 by the drive device 406.
 Note that Fig. 10 shows only an example hardware configuration of the determination device 400; the hardware configuration is not limited to the case described above. For example, the determination device 400 may consist of only part of the configuration described above, such as omitting the drive device 406.
 The determination unit 421 acquires, from each piece of time-series image data, information indicating the position of a predetermined body part, and determines the timing at which a foot touches the ground based on how that position changes.
 As described above, the determination device 400 includes the determination unit 421. With this configuration, the determination unit 421 can determine the timing at which a foot touches the ground based on how the position of the body part changes. This makes it possible to identify from image data the timing at which a foot touches the ground.
 An information processing device such as the determination device 400 described above can be realized by installing a predetermined program in the device. Specifically, a program according to another aspect of the present invention causes an information processing device to acquire, from each piece of time-series image data, information indicating the position of a predetermined body part, and to determine the timing at which a foot touches the ground based on how that position changes.
 Likewise, in the determination method executed by the information processing device described above, the information processing device acquires, from each piece of time-series image data, information indicating the position of a predetermined body part, and determines the timing at which a foot touches the ground based on how that position changes.
 An invention embodied as a program (or recording medium) or as a determination method having the configuration described above also achieves the object of the present invention stated above, because it has the same operation and effects as the cases described above.
<Appendix>
 Some or all of the above embodiments may also be described as in the following appendices. The determination device and related aspects of the present invention are outlined below. However, the present invention is not limited to the following configurations.
(Appendix 1)
 A determination device comprising a determination unit that acquires, from each piece of time-series image data, information indicating the position of a predetermined body part, and determines the timing at which a foot touches the ground based on how that position changes.
(Appendix 2)
 The determination device according to Appendix 1, further comprising a judgment unit that judges the grounded foot based on how the position of the body part changes, wherein the determination unit determines the timing at which the foot touches the ground based on the result of the judgment by the judgment unit.
(Appendix 3)
 The determination device according to Appendix 2, wherein the determination unit determines that a foot has touched the ground at the timing at which, based on the result of the judgment by the judgment unit, the grounded foot is judged to have switched.
(Appendix 4)
 The determination device according to Appendix 2 or 3, wherein the information indicating the position of the body part includes information indicating the position of an ankle, judged by recognizing the skeleton of the person in the image data, and the judgment unit judges the grounded foot based on how the position of the ankle changes.
(Appendix 5)
 The determination device according to Appendix 4, wherein the judgment unit judges that the foot having the ankle is grounded when the position of the ankle can be evaluated as constant based on how it changes.
(Appendix 6)
 The determination device according to Appendix 4 or 5, wherein the judgment unit makes a judgment to exclude from contact determination when the foot other than the one judged to be grounded, based on how the position of the ankle changes, is judged to have started moving.
(Appendix 7)
 The determination device according to any one of Appendices 4 to 6, wherein the judgment unit identifies the foot of interest based on the position of the ankle and judges whether the foot of interest is grounded.
(Appendix 8)
 The determination device according to any one of Appendices 4 to 7, wherein the information indicating the position of the body part includes information indicating the position of a knee, judged by recognizing the skeleton of the person in the image data, and the judgment unit judges the grounded foot based on how the position of the ankle changes and how the position of the knee changes.
(Appendix 9)
 A determination method in which an information processing device acquires, from each piece of time-series image data, information indicating the position of a predetermined body part, and determines the timing at which a foot touches the ground based on how that position changes.
(Appendix 10)
 A program for causing an information processing device to execute a process of acquiring, from each piece of time-series image data, information indicating the position of a predetermined body part, and determining the timing at which a foot touches the ground based on how that position changes.
 The programs described in the above embodiments and appendices are stored in a storage device or recorded on a computer-readable recording medium. The recording medium is, for example, a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
 Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the embodiments described above. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within its scope.
 The present invention claims the benefit of priority based on Japanese Patent Application No. 2021-055312, filed in Japan on March 29, 2021, the entire contents of which are incorporated herein.
100 determination system
200 smartphone
300 determination device
310 communication I/F unit
320 storage unit
321 trained model
322 image information
323 skeleton information
324 contact determination information
325 contact timing information
326 program
330 arithmetic processing unit
331 image acquisition unit
332 skeleton recognition unit
333 contact determination unit
334 contact timing determination unit
335 output unit
400 determination device
401 CPU
402 ROM
403 RAM
404 program group
405 storage device
406 drive device
407 communication interface
408 input/output interface
409 bus
410 recording medium
411 communication network
421 determination unit

Claims (10)

  1.  A determination device comprising a determination unit that acquires, from each piece of time-series image data, information indicating the position of a predetermined body part, and determines the timing at which a foot touches the ground based on how that position changes.

  2.  The determination device according to claim 1, further comprising a judgment unit that judges the grounded foot based on how the position of the body part changes, wherein the determination unit determines the timing at which the foot touches the ground based on the result of the judgment by the judgment unit.

  3.  The determination device according to claim 2, wherein the determination unit determines that a foot has touched the ground at the timing at which, based on the result of the judgment by the judgment unit, the grounded foot is judged to have switched.

  4.  The determination device according to claim 2 or 3, wherein the information indicating the position of the body part includes information indicating the position of an ankle, judged by recognizing the skeleton of the person in the image data, and the judgment unit judges the grounded foot based on how the position of the ankle changes.

  5.  The determination device according to claim 4, wherein the judgment unit judges that the foot having the ankle is grounded when the position of the ankle can be evaluated as constant based on how it changes.

  6.  The determination device according to claim 4 or 5, wherein the judgment unit makes a judgment to exclude from contact determination when the foot other than the one judged to be grounded, based on how the position of the ankle changes, is judged to have started moving.

  7.  The determination device according to any one of claims 4 to 6, wherein the judgment unit identifies the foot of interest based on the position of the ankle and judges whether the foot of interest is grounded.

  8.  The determination device according to any one of claims 4 to 7, wherein the information indicating the position of the body part includes information indicating the position of a knee, judged by recognizing the skeleton of the person in the image data, and the judgment unit judges the grounded foot based on how the position of the ankle changes and how the position of the knee changes.

  9.  A determination method in which an information processing device acquires, from each piece of time-series image data, information indicating the position of a predetermined body part, and determines the timing at which a foot touches the ground based on how that position changes.

  10.  A computer-readable recording medium recording a program for causing an information processing device to execute a process of acquiring, from each piece of time-series image data, information indicating the position of a predetermined body part, and determining the timing at which a foot touches the ground based on how that position changes.
PCT/JP2022/004470 2021-03-29 2022-02-04 Determination device WO2022209286A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023510575A JPWO2022209286A1 (en) 2021-03-29 2022-02-04
CN202280025925.4A CN117119958A (en) 2021-03-29 2022-02-04 Determination device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-055312 2021-03-29
JP2021055312 2021-03-29

Publications (1)

Publication Number Publication Date
WO2022209286A1 true WO2022209286A1 (en) 2022-10-06

Family

ID=83458812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004470 WO2022209286A1 (en) 2021-03-29 2022-02-04 Determination device

Country Status (3)

Country Link
JP (1) JPWO2022209286A1 (en)
CN (1) CN117119958A (en)
WO (1) WO2022209286A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010172394A (en) * 2009-01-27 2010-08-12 Panasonic Electric Works Co Ltd Walking state display
JP2015186515A (en) * 2014-03-26 2015-10-29 本田技研工業株式会社 Upper body exercise measurement system and upper body exercise measurement method
JP2017140393A (en) * 2016-02-12 2017-08-17 タタ コンサルタンシー サービシズ リミテッドTATA Consultancy Services Limited System and method for analyzing gait and postural balance of a person
JP2018139902A (en) * 2017-02-28 2018-09-13 株式会社ニコン Detection system, detection method, detection program, processing device and exercise mat
CN110934597A (en) * 2019-12-30 2020-03-31 龙岩学院 Operation method of abnormal gait monitoring equipment
US20210059565A1 (en) * 2019-09-04 2021-03-04 Richard Morris Gait-based assessment of neurodegeneration

Also Published As

Publication number Publication date
CN117119958A (en) 2023-11-24
JPWO2022209286A1 (en) 2022-10-06

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22779515

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023510575

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22779515

Country of ref document: EP

Kind code of ref document: A1