WO2022209287A1 - Calculation device - Google Patents

Calculation device

Info

Publication number
WO2022209287A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
length
height
information
calculation unit
Application number
PCT/JP2022/004471
Other languages
French (fr)
Japanese (ja)
Inventor
宏紀 寺島
克幸 永井
Original Assignee
Necソリューションイノベータ株式会社
Application filed by Necソリューションイノベータ株式会社
Priority to JP2023510576A (JPWO2022209287A1)
Publication of WO2022209287A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion

Definitions

  • the present invention relates to a calculation device, a calculation method, and a recording medium.
  • Patent Document 1 is a document describing human gait analysis.
  • Patent Document 1 describes a gait analysis device having a data acquisition unit that acquires two types of image data from a depth sensor, a skeleton information creation unit that creates skeleton information based on the image data acquired by the data acquisition unit, a correction processing unit that corrects the skeleton information, and an analysis processing unit that analyzes a user's gait using the corrected skeleton information.
  • An example of an index that is desirable to calculate when performing gait analysis is the bending of the waist.
  • however, with a technique such as that of Patent Document 1, it was difficult to calculate an index value corresponding to the bending of the waist based on image data.
  • an object of the present invention is to provide a calculation device, a calculation method, and a recording medium that solve the problem that it is difficult to calculate an index value corresponding to the bending of the waist based on image data.
  • a calculation device, which is one aspect of the present disclosure, has a configuration including: a height calculation unit that calculates, based on image data, the height of a person in the image data; and a curvature calculation unit that acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates, based on the acquired hand length and the height calculated by the height calculation unit, an index value indicating the bending of the person's waist.
  • in a calculation method, which is another aspect of the present disclosure, an information processing device calculates, based on image data, the height of a person in the image data, acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates, based on the acquired hand length and the calculated height, an index value indicating the bending of the waist of the person in the image data.
  • a recording medium, which is another aspect of the present disclosure, is a computer-readable recording medium recording a program that causes an information processing device to realize processes of calculating, based on image data, the height of a person in the image data, acquiring a hand length, which is the length connecting the left and right hands of the person in the image data, and calculating, based on the acquired hand length and the calculated height, an index value indicating the bending of the waist of the person in the image data.
  • FIG. 1 is a diagram showing a configuration example of a determination system according to a first embodiment of the present disclosure.
  • FIG. 2 is a diagram for explaining an example of image data acquired by the determination system.
  • FIG. 3 is a block diagram showing a configuration example of the determination device shown in FIG. 1.
  • FIG. 4 is a diagram showing an example of skeleton information.
  • FIG. 5 is a diagram showing an example of ground contact determination information.
  • FIG. 6 is a diagram for explaining an example of determination by a ground contact determination unit.
  • FIG. 7 is a diagram for explaining an example of determination by a ground contact timing determination unit.
  • FIG. 8 is a flow chart showing an example of processing by the ground contact determination unit.
  • FIG. 9 is a flow chart showing an example of processing by the ground contact timing determination unit.
  • FIG. 10 is a diagram showing a configuration example of a calculation system according to a second embodiment of the present disclosure.
  • FIG. 11 is a block diagram showing a configuration example of the calculation device shown in FIG. 10.
  • FIG. 12 is a diagram showing an example of hand length and height.
  • FIG. 13 is a flow chart showing an operation example of the calculation device.
  • FIG. 14 is a diagram showing a configuration example of a calculation system according to a third embodiment of the present disclosure.
  • FIG. 15 is a block diagram showing a configuration example of the calculation device shown in FIG. 14.
  • FIG. 16 is a diagram showing an example of height.
  • FIG. 17 is a diagram showing a hardware configuration example of a calculation device according to a fourth embodiment of the present disclosure.
  • FIG. 18 is a block diagram showing a configuration example of a calculation device.
  • FIG. 19 is a block diagram showing another configuration example of the calculation device.
  • FIG. 1 is a diagram showing a configuration example of a determination system 100.
  • FIG. 2 is a diagram for explaining an example of image data acquired by the determination system 100.
  • FIG. 3 is a block diagram showing a configuration example of the determination device 300.
  • FIG. 4 is a diagram showing an example of the skeleton information 323.
  • FIG. 5 is a diagram showing an example of the contact determination information 324.
  • FIG. 6 is a diagram for explaining an example of determination by the contact determination unit 333.
  • FIG. 7 is a diagram for explaining an example of determination by the touchdown timing determining section 334.
  • FIG. 8 is a flow chart showing an example of processing by the contact determination unit 333.
  • FIG. 9 is a flow chart showing an example of processing by the contact timing determination section 334.
  • a determination system 100 that determines the timing at which the foot of a walking person touches the ground based on image data acquired using an imaging device such as the smartphone 200 will be described.
  • the determination system 100 acquires time-series image data showing how a person walks from the back side to the front side of the screen.
  • the determination system 100 recognizes the position of the skeleton from each of the plurality of image data. Then, the determination system 100 determines the timing at which the foot touches the ground based on the recognized change in position.
  • the landing timing determined by the determination system 100 can be used for various kinds of gait analysis, for example, calculating indices such as the angle of each joint at the timing when the foot touches the floor, whether the toe is raised from the floor, the pitch, the stride, and the step cycle.
  • the landing timing determined by the determination system 100 may also be used for purposes other than the above examples.
  • FIG. 1 shows a configuration example of the determination system 100.
  • the determination system 100 has, for example, a smart phone 200 and a determination device 300 .
  • the smartphone 200 and the determination device 300 are connected, for example, wirelessly or by wire so that they can communicate with each other.
  • the smart phone 200 functions as an imaging device that captures how a person walks.
  • the smartphone 200 may be a smartphone having general functions such as a camera function for acquiring image data, a touch panel for displaying a screen, and various sensors such as a GPS sensor and an acceleration sensor.
  • the smartphone 200 shoots a person walking in the front-back direction, such as from the back of the screen to the front.
  • smartphone 200 acquires time-series image data showing how a person walks in the front-to-back direction.
  • the smartphone 200 also transmits the acquired image data to the determination device 300 .
  • the smartphone 200 may associate information indicating the date and time when the image data was acquired with the image data, and transmit the associated data to the determination device 300 .
  • the determination device 300 is an information processing device that determines the timing at which the foot of a walking person touches the ground based on the image data acquired by the smartphone 200 .
  • the determination device 300 is a server device or the like.
  • the determination device 300 may be a single information processing device, or may be implemented on a cloud, for example.
  • FIG. 3 shows a configuration example of the determination device 300.
  • the determination device 300 has, for example, a communication I/F section 310, a storage section 320, and an arithmetic processing section 330 as main components.
  • the communication I/F unit 310 consists of a data communication circuit. Communication I/F unit 310 performs data communication with an external device, smartphone 200, or the like connected via a communication line.
  • the storage unit 320 is a storage device such as a hard disk or memory.
  • the storage unit 320 stores processing information and programs 326 necessary for various processes in the arithmetic processing unit 330 .
  • the program 326 realizes various processing units by being read and executed by the arithmetic processing unit 330 .
  • the program 326 is read in advance from an external device or recording medium via a data input/output function such as the communication I/F unit 310 and stored in the storage unit 320 .
  • Main information stored in the storage unit 320 includes, for example, a learned model 321, image information 322, skeleton information 323, contact determination information 324, contact timing information 325, and the like.
  • the trained model 321 is a trained model used when the skeleton recognition unit 332 performs skeleton recognition.
  • the learned model 321 is, for example, generated in advance by an external device or the like performing machine learning using teacher data such as image data to which skeletal coordinates are attached, and is acquired from the external device or the like and stored in the storage unit 320. The learned model 321 may be generated using known methods, and may be updated by re-learning processing using additional teacher data.
  • the image information 322 includes time-series image data acquired by the camera of the smartphone 200 .
  • the image information 322 is generated and updated when the image acquisition unit 331 acquires image data.
  • identification information for identifying image data is associated with image data.
  • the image information 322 may include information other than the above examples, such as information indicating the date and time when the smartphone 200 acquired the image data.
  • the skeleton information 323 includes information indicating the coordinates of each part of the person recognized by the skeleton recognition unit 332 .
  • the skeleton information 323 includes, for example, information indicating the coordinates of each part of each piece of time-series image data.
  • the skeleton information 323 is generated/updated as a result of processing by the skeleton recognition unit 332 .
  • FIG. 4 shows an example of the skeleton information 323.
  • identification information is associated with position information of each part.
  • the identification information is, for example, information indicating image data used when recognizing skeleton information.
  • the identification information may correspond to identification information for identifying image data in the image information 322 .
  • the position information of each part includes information indicating the coordinates of each part in the image data, such as the position of the pelvis.
  • the part included in the position information of each part corresponds to the learned model 321 .
  • the pelvis, the center of the spine, ..., the right knee, the left knee, ..., the right ankle, the left ankle, ... is exemplified.
  • the position information of each part can include about 30 parts such as right shoulder, . . . , left elbow, .
  • the position information of each part includes at least information indicating the coordinates of the right ankle and the left ankle.
  • the parts included in the position information of each part may be other than those illustrated in FIG. 4 and the like.
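As a concrete illustration of the skeleton information 323 described above, the per-image records could be represented as follows; the dictionary layout, part names, and coordinate values are illustrative assumptions, since the disclosure does not prescribe a data format:

```python
# A minimal sketch of skeleton information records: each entry associates
# identification information of image data with per-part pixel coordinates.
# The part names and (x, y) values below are illustrative assumptions.
skeleton_info = [
    {
        "id": "124",  # identification information of the image data
        "parts": {
            "pelvis": (210, 305),
            "spine_center": (211, 250),
            "right_knee": (198, 410),
            "left_knee": (224, 408),
            "right_ankle": (197, 470),
            "left_ankle": (225, 468),
        },
    },
]

def ankle_y(record, side):
    """Return the vertical (Y) coordinate of the given ankle ('left'/'right')."""
    return record["parts"][f"{side}_ankle"][1]
```

At minimum, such a record holds the right-ankle and left-ankle coordinates, which is what the ground contact determination below relies on.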
  • the ground contact determination information 324 includes information indicating the determination result of the ground contact determination unit 333 .
  • the ground contact determination information 324 is generated and updated as a result of determination by the ground contact determination section 333 .
  • FIG. 5 shows an example of the contact determination information 324.
  • the identification information is, for example, information indicating image data for which grounding determination has been performed.
  • the identification information may correspond to identification information for identifying image data in the image information 322 .
  • the ground contact determination result indicates the result of the ground contact determination for the corresponding image data.
  • the ground contact determination result includes "left foot" indicating a determination that the left foot is on the ground, "right foot" indicating a determination that the right foot is on the ground, and "-" indicating a determination that the image data is excluded from the ground contact determination.
  • the contact determination result may indicate other than the above examples.
  • the touchdown timing information 325 includes information indicating the determination result of the touchdown timing determination unit 334 .
  • the touchdown timing information 325 is generated and updated as a result of determination by the touchdown timing determination section 334 .
  • the ground contact timing information 325 includes information indicating the timing at which the foot touches the ground.
  • the information indicating the timing at which the foot touched the ground is, for example, at least one of identification information indicating the image data determined to show ground contact, information indicating the acquisition date and time of that image data, and the like.
  • the contact timing information 325 may be, for example, information indicating the foot that has touched the ground and information indicating the timing at which the foot touched the ground in association with each other.
  • the arithmetic processing unit 330 has an arithmetic device such as a CPU and its peripheral circuits.
  • the arithmetic processing unit 330 reads the program 326 from the storage unit 320 and executes it, so that the hardware and the program 326 work together to realize various processing units.
  • Main processing units realized by the arithmetic processing unit 330 include, for example, an image acquisition unit 331, a skeleton recognition unit 332, a ground contact determination unit 333, a ground contact timing determination unit 334, an output unit 335, and the like.
  • the image acquisition unit 331 acquires image data acquired by the smartphone 200 from the smartphone 200 via the communication I/F unit 310 .
  • the image acquisition unit 331 acquires time-series image data from the smartphone 200 .
  • the image acquisition unit 331 also stores the acquired image data in the storage unit 320 as the image information 322 .
  • the skeleton recognition unit 332 uses the trained model 321 to recognize the skeleton of the person whose walking posture is to be measured in the image data. For example, the skeleton recognition unit 332 recognizes the skeleton of each piece of time-series image data.
  • the skeleton recognition unit 332 may be configured to recognize the skeleton of image data extracted based on a predetermined condition from time-series image data. For example, the skeleton recognition unit 332 recognizes each part such as the upper part of the spine, the right shoulder, the left shoulder, the right elbow, the left elbow, the right wrist, the left wrist, the right hand, the left hand, and so on.
  • the skeleton recognition unit 332 also calculates the coordinates of the recognized parts in the image data.
  • the skeleton recognition unit 332 associates the recognition/calculation results with identification information for identifying the image data and stores them in the storage unit 320 as the skeleton information 323 . In this way, the skeleton recognition unit 332 acquires information indicating the position of each part of the person in the image data.
  • the parts recognized by the skeleton recognition unit 332 correspond to the trained model 321 (teaching data used when learning the trained model 321). Therefore, the skeleton recognition unit 332 may recognize parts other than those exemplified above according to the trained model 321 .
  • the ground contact determination unit 333 determines which foot is in contact with the ground based on the skeleton of the person indicated by the skeleton information 323. For example, the ground contact determination unit 333 determines which of the right and left feet of the walking person is on the ground based on the degree of change in the coordinates of the right and left ankles included in the skeleton information 323. For example, the ground contact determination unit 333 refers to the skeleton information 323 and acquires information indicating positions such as the coordinates of the ankles. Then, when the degree of change in the coordinates of an ankle satisfies a predetermined condition, the ground contact determination unit 333 determines that the foot having that ankle is in contact with the ground. Further, the ground contact determination unit 333 stores information indicating the determination result in the storage unit 320 as the ground contact determination information 324.
  • FIG. 6 shows an example of changes in the Y coordinates (coordinates in the vertical direction of the image data) of the right and left ankles.
  • the X axis indicates values corresponding to time
  • the Y axis indicates vertical coordinates (Y coordinates) in image data.
  • FIG. 6 shows that the larger the Y-axis value, the lower the right or left ankle appears in the image data (that is, the closer the right or left ankle is to the front of the screen).
  • the ground contact determination unit 333 determines which foot is grounded based on the degree of change in the Y coordinates of the right ankle and the left ankle. In other words, the ground contact determination unit 333 determines that the foot having a given ankle is in contact with the ground when the Y coordinate of that ankle can be evaluated as constant. Specifically, for example, the ground contact determination unit 333 compares the Y coordinate of the ankle in the image data to be determined with the Y coordinate of the ankle in the image data preceding the determination target in the time-series image data.
  • when the displacement between the two falls within a predetermined range, the ground contact determination unit 333 regards the Y coordinate as constant and determines that the foot having the ankle with the constant Y coordinate is in contact with the ground. Note that the ground contact determination unit 333 may determine that the Y coordinate is constant by a method other than the above example. Further, the range of displacement within which the ground contact determination unit 333 regards the Y coordinate as constant may be set arbitrarily.
  • the ground contact determination unit 333 may be configured to make a determination to exclude image data from the ground contact determination when it determines, based on the Y coordinate of an ankle, that the leg other than the leg determined to be in contact with the ground has started to move. For example, in general walking, the left foot begins to move after the right foot touches the ground. After the Y coordinate of the right ankle becomes constant, the ground contact determination unit 333 may make a determination to exclude image data from the ground contact determination at the timing when the Y coordinate of the left ankle starts to increase.
  • for example, when the displacement of the Y coordinate of the left ankle from the previous image data in the time-series image data exceeds the displacement of the Y coordinate of the right ankle from the previous image data, the ground contact determination unit 333 can determine that the left foot has started to move and exclude the image data from the ground contact determination.
  • the contact determination unit 333 may be configured to determine whether or not the foot is contacting the ground by focusing on only the ankle having the larger Y coordinate between the right ankle and the left ankle.
  • the contact determination unit 333 may be configured to refer to the coordinates of the knee in addition to the coordinates of the ankle when determining the contact.
  • the contact determination unit 333 may be configured to make a decision to exclude from the contact decision when the coordinate change of the knee satisfies a predetermined condition.
  • the ground contact determination unit 333 determines whether the displacement of the Y coordinate of the knee from the previous image data in the time-series image data is 0 or less for the leg to be determined whether or not it is grounded. , it may be configured to make a determination to exclude it from the ground contact determination.
  • the contact determination unit 333 makes contact determination based on the degree of change in the Y coordinate of the ankle.
  • the contact determination unit 333 may be configured to perform contact determination based on the degree of change in the skeleton other than the above example, such as using the Y coordinate of the toe portion.
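The ground contact determination described above (a foot is regarded as grounded while its ankle's Y coordinate stays effectively constant, and a frame is excluded once the other ankle clearly starts to move) can be sketched as follows; the function name, dictionary layout, and pixel tolerance are illustrative assumptions, not values taken from the disclosure:

```python
def determine_contact(prev_y, curr_y, tol=2.0):
    """Decide which foot is grounded in the current frame.

    prev_y / curr_y map 'left' and 'right' to the ankle's Y coordinate in
    the previous and current image data. A foot is judged grounded while
    its ankle's Y coordinate is effectively constant; the frame is excluded
    ('-') when the other ankle has clearly started to move. The pixel
    tolerance `tol` for "constant" is an assumed value.
    """
    d = {s: abs(curr_y[s] - prev_y[s]) for s in ("left", "right")}
    grounded = [s for s in ("left", "right") if d[s] <= tol]
    if not grounded:
        return "-"                       # neither ankle is constant
    foot = min(grounded, key=lambda s: d[s])
    other = "right" if foot == "left" else "left"
    if d[other] > d[foot] and d[other] > tol:
        return "-"                       # the other leg has started to move
    return foot
```

Running this per frame over the time-series ankle coordinates yields a sequence of "left" / "right" / "-" results corresponding to the ground contact determination information 324.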
  • the ground contact timing determination unit 334 determines the timing at which the foot touches the ground based on the placement of the foot determined based on the skeleton of the person indicated by the skeleton information 323 . For example, the contact timing determination unit 334 determines the timing at which the foot contacts the ground based on the change in the contacting foot indicated by the contact determination information 324 . Then, the contact timing determination section 334 stores the decision result as the contact timing information 325 in the storage section 320 .
  • the ground contact timing determination unit 334 determines that the foot has touched the ground at the timing when the grounded foot indicated by the ground contact determination result is switched to another foot. For example, referring to FIG. 7, the "left foot” is grounded in the identification information "124" and “125", and is excluded from the grounding determination in the identification information "126" and "127". After that, it is determined that the "right foot” is on the ground with the identification information "128". That is, in the case of FIG. 7, the foot on the ground is switched from the left foot to the right foot at the timing of the identification information "128". Therefore, the ground contact timing determination unit 334 determines that the right foot has grounded at the timing of the identification information "128".
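The switch detection performed by the ground contact timing determination unit can be sketched as follows; the function name and the representation of the per-frame determination results are illustrative assumptions:

```python
def find_contact_timings(results):
    """Given per-frame ground contact determination results in time order
    ('left', 'right', or '-' for frames excluded from the determination),
    return (frame index, foot) pairs at which the grounded foot switches
    to the other foot, i.e. the ground contact timings."""
    timings = []
    last_foot = None
    for i, r in enumerate(results):
        if r == "-":
            continue                     # excluded frames do not change the state
        if last_foot is not None and r != last_foot:
            timings.append((i, r))       # foot r touched the ground at frame i
        last_foot = r
    return timings
```

For the sequence described for FIG. 7 ("left", "left", "-", "-", "right"), this sketch reports that the right foot touched the ground at the fifth frame.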
  • the output unit 335 outputs contact timing information 325, contact determination information 324, and the like.
  • the output unit 335 can output at least one of the above information to an external device, the smart phone 200, or the like.
  • the ground contact determination unit 333 compares the Y coordinate of the knee in the image data to be determined with the Y coordinate of the knee in the image data preceding the determination target in the time-series image data (step S101). If the displacement of the knee from its Y coordinate in the previous image data exceeds 0 (step S101, Yes), the ground contact determination unit 333 checks whether or not the displacement of the ankle satisfies the condition (step S102).
  • if the displacement of the ankle satisfies the condition (step S102, Yes), the ground contact determination unit 333 determines that the foot satisfying the condition is in contact with the ground (step S103). On the other hand, if the displacement of the Y coordinate of the knee is 0 or less (step S101, No), or if the displacement of the ankle does not satisfy the condition (step S102, No), the ground contact determination unit 333 does not determine that the foot is grounded.
  • the above is an operation example of the contact determination unit 333 .
  • in the check of step S102, for example, when it has not been determined that the leg other than the leg determined to be on the ground has started to move, or when the coordinate change of the knee satisfies the condition, the ground contact determination unit 333 can determine that the displacement of the ankle satisfies the condition. Further, the ground contact determination unit 333 may be configured to identify the ankle of interest based on the Y coordinates of the ankles and perform the above-described check on the identified ankle.
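Steps S101 to S103 described above can be sketched as follows; the tolerance value and the concrete interpretation of the ankle condition are assumptions, since the disclosure leaves them open:

```python
def contact_decision(knee_disp, ankle_disp, ankle_tol=2.0):
    """Sketch of steps S101-S103.

    S101: the knee's Y displacement from the previous image data must
          exceed 0; otherwise the frame is excluded from the decision.
    S102: the ankle displacement must satisfy the condition (assumed
          here to mean staying within a small pixel tolerance).
    S103: if both hold, the foot is determined to be grounded.
    """
    if knee_disp <= 0:                  # step S101, No: exclude
        return False
    if abs(ankle_disp) > ankle_tol:     # step S102, No: condition not met
        return False
    return True                         # step S103: foot is grounded
```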
  • the ground contact timing determination unit 334 refers to the ground contact determination information 324 and checks whether or not the grounded foot indicated by the ground contact determination result has changed (step S201).
  • if the grounded foot indicated by the determination result has changed (step S201, Yes), the ground contact timing determination unit 334 determines that the timing of the change is the ground contact timing (step S202). On the other hand, if the grounded foot indicated by the determination result has not changed (step S201, No), the ground contact timing determination unit 334 does not make the above determination.
  • the above is a processing example of the contact timing determination unit 334 .
  • the determination device 300 has the skeleton recognition section 332 , the contact determination section 333 and the contact timing determination section 334 .
  • according to the above configuration, the contact timing determination unit 334 can determine the timing at which the foot touches the ground based on the determination result of the grounded foot made by the contact determination unit 333 using the coordinates of the skeleton recognized by the skeleton recognition unit 332. That is, according to the above configuration, it is possible to grasp the timing at which the foot touches the ground based on image data.
  • determination device 300 may be configured to determine the contact timing based on the result of skeleton recognition by an external device that recognizes the skeleton. When configured in this manner, determination device 300 does not need to have the function of skeleton recognition section 332 .
  • FIG. 10 is a diagram showing a configuration example of the calculation system 400.
  • FIG. 11 is a block diagram showing a configuration example of the calculation device 500.
  • FIG. 12 is a diagram showing an example of hand length and height.
  • FIG. 13 is a flow chart showing an operation example of the computing device 500 .
  • a calculation system 400 that calculates an index value indicating the bending of a person's waist in image data based on image data acquired using an imaging device such as the smartphone 200 will be described.
  • the calculation system 400 acquires image data showing a person walking from the back side to the front side of the screen, and recognizes the position of the skeleton from the image data. Then, the calculation system 400 calculates an index value indicating bending of the waist based on the recognition result.
  • for example, the calculation system 400 acquires a hand length, which is the sum of the in-image distances of the skeletal segments connecting the parts existing between the left and right hands, and a height, which is the in-image distance from the head to the toes.
  • the calculation system 400 calculates an index value indicating the bending of the waist based on the acquired hand length and height. Here, while both the hand length and the height correspond to the person's stature, the height changes as the waist bends, whereas the hand length does not change even when the waist is bent. Also, in general, the length of a person when both hands are spread out to the left and right corresponds to the person's height.
  • the calculation system 400 calculates an index value indicating the bending of the waist by using this relationship.
  • the calculation system 400 may be configured to add a predetermined correction value corresponding to attributes of the person, such as age, height, and gender, to the hand length, and then calculate the index value indicating the bending of the waist based on the corrected hand length and the height.
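The calculation using the relationship above can be sketched as follows; the concrete formula (a ratio of measured height to corrected hand length) and the function name are illustrative assumptions, since the disclosure does not fix a formula:

```python
def waist_bending_index(height_px, hand_length_px, correction=0.0):
    """Sketch of an index value indicating waist bending.

    The hand length approximates the person's stature and is unaffected
    by bending, while the measured height shrinks as the waist bends.
    An optional correction value, chosen according to attributes such as
    age or gender, is added to the hand length. The ratio below is an
    assumed formula: a value near 1.0 suggests an upright posture, and
    smaller values suggest a more bent waist.
    """
    corrected = hand_length_px + correction
    return height_px / corrected
```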
  • calculation system 400 described in this embodiment can have the function of the determination system 100 described in the first embodiment. That is, the calculation system 400 may have a configuration for determining the contact timing.
  • FIG. 10 shows a configuration example of the calculation system 400.
  • the calculation system 400 includes, for example, a smart phone 200 and a calculation device 500 .
  • the smartphone 200 and the computing device 500 are connected, for example, wirelessly or by wire so that they can communicate with each other.
  • the configuration of the smartphone 200 is the same as in the first embodiment. Therefore, description of smartphone 200 is omitted.
  • the calculation device 500 is an information processing device that calculates an index value indicating bending of the waist based on image data acquired by the smartphone 200 .
  • the computing device 500 is a server device or the like.
  • the computing device 500 may be a single information processing device, or may be implemented on a cloud, for example.
  • FIG. 11 shows a configuration example of the calculation device 500.
  • the calculation device 500 has, for example, a communication I/F section 510, a storage section 520, and an arithmetic processing section 530 as main components.
  • the communication I/F unit 510 consists of a data communication circuit. Communication I/F unit 510 performs data communication with an external device, smartphone 200, or the like connected via a communication line.
  • the storage unit 520 is a storage device such as a hard disk or memory.
  • the storage unit 520 stores processing information and programs 527 necessary for various processes in the arithmetic processing unit 530 .
  • the program 527 realizes various processing units by being read into the arithmetic processing unit 530 and executed.
  • the program 527 is read in advance from an external device or recording medium via a data input/output function such as the communication I/F unit 510 and stored in the storage unit 520 .
  • Main information stored in the storage unit 520 includes, for example, a trained model 521, image information 522, skeleton information 523, hand length information 524, height information 525, waist bending value information 526, and the like.
  • the learned model 521, image information 522, and skeleton information 523 are the same information as the learned model 321, image information 322, and skeleton information 323 described in the first embodiment.
  • the hand length information 524 indicates the hand length, which is the sum of the in-image distances of the skeletal lengths connecting the parts existing between the left and right hands, such as palms, wrists, elbows, and shoulders.
  • the hand length information 524 is, for example, calculated in advance based on image data acquired with the hands extended downward so that the hand length is longest, acquired from an external device or the like via the communication I/F unit 510, and stored in the storage unit 520.
  • the hand length information 524 may be generated and updated as a result of calculation processing by the hand length calculation unit 533.
  • the hand length information 524 includes information indicating the hand length.
  • the hand length information 524 may include information indicating the hand length acquired for each piece of time-series image data.
  • the hand length information 524 may be information in which identification information indicating the image data used when calculating the hand length is associated with information indicating the hand length.
  • the height information 525 indicates the height of the person in the image data, which is the intra-image distance connecting the head (forehead) and the toes (for example, the lower one of the left and right).
  • the height information 525 is generated/updated as a result of calculation processing by the height calculator 534, for example.
  • the height information 525 includes information indicating the height acquired for each time-series image data.
  • the height information 525 may be, for example, a correspondence between identification information indicating image data used when calculating the height and information indicating the height.
  • the height information 525 may be information that associates identification information indicating image data that satisfies a predetermined condition with information indicating height.
  • the waist curvature value information 526 indicates the waist curvature value, which is an index value indicating the curvature of the waist.
  • the waist curvature value information 526 is generated and updated as a result of calculation processing by the curvature calculator 535, for example.
  • the waist curvature value information 526 includes information indicating the waist curvature value.
  • the waist curvature value information 526 may include information indicating the waist curvature value acquired for each time-series image data.
  • the waist curvature value information 526 may be, for example, a correspondence between identification information indicating image data used in calculating the waist curvature value and information indicating the waist curvature value.
  • the waist curvature value information 526 may be information that associates identification information indicating image data that satisfies a predetermined condition with information indicating the waist curvature value.
  • the arithmetic processing unit 530 has an arithmetic device such as a CPU and its peripheral circuits.
  • the arithmetic processing unit 530 reads the program 527 from the storage unit 520 and executes it, so that the hardware and the program 527 cooperate to realize various processing units.
  • Main processing units realized by the arithmetic processing unit 530 include, for example, an image acquisition unit 531, a skeleton recognition unit 532, a hand length calculation unit 533, a height calculation unit 534, a bend calculation unit 535, and an output unit 536 .
  • the image acquisition unit 531 and skeleton recognition unit 532 can perform the same processing as the image acquisition unit 331 and skeleton recognition unit 332 described in the first embodiment.
  • the hand length calculation unit 533 calculates the hand length, which is the sum of the in-image distances of the skeletal lengths connecting the parts existing between the left and right hands. Then, the hand length calculation unit 533 stores the calculated hand length as the hand length information 524 in the storage unit 520.
  • FIG. 12 shows an example of the hand length.
  • hand length calculation unit 533 obtains information indicating the positions of parts such as palms, wrists, elbows, and shoulders between the left and right hands by referring to skeleton information 523 . Then, based on the acquired information, the hand length calculation unit 533 calculates the total sum of the distances between adjacent parts when connecting the adjacent parts with a straight line. For example, as described above, the hand length calculator 533 calculates the hand length based on the position of each part between the left and right hands.
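As an illustrative sketch (the joint names and the coordinate layout here are assumptions for illustration, not part of the embodiment), summing the straight-line distances between adjacent parts on the chain from the left hand to the right hand might look like this:

```python
import math

# Assumed chain of parts between the left and right hands; the actual part
# set produced by the skeleton recognition unit may differ.
HAND_CHAIN = ["left_palm", "left_wrist", "left_elbow", "left_shoulder",
              "right_shoulder", "right_elbow", "right_wrist", "right_palm"]

def hand_length(skeleton):
    """Sum of in-image distances between adjacent parts on the hand chain.

    skeleton maps a part name to its (x, y) position in the image.
    """
    total = 0.0
    for a, b in zip(HAND_CHAIN, HAND_CHAIN[1:]):
        (xa, ya), (xb, yb) = skeleton[a], skeleton[b]
        total += math.hypot(xa - xb, ya - yb)
    return total
```

Because each pair of adjacent parts is connected by a straight line, the result is the polyline length described in the text, measured in image pixels.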
  • the hand length calculation unit 533 can perform the above-described hand length calculation processing for each piece of time-series image data.
  • the hand length calculation unit 533 may be configured to calculate the hand length for each piece of image data.
  • the hand length calculation unit 533 may be configured to calculate the hand length in image data extracted under arbitrary conditions from the time-series image data.
  • the hand length calculation unit 533 may calculate the hand length using parts other than those exemplified above. Also, the information indicating the hand length can be calculated and acquired in advance. If the information indicating the hand length is acquired in advance, the calculation device 500 does not need to have the hand length calculation unit 533.
  • the height calculation unit 534 calculates the height, which is the in-image distance from the head to the toes, based on the skeleton information 523 . Height calculator 534 then stores the calculated height as height information 525 in storage 520 .
  • FIG. 12 shows an example of height.
  • the height calculation unit 534 obtains information indicating the positions of the forehead, which is the highest part, and the left and right toes by referring to the skeleton information 523. Then, the height calculation unit 534 calculates the distance obtained by connecting the forehead and the lower one of the left and right toes with a straight line based on the acquired information. Alternatively, the height calculation unit 534 calculates the difference in the Y-axis direction (vertical direction in FIG. 12) between the forehead and the lower one of the left and right toes based on the acquired information. For example, as described above, the height calculation unit 534 calculates the height based on the positions of the head and feet.
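A minimal sketch of this step (the part names are assumptions; image coordinates are assumed to have y growing downward, so the lower toe is the one with the larger y value):

```python
import math

def height_in_image(skeleton):
    """In-image height: straight-line distance from the forehead to the
    lower of the left and right toes.

    skeleton maps a part name to its (x, y) image position. The Y-axis
    difference variant mentioned in the text would return
    abs(toe_y - fy) instead of the straight-line distance.
    """
    fx, fy = skeleton["forehead"]
    # "Lower" in the image = larger y under the assumed coordinate system.
    toe_x, toe_y = max(skeleton["left_toe"], skeleton["right_toe"],
                       key=lambda p: p[1])
    return math.hypot(toe_x - fx, toe_y - fy)
```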
  • the height calculation unit 534 can perform the above-described height calculation processing on each piece of time-series image data, similar to the hand length calculation unit 533.
  • the height calculator 534 may be configured to calculate the height of each piece of image data.
  • the height calculator 534 may be configured to calculate the height of image data extracted under arbitrary conditions from time-series image data.
  • the height calculation unit 534 may calculate the height using parts other than those exemplified above. For example, the height calculation unit 534 may calculate the length between parts including the waist between the neck and the ankles as the height.
  • the curvature calculation unit 535 calculates a waist curvature value, which is an index value indicating the curvature of the waist, based on the hand length indicated by the hand length information 524 and the height indicated by the height information 525. Then, the curvature calculation unit 535 stores the calculated waist curvature value in the storage unit 520 as the waist curvature value information 526.
  • the curvature calculation unit 535 calculates the waist curvature value by dividing the hand length by the height. That is, the curvature calculation unit 535 calculates the waist curvature value by calculating (hand length)/(height). For example, the curvature calculation unit 535 can calculate the waist curvature value using the hand length and height calculated for each image data.
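The division described above can be sketched as follows (the function and parameter names are illustrative, not from the embodiment):

```python
def waist_curvature(hand_length_px, height_px):
    """Waist curvature index value: (hand length) / (height).

    Both inputs are in-image distances taken from the same frame, so the
    person's distance from the camera cancels out of the ratio.
    """
    if height_px <= 0:
        raise ValueError("height must be positive")
    return hand_length_px / height_px
```

Since the hand length is measured with the arms extended so that it is longest, a smaller in-image height relative to the hand length (a more bent waist) yields a larger index value.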
  • the hand length used by the curvature calculation unit 535 to calculate the waist curvature value may be a hand length selected from among the hand lengths included in the hand length information 524 according to a predetermined criterion. The criterion may be set arbitrarily, such as using the longest hand length.
  • the hand length information 524 may include only one previously acquired hand length. In that case, the hand length used when the curvature calculation unit 535 calculates the waist curvature value may be, for example, a constant value.
  • the curvature calculation unit 535 may use the determination result of the contact timing described in the first embodiment to specify image data for which the waist curvature value is to be calculated.
  • the curvature calculation unit 535 may be configured to calculate the waist curvature value using the height acquired based on the image data determined to be the contact timing.
  • the curvature calculation unit 535 may be configured to calculate the waist curvature value using the height acquired based on image data for a predetermined number of frames from the image data determined to be the contact timing. By calculating the waist curvature value based on image data specified from the determination result of the contact timing in this way, an index that can be compared with other people is obtained, and a more appropriate waist curvature value can be calculated.
  • the curvature calculation unit 535 may be configured to calculate a waist curvature value for each piece of time-series image data and to calculate an average value of the calculated waist curvature values.
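The per-frame calculation and averaging just described can be sketched as follows (an assumed pairing of per-frame hand lengths and heights, not the embodiment's actual data layout):

```python
def average_waist_curvature(hand_lengths, heights):
    """Per-frame waist curvature values and their mean.

    hand_lengths and heights are sequences of in-image distances, assumed
    to be paired frame by frame across the time-series image data.
    """
    values = [hl / h for hl, h in zip(hand_lengths, heights)]
    return sum(values) / len(values)
```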
  • the output unit 536 outputs hand length information 524, height information 525, waist bending value information 526, and the like.
  • the output unit 536 can output at least one of the above information to an external device, the smart phone 200, or the like.
  • the above is an example of the configuration of the calculation device 500. Next, an operation example of the calculation device 500 will be described with reference to FIG. 13 .
  • the curvature calculation unit 535 acquires the hand length indicated by the hand length information 524 and the height indicated by the height information 525 (step S301).
  • the curvature calculation unit 535 calculates a waist curvature value, which is an index value indicating the curvature of the waist, based on the hand length and height (step S302). For example, the curvature calculation unit 535 calculates the waist curvature value by dividing the hand length by the height.
  • the above is an operation example of the calculation device 500 .
  • the calculation device 500 has the bend calculation section 535 . According to such a configuration, the calculation device 500 can calculate the waist bending value based on the acquired hand length and height. That is, according to the above configuration, it is possible to calculate the index value corresponding to the bending of the waist based on the image data.
  • calculation device 500 can have the function of the determination device 300 described in the first embodiment. Further, the calculation device 500 can have modifications similar to the determination device 300 described in the first embodiment.
  • FIG. 14 is a diagram showing a configuration example of the calculation system 600.
  • FIG. 15 is a block diagram showing a configuration example of the calculation device 700.
  • FIG. 16 is a diagram showing an example of height. FIGS. 17 and 18 are diagrams showing examples of foot components.
  • FIG. 19 is a flow chart showing an operation example of the computing device 700 .
  • a calculation system 600 that calculates an index value indicating the state of a target of interest based on image data acquired using an imaging device such as the smartphone 200 will be described.
  • the calculation system 600 acquires image data showing how a person walks from the back side to the front side of the screen, and recognizes the positions of the skeleton from the image data.
  • Calculation system 600 then calculates an index value indicating the state of the object based on the recognition result.
  • an index value indicating the state of the toe is calculated.
  • the calculation system 600 calculates and acquires the difference in the X-axis direction and the difference in the Y-axis direction between the positions of the ankle and the tip of the foot, which are parts present in the foot of interest, together with the comparative length to be compared against.
  • the calculation system 600 calculates an index value indicating the direction of the toe based on the difference in the X-axis direction and the comparative length, and an index value indicating the rise of the toe based on the difference in the Y-axis direction and the comparative length.
  • by comparing the difference in the X-axis direction and the difference in the Y-axis direction between the positions of the ankle and the toe with the comparative length, which is the length to be compared, the calculation system 600 makes it possible to calculate index values that address the above problem.
  • the calculation system 600 may use the difference in the X-axis direction and the difference in the Y-axis direction of the positions of parts existing in the object of interest other than the ankles and toes.
  • the comparative length is a length between parts whose length in the image data varies depending on the proximity of the person to the camera, which is the imaging device, when the image data is acquired, but which is constant for the person.
  • as the comparative length, for example, one of the height, which is the in-image distance from the head to the toes described in the second embodiment (the bending of the waist is not considered), the hand length, the second height, which is the in-image distance from the head to the pelvis, and so on can be used.
  • the comparative length may be a length between sites other than those exemplified above.
  • the calculation system 600 described in this embodiment can have the functions of the determination system 100 described in the first embodiment and the calculation system 400 described in the second embodiment.
  • the calculation system 600 may have a configuration for determining contact timing, a configuration for calculating a waist bending value, and the like.
  • FIG. 14 shows a configuration example of the calculation system 600.
  • the calculation system 600 has, for example, a smart phone 200 and a calculation device 700 .
  • the smartphone 200 and the computing device 700 are connected, for example, wirelessly or by wire so that they can communicate with each other.
  • the configuration of the smartphone 200 is the same as in the first embodiment. Therefore, description of smartphone 200 is omitted.
  • the calculation device 700 is an information processing device that calculates, based on the image data acquired by the smartphone 200, an index value indicating the toe lift and the toe direction.
  • the computing device 700 is a server device or the like.
  • the computing device 700 may be a single information processing device, or may be implemented on a cloud, for example.
  • Calculation device 700 may calculate only one of the index value indicating the toe-up and the index value indicating the direction of the toe.
  • FIG. 15 shows a configuration example of the calculation device 700.
  • the calculation device 700 has, for example, a communication I/F section 710, a storage section 720, and an arithmetic processing section 730 as main components.
  • the communication I/F unit 710 consists of a data communication circuit. Communication I/F unit 710 performs data communication with an external device, smartphone 200, or the like connected via a communication line.
  • the storage unit 720 is a storage device such as a hard disk or memory.
  • the storage unit 720 stores processing information and programs 727 necessary for various processes in the arithmetic processing unit 730 .
  • the program 727 realizes various processing units by being read into the arithmetic processing unit 730 and executed.
  • the program 727 is read in advance from an external device or recording medium via a data input/output function such as the communication I/F unit 710 and stored in the storage unit 720 .
  • Main information stored in the storage unit 720 includes, for example, a trained model 721, image information 722, skeleton information 723, comparative length information 724, foot component information 725, toe data information 726, and the like.
  • the learned model 721, image information 722, and skeleton information 723 are the same information as the learned model 321, image information 322, and skeleton information 323 described in the first embodiment.
  • the comparative length information 724 indicates the comparative length, which is a length between parts whose length in the image data varies depending on the proximity of the person to the camera, which is the imaging device, when the image data is acquired, but which is constant for the person.
  • the comparative length information 724 indicates, for example, one of the height, which is the in-image distance from the head to the toes, the hand length, the second height, which is the in-image distance from the head to the pelvis, and so on.
  • the comparative length information 724 is generated and updated as a result of calculation processing by the comparative length calculator 733, for example.
  • the comparison length information 724 includes information indicating the comparison length acquired for each time-series image data.
  • the comparison length information 724 may be information that associates identification information indicating image data used when calculating the comparison length with information indicating the comparison length.
  • the foot component information 725 indicates inter-part lengths such as the X component, which is the difference between the positions of the ankle and the tip of the foot in the X-axis direction, and the Y component, which is the difference between the positions of the ankle and the tip of the foot in the Y-axis direction.
  • the foot component information 725 is generated and updated as a result of calculation processing by the foot component calculator 734, for example. Note that the foot component information 725 may include only one of the X component and the Y component.
  • the foot component information 725 includes information indicating the X component and the Y component acquired for each time-series image data.
  • the foot component information 725 may be information that associates identification information indicating the image data used when calculating the X component and the Y component with information indicating the X component and the Y component.
  • the toe data information 726 indicates an index value indicating the rise of the toe and an index value indicating the direction of the toe.
  • the toe data information 726 is generated and updated as a result of calculation processing by the toe data calculator 735, for example.
  • the toe data information 726 includes an index value indicating the toe lift and an index value indicating the direction of the toe.
  • the toe data information 726 includes an index value indicating the rise of the toe and an index value indicating the direction of the toe acquired for each time-series image data.
  • the toe data information 726 may be, for example, information in which identification information indicating the image data used when calculating the index values is associated with information indicating the index value indicating the rise of the toe or the index value indicating the direction of the toe.
  • the toe data information 726 may be information that associates identification information indicating image data that satisfies a predetermined condition with information indicating the index value indicating the rise of the toe or the index value indicating the direction of the toe.
  • the arithmetic processing unit 730 has an arithmetic device such as a CPU and its peripheral circuits.
  • the arithmetic processing unit 730 reads the program 727 from the storage unit 720 and executes it, thereby realizing various processing units by cooperating the hardware and the program 727 .
  • Main processing units realized by the arithmetic processing unit 730 include, for example, an image acquisition unit 731, a skeleton recognition unit 732, a comparative length calculation unit 733, a foot component calculation unit 734, a toe data calculation unit 735, an output unit 736, and so on.
  • the image acquisition unit 731 and skeleton recognition unit 732 can perform the same processing as the image acquisition unit 331 and skeleton recognition unit 332 described in the first embodiment.
  • the comparative length calculation unit 733 calculates the comparative length, which is one of the height (the in-image distance from the head to the toes), the hand length, the second height (the in-image distance from the head to the pelvis), and the like. Then, the comparative length calculation unit 733 stores the calculated comparative length as the comparative length information 724 in the storage unit 720.
  • FIG. 16 shows an example of height, which is an example of comparative length.
  • the comparative length calculation unit 733 obtains information indicating the positions of the forehead, which is the highest part, and the left and right toes by referring to the skeleton information 723. Then, the comparative length calculation unit 733 calculates the distance obtained by connecting the forehead and the lower one of the left and right toes with a straight line based on the acquired information. Alternatively, the comparative length calculation unit 733 calculates the difference in the Y-axis direction (vertical direction in FIG. 16) between the forehead and the lower one of the left and right toes based on the acquired information.
  • the comparative length calculator 733 calculates the height based on the positions of the head and feet.
  • the comparison length calculation unit 733 may be configured to calculate, as a comparison length, a hand length, a second height that is an in-image distance from the head to the pelvis, and the like, through similar processing.
  • the foot component calculation unit 734 refers to the skeleton information 723 to calculate at least one of the inter-part lengths: the X component, which is the difference between the positions of the ankle and the tip of the foot in the X-axis direction, and the Y component, which is the difference between the positions of the ankle and the tip of the foot in the Y-axis direction. Then, the foot component calculation unit 734 stores the calculated foot component as the foot component information 725 in the storage unit 720.
  • FIG. 17 shows an example of the Y component.
  • the foot component calculation unit 734 obtains information indicating the positions of the ankle and the toe by referring to the skeleton information 723. Then, the foot component calculation unit 734 calculates the difference between the ankle and the toe in the Y-axis direction (vertical direction in FIG. 17) based on the acquired information. For example, as described above, the foot component calculation unit 734 calculates the Y component based on the positions of the ankle and the tip of the foot.
  • FIG. 18 shows an example of the X component.
  • the foot component calculation unit 734 obtains information indicating the positions of the ankle and the toe by referring to the skeleton information 723. Then, the foot component calculation unit 734 calculates the difference between the ankle and the toe in the X-axis direction (horizontal direction in FIG. 18) based on the acquired information. For example, as described above, the foot component calculation unit 734 calculates the X component based on the positions of the ankle and the toe.
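A minimal sketch of both component calculations (the part names "ankle" and "toe_tip" are assumed for illustration):

```python
def foot_components(skeleton):
    """X component and Y component of the ankle-to-toe-tip difference.

    skeleton maps a part name to its (x, y) image position. The X
    component is the horizontal difference (FIG. 18) and the Y component
    the vertical difference (FIG. 17); absolute values are taken so the
    components do not depend on walking direction.
    """
    ax, ay = skeleton["ankle"]
    tx, ty = skeleton["toe_tip"]
    return abs(tx - ax), abs(ty - ay)
```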
  • the foot component calculation unit 734 may be configured, for example, to calculate the Y component and the X component of each of the left and right feet, or to calculate only the Y component and the X component of whichever of the left and right feet is positioned lower. Further, the foot component calculation unit 734 may be configured to determine whether or not to calculate the foot components using the determination result of the contact timing described in the first embodiment. For example, the foot component calculation unit 734 can be configured to calculate the Y component and the X component based on the image data determined to be the contact timing. The foot component calculation unit 734 may also be configured to calculate the Y component and the X component based on image data for a predetermined number of frames from the image data determined to be the contact timing.
  • the toe data calculation unit 735 calculates an index value indicating the rise of the toe based on the comparative length indicated by the comparative length information 724 and the Y component indicated by the foot component information 725. Also, the toe data calculation unit 735 calculates an index value indicating the direction of the toe based on the comparative length indicated by the comparative length information 724 and the X component indicated by the foot component information 725. Then, the toe data calculation unit 735 stores the calculated index value indicating the rise of the toe and the calculated index value indicating the direction of the toe in the storage unit 720 as the toe data information 726.
  • the toe data calculation unit 735 calculates an index value indicating the rise of the toe by dividing the Y component by the comparison length. That is, the toe data calculation unit 735 calculates the index value indicating the rise of the toe by calculating (Y component)/(comparative length). For example, the toe data calculation unit 735 can calculate an index value indicating the rise of the toe using the Y component and the comparison length calculated for each image data.
  • the toe data calculation unit 735 calculates an index value indicating the direction of the toe by dividing the X component by the comparison length. That is, the toe data calculator 735 calculates an index value indicating the direction of the toe by calculating (X component)/(comparative length). For example, the toe data calculation unit 735 can calculate an index value indicating the direction of the toe using the X component and the comparison length calculated for each image data.
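The two divisions just described can be sketched together (function and parameter names are illustrative, not from the embodiment):

```python
def toe_indices(x_component, y_component, comparative_length):
    """Index values per the text: direction = X / comparative length,
    rise = Y / comparative length.

    Dividing by a length that is constant for the person removes the
    dependence on how close the person is to the camera.
    """
    if comparative_length <= 0:
        raise ValueError("comparative length must be positive")
    return (x_component / comparative_length,
            y_component / comparative_length)
```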
  • the toe data calculation unit 735 may use the determination result of the contact timing described in the first embodiment to specify the image data for which the index value indicating the rise of the toe and the index value indicating the direction of the toe are calculated. For example, the toe data calculation unit 735 may calculate these index values using the Y component and the X component obtained based on the image data determined to be the contact timing, together with the comparative length.
  • the toe data calculation unit 735 may be configured to calculate the index value indicating the rise of the toe and the index value indicating the direction of the toe using the Y component and the X component obtained based on image data for a predetermined number of frames from the image data determined to be the contact timing, together with the comparative length. In this way, by performing the calculation based on image data specified from the determination result of the contact timing, it becomes possible to evaluate the rise of the toe and the direction of the toe at the moment the foot touches the ground.
  • the output unit 736 outputs comparative length information 724, foot component information 725, toe data information 726, and the like.
  • the output unit 736 can output at least one of the above information to an external device, the smartphone 200, or the like.
  • the above is a configuration example of the calculation device 700 .
  • an operation example of the calculation device 700 will be described with reference to FIG. 19 .
  • the toe data calculation unit 735 acquires the comparative length indicated by the comparative length information and the Y component and the X component indicated by the foot component information 725 (step S401).
  • the toe data calculation unit 735 calculates toe data based on the comparison length and the Y component and the X component. For example, the toe data calculation unit 735 calculates an index value indicating the rise of the toe based on the comparison length and the Y component. Also, for example, the toe data calculation unit 735 calculates an index value indicating the direction of the toe based on the comparison length and the X component.
  • the above is an operation example of the calculation device 700 .
  • the computing device 700 has a toe data computing section 735.
  • the calculation device 700 can calculate the index value indicating the toe-up and the index value indicating the direction of the toe based on the obtained comparison length and the Y component and the X component.
  • the calculation device 700 may be configured to calculate, as the Y component and the X component, the lengths between parts existing in the target of interest other than the ankle and the toe.
  • the calculation device 700 can have a function as the determination device 300 described in the first embodiment. Further, the calculation device 700 can have modifications similar to the determination device 300 described in the first embodiment. Similarly, the computing device 700 can have the function of the computing device 500 described in the second embodiment. Also, the computing device 700 can have modifications similar to the computing device 500 described in the second embodiment.
  • Next, with reference to FIG. 20, an overview of the configuration of a computing device 800, which is an information processing device, will be described.
  • FIG. 20 shows a hardware configuration example of the computing device 800 .
  • the computing device 800 has, as an example, the following hardware configuration.
  • CPU (Central Processing Unit) 801 (arithmetic unit)
  • ROM (Read Only Memory) 802 (storage device)
  • RAM (Random Access Memory) 803 (storage device)
  • Program group 804 loaded into the RAM 803
  • Storage device 805 that stores the program group 804
  • Drive device 806 that reads and writes a recording medium 810 external to the information processing device
  • Communication interface 807 that connects to a communication network 811 outside the information processing device
  • Input/output interface 808 that inputs and outputs data
  • Bus 809 that connects the components
  • The calculation device 800 can realize the functions of the height calculation unit 821 and the curvature calculation unit 822 shown in FIG.
  • The program group 804 is, for example, stored in advance in the storage device 805 or the ROM 802, and is loaded into the RAM 803 and executed by the CPU 801 as necessary.
  • The program group 804 may be supplied to the CPU 801 via the communication network 811, or may be stored in advance in the recording medium 810, read out by the drive device 806, and supplied to the CPU 801.
  • The hardware configuration of the calculation device 800 is not limited to the case described above. For example, the calculation device 800 may consist of only some of the components described above, such as a configuration without the drive device 806.
  • The height calculation unit 821 calculates, based on image data, the height of the person in the image data.
  • The curvature calculation unit 822 acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit 821.
  • In this way, the calculation device 800 has the height calculation unit 821 and the curvature calculation unit 822, and the curvature calculation unit 822 is configured to acquire the hand length.
  • With this configuration, the curvature calculation unit 822 can calculate an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit 821. As a result, it is possible to calculate an index value corresponding to the bending of the waist based on the image data.
  • An information processing device such as the calculation device 800 described above can be realized by installing a predetermined program in the information processing device.
  • Specifically, a program according to another aspect of the present invention is a program for causing an information processing device to realize processing of calculating, based on image data, the height of a person in the image data, acquiring a hand length that is the length connecting the left and right hands of the person in the image data, and calculating an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the calculated height.
  • Further, a calculation method according to another aspect is a method in which an information processing device calculates, based on image data, the height of a person in the image data, acquires a hand length that is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the calculated height.
  • An invention of a program or a calculation method having the above configuration has the same action and effect as the calculation device 800 having the curvature calculation unit 822 described above, and can therefore achieve the above-described object.
  • The calculation device 800 may instead be configured to realize the functions of a component calculation unit 831, a comparison length calculation unit 832, and an index value calculation unit 833 shown in FIG.
  • The component calculation unit 831 calculates, based on image data, an inter-part length between parts present in a target of interest of the person in the image data.
  • The comparison length calculation unit 832 calculates, based on the image data, a comparison length that serves as the object of comparison.
  • The index value calculation unit 833 calculates an index value indicating the state of the target of interest based on the inter-part length calculated by the component calculation unit 831 and the comparison length calculated by the comparison length calculation unit 832.
  • In this way, the calculation device 800 can have the component calculation unit 831, the comparison length calculation unit 832, and the index value calculation unit 833.
  • With this configuration, the index value calculation unit 833 can calculate an index value indicating the state of the target of interest based on the inter-part length calculated by the component calculation unit 831 and the comparison length calculated by the comparison length calculation unit 832. As a result, it becomes possible to calculate an index value corresponding to the state of the target of interest based on the image data.
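  • As a concrete illustration of the three units above, the following is a minimal sketch, not the claimed implementation: the part names and coordinates are hypothetical, and taking the person's height in the image as the comparison length is an assumption for illustration.

```python
import math

def part_length(p1, p2):
    """Euclidean length between two part coordinates (x, y) in the image."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Hypothetical part coordinates (pixels) recognized from one image frame.
parts = {
    "ankle": (210.0, 560.0),
    "toe": (230.0, 575.0),
    "head": (200.0, 80.0),
    "foot": (215.0, 585.0),
}

# Component calculation unit 831: inter-part length within the target of interest.
inter_part_length = part_length(parts["ankle"], parts["toe"])

# Comparison length calculation unit 832: here, the person's height in the image.
comparison_length = part_length(parts["head"], parts["foot"])

# Index value calculation unit 833: normalizing by the comparison length makes
# the index independent of how large the person appears in the frame.
index_value = inter_part_length / comparison_length
print(round(index_value, 3))
```

  Normalizing by a comparison length taken from the same frame is what lets an index be computed without depth information: both lengths scale together with the person's apparent size.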
  • An information processing device such as the calculation device 800 described above can be realized by installing a predetermined program in the information processing device.
  • Specifically, a program according to another aspect of the present invention is a program for causing an information processing device to realize processing of calculating, based on image data, an inter-part length between parts present in a target of interest of a person in the image data, calculating, based on the image data, a comparison length that serves as the object of comparison, and calculating an index value indicating the state of the target of interest based on the calculated inter-part length and the calculated comparison length.
  • Further, a calculation method according to another aspect is a method in which an information processing device calculates, based on image data, an inter-part length between parts present in a target of interest of a person in the image data, calculates, based on the image data, a comparison length that serves as the object of comparison, and calculates an index value indicating the state of the target of interest based on the calculated inter-part length and the calculated comparison length.
  • An invention of a program or a calculation method having the above configuration has the same action and effect as the calculation device 800 having the index value calculation unit 833 described above, and can therefore achieve the above-described object.
  • (Appendix 1) A calculation device comprising: a height calculation unit that calculates, based on image data, the height of a person in the image data; and a curvature calculation unit that acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit.
  • (Appendix 2) The calculation device according to appendix 1, wherein the curvature calculation unit calculates the index value by dividing the hand length by the height.
  • (Appendix 3) The calculation device according to appendix 1 or 2, further comprising a hand length calculation unit that calculates, based on the image data, the hand length of the person in the image data, wherein the curvature calculation unit calculates the index value based on the hand length calculated by the hand length calculation unit and the height calculated by the height calculation unit.
  • (Appendix 4) The calculation device according to appendix 3, wherein the hand length calculation unit calculates the hand length based on information indicating the coordinates of each part of the person in the image data recognized based on the image data.
  • (Appendix 5) The calculation device according to appendix 3 or 4, wherein the curvature calculation unit calculates the index value based on a hand length selected according to a predetermined criterion from among the hand lengths calculated by the hand length calculation unit and the height calculated by the height calculation unit.
  • (Appendix 6) The calculation device according to appendix 1 or 2, wherein the curvature calculation unit acquires a hand length stored in advance in a storage device and calculates the index value based on the acquired hand length and the height calculated by the height calculation unit.
  • (Appendix 7) The calculation device according to any one of appendices 1 to 6, wherein the curvature calculation unit acquires contact timing information indicating the timing at which the person's foot in the image data touches the ground, and calculates the index value based on the height calculated from the image data specified based on the contact timing information.
  • (Appendix 8) The calculation device according to any one of appendices 1 to 7, wherein the height calculation unit calculates the height based on information indicating the coordinates of each part of the person in the image data recognized based on the image data.
  • (Appendix 9) A calculation method in which an information processing device calculates, based on image data, the height of a person in the image data, acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the calculated height.
  • (Appendix 10) A program for causing an information processing device to realize processing of calculating, based on image data, the height of a person in the image data, acquiring a hand length, which is the length connecting the left and right hands of the person in the image data, and calculating an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the calculated height.
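  • The calculation of appendices 1, 2, and 8 can be sketched in a few lines of code. This is an illustration under stated assumptions, not the claimed implementation: the part names and coordinates are hypothetical, and taking the longer head-to-foot distance as the person's height in the image is an assumption.

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y) part coordinates in the image."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

# Hypothetical part coordinates (pixels) from one recognized frame.
parts = {
    "head": (320.0, 60.0),
    "right_foot": (300.0, 620.0),
    "left_foot": (345.0, 615.0),
    "right_hand": (250.0, 360.0),
    "left_hand": (395.0, 350.0),
}

# Height calculation (appendix 8): from recognized part coordinates; here the
# longer of the head-to-foot distances is taken as the height in the image.
height = max(distance(parts["head"], parts["right_foot"]),
             distance(parts["head"], parts["left_foot"]))

# Curvature calculation (appendix 2): divide the hand length by the height.
hand_length = distance(parts["right_hand"], parts["left_hand"])
bend_index = hand_length / height

# When the waist bends forward, the projected height shrinks while the
# hand-to-hand length changes little, so the index value grows.
print(round(bend_index, 3))
```

  The ratio works without depth information because both lengths are measured in the same image plane and scale together with the person's apparent size.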
  • The programs described in each of the above embodiments and appendices are stored in a storage device or recorded on a computer-readable recording medium. The recording medium is, for example, a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.

Abstract

A calculation device 800 has: a height calculation unit 821 that calculates, based on image data, the height of a person in the image data from the head to the tips of the feet; and a curvature calculation unit 822 that acquires a hand-to-hand length, which is the skeletal length between the left and right hands of the person in the image data, and calculates an index value indicating bending of the waist of the person in the image data based on the acquired hand-to-hand length and the height calculated by the height calculation unit 821.

Description

Calculation device

The present invention relates to a calculation device, a calculation method, and a recording medium.

It is known to perform gait analysis of a user based on data.

For example, Patent Document 1 describes human gait analysis. Patent Document 1 describes a gait analysis device having a data acquisition unit that acquires two types of image data from a depth sensor, a skeleton information creation unit that creates skeleton information based on the image data acquired by the data acquisition unit, a correction processing unit that corrects the skeleton information created by the skeleton information creation unit, and an analysis processing unit that analyzes the user's gait using the corrected skeleton information.

Patent Document 1: International Publication No. WO 2017/170832

One example of an index that one would like to calculate when performing gait analysis is the bending of the waist. However, when analysis is performed based on image data without using a depth sensor as described in Patent Document 1, for example, information in the depth direction cannot be obtained, and it has therefore been difficult to calculate an index value corresponding to the bending of the waist.

Therefore, an object of the present invention is to provide a determination device, a determination method, and a recording medium that solve the problem that it is difficult to calculate an index value corresponding to the bending of the waist based on image data.
In order to achieve this object, a calculation device according to one aspect of the present disclosure has a configuration comprising: a height calculation unit that calculates, based on image data, the height of a person in the image data; and a curvature calculation unit that acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit.
A calculation method according to another aspect of the present disclosure has a configuration in which an information processing device calculates, based on image data, the height of a person in the image data, acquires a hand length, which is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the calculated height.
A recording medium according to another aspect of the present disclosure is a computer-readable recording medium on which is recorded a program for causing an information processing device to realize processing of calculating, based on image data, the height of a person in the image data, acquiring a hand length, which is the length connecting the left and right hands of the person in the image data, and calculating an index value indicating the bending of the waist of the person in the image data based on the acquired hand length and the calculated height.
According to each of the configurations described above, it is possible to calculate an index value corresponding to the bending of the waist based on image data.
FIG. 1 is a diagram showing a configuration example of a determination system according to a first embodiment of the present disclosure. FIG. 2 is a diagram for explaining an example of image data acquired by the determination system. FIG. 3 is a block diagram showing a configuration example of the determination device shown in FIG. 1. FIG. 4 is a diagram showing an example of skeleton information. FIG. 5 is a diagram showing an example of ground contact determination information. FIG. 6 is a diagram for explaining an example of determination by a ground contact determination unit. FIG. 7 is a diagram for explaining an example of determination by a ground contact timing determination unit. FIG. 8 is a flowchart showing an example of processing by the ground contact determination unit. FIG. 9 is a flowchart showing an example of processing by the ground contact timing determination unit. FIG. 10 is a diagram showing a configuration example of a calculation system according to a second embodiment of the present disclosure. FIG. 11 is a block diagram showing a configuration example of the calculation device shown in FIG. 10. FIG. 12 is a diagram showing an example of hand length and height. FIG. 13 is a flowchart showing an operation example of the calculation device. FIG. 14 is a diagram showing a configuration example of a calculation system according to a third embodiment of the present disclosure. FIG. 15 is a block diagram showing a configuration example of the calculation device shown in FIG. 14. FIG. 16 is a diagram showing an example of height. FIG. 17 is a diagram showing an example of a component. FIG. 18 is a diagram showing another example of a component. FIG. 19 is a flowchart showing an operation example of the calculation device. FIG. 20 is a diagram showing a hardware configuration example of a calculation device according to a fourth embodiment of the present disclosure. FIG. 21 is a block diagram showing a configuration example of the calculation device. FIG. 22 is a block diagram showing another configuration example of the calculation device.
[First Embodiment]
A first embodiment of the present disclosure will be described with reference to FIGS. 1 to 9. FIG. 1 is a diagram showing a configuration example of a determination system 100. FIG. 2 is a diagram for explaining an example of image data acquired by the determination system 100. FIG. 3 is a block diagram showing a configuration example of a determination device 300. FIG. 4 is a diagram showing an example of skeleton information 323. FIG. 5 is a diagram showing an example of ground contact determination information 324. FIG. 6 is a diagram for explaining an example of determination by a ground contact determination unit 333. FIG. 7 is a diagram for explaining an example of determination by a ground contact timing determination unit 334. FIG. 8 is a flowchart showing an example of processing by the ground contact determination unit 333. FIG. 9 is a flowchart showing an example of processing by the ground contact timing determination unit 334.
In the first embodiment of the present disclosure, a determination system 100 will be described that determines, based on image data acquired using an imaging device such as a smartphone 200, the timing at which the foot of a walking person touches the ground. The determination system 100 acquires time-series image data showing a person walking from the back of the screen toward the front. As described later, the determination system 100 recognizes the position of the skeleton in each of the plurality of image data, and determines the timing at which the foot touches the ground based on how the recognized positions change.
The ground contact timing determined by the determination system 100 can be used in various kinds of gait analysis, for example, indices such as the angle of each joint at the moment the foot touches the floor and whether the toes are raised from the floor, as well as pitch, stride, and gait cycle. The ground contact timing determined by the determination system 100 may also be used for purposes other than those exemplified above.
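As one illustration of how determined contact timings can feed into gait analysis, the sketch below derives a gait cycle and a pitch from a series of contact timestamps. The timestamps and the definitions used here (pitch as steps per minute; one gait cycle as the interval between successive contacts of the same foot) are assumptions for illustration, not values or definitions taken from this disclosure.

```python
# Hypothetical timestamps (seconds) of successive ground contacts,
# alternating between the two feet.
contact_times = [0.00, 0.52, 1.05, 1.57, 2.10]

# One gait cycle: interval between successive contacts of the same foot,
# i.e. every second entry in the alternating sequence.
cycles = [contact_times[i + 2] - contact_times[i]
          for i in range(len(contact_times) - 2)]
mean_cycle = sum(cycles) / len(cycles)

# Pitch taken here as steps per minute: 60 s divided by the mean step interval.
steps = [b - a for a, b in zip(contact_times, contact_times[1:])]
mean_step = sum(steps) / len(steps)
pitch = 60.0 / mean_step

print(round(mean_cycle, 2), round(pitch, 1))
```

Stride could be derived the same way once a per-frame position estimate is available for each contact.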
FIG. 1 shows a configuration example of the determination system 100. Referring to FIG. 1, the determination system 100 has, for example, a smartphone 200 and a determination device 300. As shown in FIG. 1, the smartphone 200 and the determination device 300 are connected so as to be able to communicate with each other, for example wirelessly or by wire.
The smartphone 200 functions as an imaging device that captures a person walking. The smartphone 200 may be a smartphone having general functions such as a camera function for acquiring image data, a touch panel for screen display, and various sensors such as a GPS sensor and an acceleration sensor.
In the present embodiment, as shown in FIG. 2, the smartphone 200 captures a person walking in the depth direction, for example from the back of the screen toward the front. In other words, the smartphone 200 acquires time-series image data showing the person walking in the depth direction. The smartphone 200 then transmits the acquired image data to the determination device 300. The smartphone 200 may associate information such as the date and time when the image data was acquired with the image data, and transmit the associated data to the determination device 300.
The determination device 300 is an information processing device that determines, based on the image data acquired by the smartphone 200, the timing at which the foot of a walking person touches the ground. The determination device 300 is, for example, a server device. It may be a single information processing device, or it may be realized, for example, on a cloud.
FIG. 3 shows a configuration example of the determination device 300. Referring to FIG. 3, the determination device 300 has, as main components, for example, a communication I/F unit 310, a storage unit 320, and an arithmetic processing unit 330.
The communication I/F unit 310 consists of a data communication circuit. The communication I/F unit 310 performs data communication with external devices, the smartphone 200, and the like connected via a communication line.
The storage unit 320 is a storage device such as a hard disk or a memory. The storage unit 320 stores processing information and a program 326 necessary for the various processes in the arithmetic processing unit 330. The program 326 realizes various processing units by being read and executed by the arithmetic processing unit 330. The program 326 is read in advance from an external device or a recording medium via a data input/output function such as the communication I/F unit 310 and stored in the storage unit 320. The main information stored in the storage unit 320 includes, for example, a trained model 321, image information 322, skeleton information 323, ground contact determination information 324, and ground contact timing information 325.
The trained model 321 is a trained model used when the skeleton recognition unit 332 performs skeleton recognition. The trained model 321 is generated in advance, for example in an external device, by machine learning using teacher data such as image data containing skeleton coordinates, is acquired from the external device via the communication I/F unit 310 or the like, and is stored in the storage unit 320. The trained model 321 may be generated using a known method, and may be updated by re-learning processing using additional teacher data.
The image information 322 includes the time-series image data acquired by the camera of the smartphone 200, and is generated and updated, for example, when the image acquisition unit 331 acquires image data. In the image information 322, for example, identification information for identifying image data is associated with the image data. The image information 322 may include information other than the above, such as information indicating the date and time when the smartphone 200 acquired the image data.
The skeleton information 323 includes information indicating the coordinates of each part of the person recognized by the skeleton recognition unit 332, for example, the coordinates of each part in each piece of time-series image data. The skeleton information 323 is generated and updated, for example, as a result of processing by the skeleton recognition unit 332.
FIG. 4 shows an example of the skeleton information 323. Referring to FIG. 4, in the skeleton information 323, for example, identification information is associated with position information of each part. Here, the identification information indicates the image data used when recognizing the skeleton information, and may correspond to the identification information for identifying image data in the image information 322. The position information of each part includes information indicating the coordinates of each part in the image data, such as the position of the pelvis.
The parts included in the position information correspond to the trained model 321. For example, FIG. 4 exemplifies the pelvis, the center of the spine, ..., the right knee, the left knee, ..., the right ankle, the left ankle, .... The position information can include about 30 parts, such as the right shoulder, ..., the left elbow, ... (parts other than those exemplified may also be included). In the present embodiment, the position information includes at least information indicating the coordinates of the right ankle and the left ankle. The parts included in the position information may be other than those exemplified in FIG. 4 and elsewhere.
The ground contact determination information 324 includes information indicating the results of determination by the ground contact determination unit 333, and is generated and updated, for example, as a result of such determination.
FIG. 5 shows an example of the ground contact determination information 324. As shown in FIG. 5, in the ground contact determination information 324, for example, identification information is associated with a ground contact determination result. Here, the identification information indicates the image data for which the ground contact determination was performed, and may correspond to the identification information for identifying image data in the image information 322. The ground contact determination result indicates the result of the determination: for example, "left foot" indicates that the left foot was determined to be on the ground, "right foot" indicates that the right foot was determined to be on the ground, and "-" indicates that the image data was excluded from the ground contact determination. The ground contact determination result may indicate results other than those exemplified above.
The ground contact timing information 325 includes information indicating the results of determination by the ground contact timing determination unit 334, and is generated and updated, for example, as a result of such determination. For example, the ground contact timing information 325 includes information indicating the timing at which a foot touched the ground, such as at least one of identification information indicating the image data in which the foot was determined to have touched the ground and information indicating the acquisition date and time of that image data. The ground contact timing information 325 may also, for example, associate information indicating which foot touched the ground with the information indicating the timing of the contact.
The arithmetic processing unit 330 has an arithmetic device such as a CPU and its peripheral circuits. The arithmetic processing unit 330 reads the program 326 from the storage unit 320 and executes it, whereby the hardware and the program 326 cooperate to realize various processing units. The main processing units realized by the arithmetic processing unit 330 include, for example, an image acquisition unit 331, a skeleton recognition unit 332, a ground contact determination unit 333, a ground contact timing determination unit 334, and an output unit 335.
The image acquisition unit 331 acquires, via the communication I/F unit 310, the image data that the smartphone 200 has acquired. For example, the image acquisition unit 331 acquires time-series image data from the smartphone 200 and stores the acquired image data in the storage unit 320 as the image information 322.
The skeleton recognition unit 332 uses the trained model 321 to recognize, in the image data, the skeleton of the person whose walking posture is to be measured. For example, the skeleton recognition unit 332 recognizes the skeleton in each piece of time-series image data; it may also be configured to recognize the skeleton only in image data extracted from the time-series image data based on a predetermined condition. For example, the skeleton recognition unit 332 recognizes parts such as the upper spine, the right shoulder, the left shoulder, the right elbow, the left elbow, the right wrist, the left wrist, the right hand, the left hand, and so on, and calculates the coordinates of each recognized part in the image data. The skeleton recognition unit 332 then associates the recognition and calculation results with identification information for identifying the image data, and stores them in the storage unit 320 as the skeleton information 323. In this way, the skeleton recognition unit 332 acquires information indicating the position of each part of the person in the image data. The parts recognized by the skeleton recognition unit 332 correspond to the trained model 321 (that is, to the teacher data used when training the trained model 321). Therefore, the skeleton recognition unit 332 may recognize parts other than those exemplified above, depending on the trained model 321.
The ground contact determination unit 333 determines which foot is in contact with the ground based on the person's skeleton indicated by the skeleton information 323. For example, the ground contact determination unit 333 determines which of the walking person's feet, right or left, is on the ground based on how the coordinates of the right and left ankles included in the skeleton information 323 change. For example, the ground contact determination unit 333 refers to the skeleton information 323 and acquires information indicating positions, such as the ankle coordinates. Then, when the change in an ankle's coordinates satisfies a predetermined condition, the ground contact determination unit 333 determines that the foot having that ankle is in contact with the ground. The ground contact determination unit 333 also stores information indicating the determination result in the storage unit 320 as the ground contact determination information 324.
FIG. 6 shows an example of changes in the Y coordinates (the vertical coordinates of the image data) of the right and left ankles. In FIG. 6, the X axis indicates values corresponding to time, and the Y axis indicates the vertical coordinate (Y coordinate) in the image data. In FIG. 6, the larger the Y value, the lower the right or left ankle is located in the image data (that is, the closer the ankle is to the front of the screen).
In general, when a person walks, the following movements repeat alternately: one of the two feet stays on the ground while the other advances; both feet touch the ground; the other foot stays on the ground while the first advances; and both feet again touch the ground. In the present embodiment, therefore, the following states are expected to repeat: the Y coordinate of the right ankle is constant while the Y coordinate of the left ankle increases; the Y coordinates of both ankles are constant; the Y coordinate of the left ankle is constant while the Y coordinate of the right ankle increases; and the Y coordinates of both ankles are again constant.
Therefore, the ground contact determination unit 333 determines which foot is on the ground based on how the Y coordinates of the right and left ankles change. In other words, when the Y coordinate of an ankle can be evaluated as constant, the ground contact determination unit 333 determines that the foot having that ankle is in contact with the ground. Specifically, for example, the ground contact determination unit 333 compares the Y coordinate of the ankle in the image data to be judged with the Y coordinate of the ankle in the immediately preceding image data in the time series. When the displacement (amount of change) from the Y coordinate in the preceding image data is equal to or smaller than a predetermined value, the ground contact determination unit 333 regards the Y coordinate as constant and determines that the foot with that ankle is in contact with the ground. Note that the ground contact determination unit 333 may determine that the Y coordinate is constant by a method other than the one exemplified above. Further, the displacement range within which the ground contact determination unit 333 regards the Y coordinate as constant may be set arbitrarily.
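The constant-Y check described above can be sketched as follows. This is an illustrative sketch only, not part of the disclosed embodiment; the function name, the coordinate values, and the threshold of 2 pixels are assumptions.

```python
# Illustrative sketch of the constant-Y ankle check: the ankle is regarded as
# constant (and hence the foot as grounded) when its per-frame Y displacement
# stays within a predetermined bound. The 2.0-pixel bound is an assumption.

def ankle_is_constant(prev_y, curr_y, max_displacement=2.0):
    """Return True if the ankle's Y coordinate can be regarded as constant,
    i.e., its displacement from the preceding frame is at most a set value."""
    return abs(curr_y - prev_y) <= max_displacement

# The right ankle barely moved between frames -> regarded as grounded.
print(ankle_is_constant(310.0, 311.2))  # True
# The ankle moved well beyond the bound -> not regarded as grounded.
print(ankle_is_constant(310.0, 318.0))  # False
```

The predetermined bound corresponds to the arbitrarily settable displacement range mentioned above.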
The ground contact determination unit 333 may also be configured to exclude a frame from the ground contact determination when, based on the ankle Y coordinates, the foot other than the one determined to be on the ground is judged to have started moving. For example, in ordinary walking, the left foot starts moving after the right foot touches the ground. After the Y coordinate of the right ankle becomes constant, the ground contact determination unit 333 may decide to exclude the frame from the determination at the timing when the Y coordinate of the left ankle starts to increase. Specifically, for example, after the Y coordinate of the right ankle becomes constant, the ground contact determination unit 333 can determine that the left foot has started moving, and exclude the frame from the determination, when the displacement of the left ankle's Y coordinate from the preceding image data exceeds the displacement of the right ankle's Y coordinate from the preceding image data.
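The exclusion rule above compares the per-frame Y displacements of the two ankles. A minimal sketch, with illustrative variable names and values that are not part of the disclosure:

```python
# Illustrative sketch of the exclusion rule: once the right ankle's Y is
# constant, the frame is excluded from the ground contact determination when
# the left ankle's per-frame Y displacement exceeds the right ankle's.

def other_foot_started_moving(prev_left_y, curr_left_y,
                              prev_right_y, curr_right_y):
    left_disp = curr_left_y - prev_left_y
    right_disp = curr_right_y - prev_right_y
    return left_disp > right_disp

# Left ankle moved 6.0 px while the right ankle moved only 0.4 px:
# the left foot has started moving, so the frame is excluded.
print(other_foot_started_moving(250.0, 256.0, 310.0, 310.4))  # True
```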
In general, when a person walks, the newly grounding foot lands ahead of the foot already on the ground. In the present embodiment, therefore, the ankle Y coordinate of the newly grounding foot is larger than the ankle Y coordinate of the foot already on the ground. The ground contact determination unit 333 may thus be configured to perform the ground contact determination while focusing only on whichever of the right and left ankles has the larger Y coordinate.
The ground contact determination unit 333 may also be configured to refer to the knee coordinates in addition to the ankle coordinates when performing the ground contact determination. For example, the ground contact determination unit 333 may be configured to exclude a frame from the determination when the change in the knee coordinates satisfies a predetermined condition. Specifically, for example, for the foot being judged, the ground contact determination unit 333 may be configured to exclude the frame from the determination when the displacement of the knee's Y coordinate from the preceding image data in the time series is 0 or less.
As described above, the ground contact determination unit 333 performs the ground contact determination based on, for example, how the Y coordinate of the ankle changes. Note that the ground contact determination unit 333 may instead be configured to perform the determination based on changes in skeletal parts other than those exemplified above, such as the Y coordinate of the toes.
The ground contact timing determination unit 334 determines the timing at which a foot touches the ground based on the foot contact state judged from the person's skeleton indicated by the skeleton information 323. For example, the ground contact timing determination unit 334 determines the timing at which a foot touches the ground based on changes in the grounded foot indicated by the ground contact determination information 324. The ground contact timing determination unit 334 then stores the determination result in the storage unit 320 as the ground contact timing information 325.
For example, the ground contact timing determination unit 334 determines that a foot has touched the ground at the timing when the grounded foot indicated by the determination result switches to the other foot. Referring to FIG. 7, for example, the "left foot" is on the ground for identification information "124" and "125", and identification information "126" and "127" are excluded from the determination. After that, the "right foot" is judged to be on the ground for identification information "128". In other words, in FIG. 7, the grounded foot switches from the left foot to the right foot at the timing of identification information "128". The ground contact timing determination unit 334 therefore determines that the right foot touched the ground at the timing of identification information "128".
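The switch-detection rule illustrated with FIG. 7 can be sketched as follows. This is an illustration only; the frame identifiers, the labels, and the use of `None` for excluded frames are assumptions, not part of the disclosure.

```python
# Illustrative sketch of the timing rule of FIG. 7: a ground contact timing is
# the frame at which the grounded foot switches to the other foot. Frames
# excluded from the ground contact determination are represented by None.

def contact_timings(judgements):
    """judgements: list of (frame_id, foot), foot being 'left', 'right' or None.
    Returns the frames at which the grounded foot switches."""
    timings = []
    last_foot = None
    for frame_id, foot in judgements:
        if foot is not None:
            if last_foot is not None and foot != last_foot:
                timings.append((frame_id, foot))
            last_foot = foot
    return timings

# The sequence described for FIG. 7: left foot on 124-125, frames 126-127
# excluded, right foot on 128 -> the right foot touches the ground at 128.
frames = [(124, 'left'), (125, 'left'), (126, None), (127, None), (128, 'right')]
print(contact_timings(frames))  # [(128, 'right')]
```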
The output unit 335 outputs the ground contact timing information 325, the ground contact determination information 324, and the like. For example, the output unit 335 can output at least one of these pieces of information to an external device, the smartphone 200, or the like.
The above is a configuration example of the determination device 300. Next, the operation of the determination device 300 will be described with reference to FIGS. 8 and 9.
First, an operation example of the ground contact determination unit 333 will be described with reference to FIG. 8. Referring to FIG. 8, the ground contact determination unit 333 compares the Y coordinate of the knee in the image data to be judged with the Y coordinate of the knee in the immediately preceding image data in the time series (step S101). When the displacement from the knee's Y coordinate in the preceding image data exceeds 0 (step S101, Yes), the ground contact determination unit 333 checks whether the displacement of the ankle satisfies the condition (step S102).
When the displacement of the ankle satisfies the condition (step S102, Yes), the ground contact determination unit 333 determines that the foot satisfying the condition is in contact with the ground (step S103). On the other hand, when the displacement of the knee's Y coordinate is 0 or less (step S101, No), or when the displacement of the ankle does not satisfy the condition (step S102, No), the ground contact determination unit 333 determines that the foot not satisfying the condition is not in contact with the ground.
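The flow of FIG. 8 (steps S101 to S103) can be sketched for a single foot as follows. This is an illustration only; the function name and the ankle threshold are assumptions.

```python
# Illustrative sketch of the flow of FIG. 8: the knee's Y displacement must
# exceed 0 (step S101), and the ankle's Y displacement must stay within a
# small bound (step S102), for the foot to be judged grounded (step S103).
# The 2.0-pixel ankle bound is an assumption.

def foot_is_grounded(prev_knee_y, curr_knee_y,
                     prev_ankle_y, curr_ankle_y, ankle_threshold=2.0):
    if curr_knee_y - prev_knee_y <= 0:   # step S101, No -> not grounded
        return False
    # step S102: ankle displacement within the bound -> step S103, grounded
    return abs(curr_ankle_y - prev_ankle_y) <= ankle_threshold

print(foot_is_grounded(280.0, 281.0, 310.0, 310.5))  # True
print(foot_is_grounded(280.0, 279.0, 310.0, 310.5))  # False (knee check fails)
```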
The above is an operation example of the ground contact determination unit 333. For example, the ground contact determination unit 333 can determine that the displacement of the ankle satisfies the condition when the Y coordinate of the ankle can be evaluated as constant based on how it changes, the foot other than the one determined to be on the ground is not judged to have started moving, and the change in the knee coordinates satisfies the condition. The ground contact determination unit 333 may also be configured to identify, from the ankle Y coordinates, the ankle on which to focus, and to perform the above checks on the identified ankle.
Next, an operation example of the ground contact timing determination unit 334 will be described with reference to FIG. 9. Referring to FIG. 9, the ground contact timing determination unit 334 refers to the ground contact determination information 324 and checks whether the grounded foot indicated by the determination result has changed (step S201).
When the grounded foot has changed (step S201, Yes), the ground contact timing determination unit 334 determines that the timing of the change is the ground contact timing (step S202). On the other hand, when the grounded foot indicated by the determination result has not changed (step S201, No), the ground contact timing determination unit 334 does not perform the above determination.
The above is a processing example of the ground contact timing determination unit 334.
As described above, the determination device 300 has the skeleton recognition unit 332, the ground contact determination unit 333, and the ground contact timing determination unit 334. With this configuration, the ground contact timing determination unit 334 can determine the timing at which a foot touches the ground based on the result of the ground contact determination unit 333's judgment of the grounded foot, which uses the skeleton coordinates recognized by the skeleton recognition unit 332. In other words, with the above configuration, the timing at which a foot touches the ground can be grasped from the image data.
Note that the determination device 300 may be configured to determine the ground contact timing based on the result of skeleton recognition performed by an external device. In such a configuration, the determination device 300 does not need to have the function of the skeleton recognition unit 332.
[Second embodiment]
Next, a second embodiment of the present disclosure will be described with reference to FIGS. 10 to 13. FIG. 10 is a diagram showing a configuration example of the calculation system 400. FIG. 11 is a block diagram showing a configuration example of the calculation device 500. FIG. 12 is a diagram showing an example of the hand-to-hand length and the height. FIG. 13 is a flowchart showing an operation example of the calculation device 500.
The second embodiment of the present disclosure describes a calculation system 400 that calculates, based on image data acquired by an imaging device such as the smartphone 200, an index value indicating how bent the waist of the person in the image data is. As in the first embodiment, the calculation system 400 acquires image data showing a person walking from the back of the screen toward the front, and recognizes the positions of the skeleton from the image data. The calculation system 400 then calculates the index value indicating the bending of the waist based on the recognition result.
As described later, the calculation system 400 of the present embodiment acquires the hand-to-hand length (arm span), which is the sum of the in-image distances of the skeletal segments connecting the parts between the left and right hands, and the height, which is the in-image distance from the head to the toes. The calculation system 400 then calculates the index value indicating the bending of the waist based on the acquired hand-to-hand length and height. If both the hand-to-hand length and the height are regarded as approximations of the person's body height, the height decreases as the waist bends, whereas the hand-to-hand length does not change even when the waist is bent. Moreover, the length of the outstretched arms generally corresponds to a person's body height. The calculation system 400 exploits these facts to calculate the index value indicating the bending of the waist. It is known that the ratio of the hand-to-hand length to the body height varies with age, height, and so on. The calculation system 400 may therefore be configured to add a predetermined correction value to the hand-to-hand length according to attributes of the person such as age, height, and gender, and to calculate the index value indicating the bending of the waist based on the corrected hand-to-hand length and the height.
Note that the calculation system 400 described in the present embodiment can have the function of the determination system 100 described in the first embodiment. That is, the calculation system 400 may have a configuration for determining the ground contact timing.
FIG. 10 shows a configuration example of the calculation system 400. Referring to FIG. 10, the calculation system 400 includes, for example, the smartphone 200 and the calculation device 500. As shown in FIG. 10, the smartphone 200 and the calculation device 500 are connected, for example wirelessly or by wire, so as to be able to communicate with each other.
The configuration of the smartphone 200 is the same as in the first embodiment, so its description is omitted.
The calculation device 500 is an information processing device that calculates the index value indicating the bending of the waist based on the image data acquired by the smartphone 200. For example, the calculation device 500 is a server device or the like. The calculation device 500 may be a single information processing device, or may be realized on a cloud, for example.
FIG. 11 shows a configuration example of the calculation device 500. Referring to FIG. 11, the calculation device 500 has, as main components, for example, a communication I/F unit 510, a storage unit 520, and an arithmetic processing unit 530.
The communication I/F unit 510 consists of a data communication circuit. The communication I/F unit 510 performs data communication with an external device, the smartphone 200, and the like connected via a communication line.
The storage unit 520 is a storage device such as a hard disk or a memory. The storage unit 520 stores the processing information and the program 527 necessary for the various kinds of processing in the arithmetic processing unit 530. The program 527 realizes various processing units by being read into and executed by the arithmetic processing unit 530. The program 527 is read in advance from an external device or a recording medium via a data input/output function such as the communication I/F unit 510, and is stored in the storage unit 520. The main information stored in the storage unit 520 includes, for example, the trained model 521, the image information 522, the skeleton information 523, the hand-to-hand length information 524, the height information 525, and the waist bending value information 526.
The trained model 521, the image information 522, and the skeleton information 523 are the same kinds of information as the trained model 321, the image information 322, and the skeleton information 323 described in the first embodiment.
The hand-to-hand length information 524 indicates the hand-to-hand length, which is the sum of the in-image distances of the skeletal segments connecting the parts between the left and right hands, such as the palms, wrists, elbows, and shoulders. The hand-to-hand length information 524 is, for example, calculated in advance based on image data acquired with the arms extended downward so that the hand-to-hand length becomes longest, acquired from an external device or the like via the communication I/F unit 510 or the like, and stored in the storage unit 520. The hand-to-hand length information 524 may also be generated and updated as a result of the calculation processing by the hand-to-hand length calculation unit 533.
For example, the hand-to-hand length information 524 includes information indicating the hand-to-hand length. The hand-to-hand length information 524 may include information indicating the hand-to-hand length acquired for each piece of time-series image data. For example, the hand-to-hand length information 524 may associate identification information indicating the image data used to calculate the hand-to-hand length with information indicating the hand-to-hand length.
The height information 525 indicates the height of the person in the image data, which is the in-image distance connecting the head (forehead) and a toe (for example, the lower of the left and right toes). The height information 525 is, for example, generated and updated as a result of the calculation processing by the height calculation unit 534.
For example, the height information 525 includes information indicating the height acquired for each piece of time-series image data. The height information 525 may associate identification information indicating the image data used to calculate the height with information indicating the height. Alternatively, the height information 525 may associate identification information indicating image data satisfying a predetermined condition with information indicating the height.
The waist bending value information 526 indicates the waist bending value, which is the index value indicating the bending of the waist. The waist bending value information 526 is, for example, generated and updated as a result of the calculation processing by the bending calculation unit 535.
For example, the waist bending value information 526 includes information indicating the waist bending value. The waist bending value information 526 may include information indicating the waist bending value acquired for each piece of time-series image data. The waist bending value information 526 may associate identification information indicating the image data used to calculate the waist bending value with information indicating the waist bending value. Alternatively, the waist bending value information 526 may associate identification information indicating image data satisfying a predetermined condition with information indicating the waist bending value.
The arithmetic processing unit 530 has an arithmetic device such as a CPU and its peripheral circuits. The arithmetic processing unit 530 reads the program 527 from the storage unit 520 and executes it, whereby the hardware and the program 527 cooperate to realize various processing units. The main processing units realized by the arithmetic processing unit 530 include, for example, an image acquisition unit 531, a skeleton recognition unit 532, a hand-to-hand length calculation unit 533, a height calculation unit 534, a bending calculation unit 535, and an output unit 536.
The image acquisition unit 531 and the skeleton recognition unit 532 can perform the same processing as the image acquisition unit 331 and the skeleton recognition unit 332 described in the first embodiment.
The hand-to-hand length calculation unit 533 calculates, based on the skeleton information 523, the hand-to-hand length, which is the sum of the in-image distances of the skeletal segments connecting the left and right hands. The hand-to-hand length calculation unit 533 then stores the calculated hand-to-hand length in the storage unit 520 as the hand-to-hand length information 524.
For example, FIG. 12 shows an example of the hand-to-hand length. Referring to FIG. 12, the hand-to-hand length calculation unit 533 refers to the skeleton information 523 to acquire information indicating the positions of the parts between the left and right hands, such as the palms, wrists, elbows, and shoulders. Based on the acquired information, the hand-to-hand length calculation unit 533 then calculates the sum of the distances between adjacent parts when the adjacent parts are connected by straight lines. In this way, for example, the hand-to-hand length calculation unit 533 calculates the hand-to-hand length based on the positions of the parts between the left and right hands.
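The summation described above can be sketched as follows. This is an illustration only; the joint order and the coordinate values are assumptions, not part of the disclosure.

```python
# Illustrative sketch of the hand-to-hand length: the sum of straight-line
# distances between adjacent joints along the chain from one palm to the
# other. The joint order and the image coordinates below are assumptions.
import math

def hand_to_hand_length(joints):
    """joints: list of (x, y) image coordinates ordered along the chain,
    e.g. left palm, left wrist, left elbow, left shoulder, right shoulder,
    right elbow, right wrist, right palm."""
    return sum(math.dist(a, b) for a, b in zip(joints, joints[1:]))

chain = [(100, 300), (110, 280), (125, 250), (140, 210),
         (200, 210), (215, 250), (230, 280), (240, 300)]
print(round(hand_to_hand_length(chain), 1))  # 257.2
```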
The hand-to-hand length calculation unit 533 can perform the above hand-to-hand length calculation processing on each piece of time-series image data. That is, the hand-to-hand length calculation unit 533 may be configured to calculate, for each piece of image data, the hand-to-hand length in that image data. The hand-to-hand length calculation unit 533 may also be configured to calculate the hand-to-hand length in image data extracted from the time-series image data under an arbitrary condition.
Note that the hand-to-hand length calculation unit 533 may calculate the hand-to-hand length using parts other than those exemplified above. Further, the information indicating the hand-to-hand length can be calculated and acquired in advance. When the information indicating the hand-to-hand length is acquired in advance, the calculation device 500 does not need to have the hand-to-hand length calculation unit 533.
The height calculation unit 534 calculates, based on the skeleton information 523, the height, which is the in-image distance from the head to the toes. The height calculation unit 534 then stores the calculated height in the storage unit 520 as the height information 525.
For example, FIG. 12 shows an example of the height. Referring to FIG. 12, the height calculation unit 534 refers to the skeleton information 523 to acquire information indicating the positions of the forehead, which is the highest part, and the left and right toes. Based on the acquired information, the height calculation unit 534 then calculates the distance of the straight line connecting the forehead and the lower of the left and right toes. Alternatively, the height calculation unit 534 calculates, based on the acquired information, the difference in the Y-axis direction (the vertical direction in FIG. 12) between the forehead and the lower of the left and right toes. In this way, for example, the height calculation unit 534 calculates the height based on the positions of the head and feet.
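Both variants of the height computation described above can be sketched as follows. This is an illustration only; the coordinate values are assumptions, and the convention that a larger Y means lower in the image follows the description of FIG. 6.

```python
# Illustrative sketch of the height computation: either the vertical (Y-axis)
# difference or the straight-line distance between the forehead and the lower
# of the two toes. Since larger Y means lower in the image, the "lower" toe
# is the one with the larger Y coordinate. Coordinates are assumptions.
import math

def person_height(forehead, left_toe, right_toe, vertical_only=True):
    lower_toe = max(left_toe, right_toe, key=lambda p: p[1])
    if vertical_only:
        return lower_toe[1] - forehead[1]
    return math.dist(forehead, lower_toe)

# Forehead at y=60, lower toe at y=340 -> in-image height of 280 pixels.
print(person_height((160, 60), (150, 330), (175, 340)))  # 280
```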
 高さ算出部534は、手間長さ算出部533と同様に、上記のような高さの算出処理を時系列の各画像データに対して行うことが出来る。つまり、高さ算出部534は、画像データごとに当該画像データにおける高さを算出するよう構成してよい。高さ算出部534は、時系列の画像データから任意の条件で抽出した画像データにおける高さを算出するよう構成してもよい。 The height calculation unit 534 can perform the above-described height calculation processing on each piece of time-series image data, similar to the effort length calculation unit 533 . In other words, the height calculator 534 may be configured to calculate the height of each piece of image data. The height calculator 534 may be configured to calculate the height of image data extracted under arbitrary conditions from time-series image data.
Note that the height calculation unit 534 may calculate the height using parts other than those exemplified above. For example, the height calculation unit 534 may calculate, as the height, the length between parts that have the waist between them, such as between the neck and the ankles.
Based on the hand length indicated by the hand length information 524 and the height indicated by the height information 525, the curvature calculation unit 535 calculates a waist curvature value, which is an index value indicating how much the waist is bent. The curvature calculation unit 535 then stores the calculated waist curvature value in the storage unit 520 as the waist curvature value information 526.
For example, the curvature calculation unit 535 calculates the waist curvature value by dividing the hand length by the height. That is, the curvature calculation unit 535 calculates the waist curvature value as (hand length) / (height). For example, the curvature calculation unit 535 can calculate the waist curvature value using the hand length and the height calculated for each piece of image data.
Note that the hand length is expected to become shorter when, for example, the person swings their arms while walking. Therefore, the curvature calculation unit 535 may calculate the waist curvature value using a hand length selected from the hand length information 524 according to a predetermined criterion, such as the longest of the hand lengths included in the hand length information 524. In other words, the hand length that the curvature calculation unit 535 uses when calculating the waist curvature value may be a hand length selected according to a predetermined criterion from among the hand lengths included in the hand length information 524. The selection criterion may be set arbitrarily, such as using the longest one. The hand length information 524 may also contain only a single hand length acquired in advance. In that case, the hand length that the curvature calculation unit 535 uses when calculating the waist curvature value may be, for example, a constant value.
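The division described above, combined with the example selection criterion (using the longest recorded hand length), might be sketched as follows. The function and variable names are illustrative and not taken from the specification.

```python
# Minimal sketch: waist curvature value = (hand length) / (height).
# hand_lengths: per-frame hand lengths; heights: per-frame heights.
def waist_curvature_values(hand_lengths, heights):
    # Predetermined criterion from the text: use the longest hand length.
    reference = max(hand_lengths)
    return [reference / h for h in heights]

values = waist_curvature_values([48.0, 52.0, 50.0], [200.0, 208.0, 195.0])
```

Because a more upright posture produces a larger in-image height for the same hand length, the value decreases as the waist straightens and increases as it bends.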
The curvature calculation unit 535 may also use the contact timing determination result described in the first embodiment to identify the image data for which the waist curvature value is to be calculated. For example, the curvature calculation unit 535 may be configured to calculate the waist curvature value using the height acquired from the image data determined to correspond to the contact timing. The curvature calculation unit 535 may also be configured to calculate the waist curvature value using heights acquired from a predetermined number of frames of image data starting from the image data determined to correspond to the contact timing. By calculating the waist curvature value based on (the height in) image data identified from the contact timing determination result in this way, a waist curvature value that is more suitable as an index for comparison with other people can be calculated. Note that the curvature calculation unit 535 may be configured to calculate the waist curvature value for each piece of time-series image data and to also calculate, for example, the average of the calculated waist curvature values.
The output unit 536 outputs the hand length information 524, the height information 525, the waist curvature value information 526, and the like. For example, the output unit 536 can output at least one of the above pieces of information to an external device, the smartphone 200, or the like.
The above is a configuration example of the calculation device 500. Next, an operation example of the calculation device 500 will be described with reference to FIG. 13.
Referring to FIG. 13, the curvature calculation unit 535 acquires the hand length indicated by the hand length information 524 and the height indicated by the height information 525 (step S301).
The curvature calculation unit 535 calculates the waist curvature value, which is an index value indicating how much the waist is bent, based on the hand length and the height (step S302). For example, the curvature calculation unit 535 calculates the waist curvature value by dividing the hand length by the height.
The above is an operation example of the calculation device 500.
As described above, the calculation device 500 has the curvature calculation unit 535. With this configuration, the calculation device 500 can calculate the waist curvature value based on the acquired hand length and height. That is, with the above configuration, an index value corresponding to the bending of the waist can be calculated based on image data.
Note that the calculation device 500 can have the function of the determination device 300 described in the first embodiment. The calculation device 500 can also have modifications similar to those of the determination device 300 described in the first embodiment.
[Third embodiment]
Next, a third embodiment of the present disclosure will be described with reference to FIGS. 14 to 19. FIG. 14 is a diagram showing a configuration example of a calculation system 600. FIG. 15 is a block diagram showing a configuration example of a calculation device 700. FIG. 16 is a diagram showing an example of the height. FIGS. 17 and 18 are diagrams showing examples of foot components. FIG. 19 is a flowchart showing an operation example of the calculation device 700.
The third embodiment of the present disclosure describes a calculation system 600 that calculates an index value indicating the state of a target of interest based on image data acquired using an imaging device such as the smartphone 200. As in the first and second embodiments, the calculation system 600 acquires image data showing a person walking from the back of the screen toward the front, and recognizes the positions of the skeleton from the image data. The calculation system 600 then calculates an index value indicating the state of the target based on the recognition result.
For example, the calculation system 600 described in this embodiment focuses on the feet and calculates index values indicating the state of the toes, such as how much the toes are raised and which way they point. Specifically, for example, the calculation system 600 acquires the differences in the X-axis direction and the Y-axis direction between the positions of the ankle and the toe, which are parts within the foot being focused on, together with a comparison length to be used as a reference. The calculation system 600 then calculates an index value indicating the direction of the toes based on the X-axis difference and the comparison length, and an index value indicating the rise of the toes based on the Y-axis difference and the comparison length. A person appears larger in the image the closer they are to the camera serving as the imaging device. Therefore, even if the toes are raised by the same amount or point in the same direction, the corresponding lengths in the image data differ depending on how close the person is to the imaging device. By comparing the X-axis and Y-axis differences between the ankle and toe positions against the comparison length, the calculation system 600 makes it possible to calculate index values that are free of this problem.
Note that this embodiment illustrates the case where the feet are the target of interest. However, the target of interest may be any part other than the feet. That is, the calculation system 600 may use differences in the X-axis and Y-axis directions between the positions of parts within a target of interest other than the ankle and the toe. The comparison length is a length between parts whose in-image length varies depending on how close the person is to the camera serving as the imaging device when the image data is acquired, but which is constant for that person. As the comparison length, for example, one of the following can be used: the height, which is the in-image distance from the head to the toes described in the second embodiment (ignoring any bending of the waist); the hand length; or the in-image distance from the head to the pelvis. The comparison length may also be a length between parts other than those exemplified above.
The calculation system 600 described in this embodiment can also have the functions of the determination system 100 described in the first embodiment and of the calculation system 400 described in the second embodiment. That is, the calculation system 600 may have a configuration for determining the contact timing, a configuration for calculating the waist curvature value, and the like.
FIG. 14 shows a configuration example of the calculation system 600. Referring to FIG. 14, the calculation system 600 has, for example, the smartphone 200 and the calculation device 700. As shown in FIG. 14, the smartphone 200 and the calculation device 700 are connected so that they can communicate with each other, for example, wirelessly or by wire.
The configuration of the smartphone 200 is the same as in the first embodiment, so a description of the smartphone 200 is omitted.
The calculation device 700 is an information processing device that calculates, based on the image data acquired by the smartphone 200, index values indicating the rise and direction of the toes. For example, the calculation device 700 is a server device or the like. The calculation device 700 may be a single information processing device, or may be realized, for example, on a cloud. Note that the calculation device 700 may calculate only one of the index value indicating the rise of the toes and the index value indicating the direction of the toes.
FIG. 15 shows a configuration example of the calculation device 700. Referring to FIG. 15, the calculation device 700 has, as its main components, for example, a communication I/F unit 710, a storage unit 720, and an arithmetic processing unit 730.
The communication I/F unit 710 consists of a data communication circuit. The communication I/F unit 710 performs data communication with external devices, the smartphone 200, and the like connected via a communication line.
The storage unit 720 is a storage device such as a hard disk or a memory. The storage unit 720 stores processing information and a program 727 required for various kinds of processing in the arithmetic processing unit 730. The program 727 realizes various processing units by being read into and executed by the arithmetic processing unit 730. The program 727 is read in advance from an external device or a recording medium via a data input/output function such as the communication I/F unit 710 and stored in the storage unit 720. The main information stored in the storage unit 720 includes, for example, a trained model 721, image information 722, skeleton information 723, comparison length information 724, foot component information 725, and toe data information 726.
The trained model 721, the image information 722, and the skeleton information 723 are the same as the trained model 321, the image information 322, and the skeleton information 323 described in the first embodiment.
The comparison length information 724 indicates the comparison length, which is a length between parts whose in-image length varies depending on how close the person is to the camera serving as the imaging device when the image data is acquired, but which is constant for that person. For example, the comparison length information 724 indicates one of the following: the height, which is the in-image distance from the head to the toes; the hand length; or a second height, which is the in-image distance from the head to the pelvis. The comparison length information 724 is generated and updated, for example, as a result of the calculation processing by a comparison length calculation unit 733.
For example, the comparison length information 724 includes information indicating the comparison length acquired for each piece of time-series image data. For example, the comparison length information 724 may associate identification information indicating the image data used when calculating a comparison length with information indicating that comparison length.
The foot component information 725 indicates inter-part lengths such as an X component, which is the difference in the X-axis direction between the positions of the ankle and the toe, and a Y component, which is the difference in the Y-axis direction between the positions of the ankle and the toe. The foot component information 725 is generated and updated, for example, as a result of the calculation processing by a foot component calculation unit 734. Note that the foot component information 725 may include only one of the X component and the Y component.
For example, the foot component information 725 includes information indicating the X component and the Y component acquired for each piece of time-series image data. For example, the foot component information 725 may associate identification information indicating the image data used when calculating the X component and the Y component with information indicating the X component and the Y component.
The toe data information 726 indicates the index value indicating the rise of the toes and the index value indicating the direction of the toes. The toe data information 726 is generated and updated, for example, as a result of the calculation processing by a toe data calculation unit 735.
For example, the toe data information 726 includes an index value indicating the rise of the toes and an index value indicating the direction of the toes, acquired for each piece of time-series image data. The toe data information 726 may, for example, associate identification information indicating the image data used when calculating the index values with information indicating the index value for the rise of the toes and the index value for the direction of the toes. The toe data information 726 may also associate identification information indicating image data that satisfies a predetermined condition with information indicating the index value for the rise of the toes and the index value for the direction of the toes.
The arithmetic processing unit 730 has an arithmetic device such as a CPU and its peripheral circuits. The arithmetic processing unit 730 reads the program 727 from the storage unit 720 and executes it, thereby realizing various processing units through the cooperation of the hardware and the program 727. The main processing units realized by the arithmetic processing unit 730 include, for example, an image acquisition unit 731, a skeleton recognition unit 732, the comparison length calculation unit 733, the foot component calculation unit 734, the toe data calculation unit 735, and an output unit 736.
The image acquisition unit 731 and the skeleton recognition unit 732 can perform the same processing as the image acquisition unit 331 and the skeleton recognition unit 332 described in the first embodiment.
Based on the skeleton information 723, the comparison length calculation unit 733 calculates the comparison length, which is one of, for example, the height (the in-image distance from the head to the toes), the hand length, and the second height (the in-image distance from the head to the pelvis). The comparison length calculation unit 733 then stores the calculated comparison length in the storage unit 720 as the comparison length information 724.
For example, FIG. 16 shows an example of the height, which is one example of the comparison length. Referring to FIG. 16, the comparison length calculation unit 733 refers to the skeleton information 723 to obtain information indicating the positions of the forehead, which is the highest part, and of the left and right toes. Based on the obtained information, the comparison length calculation unit 733 then calculates the straight-line distance between the forehead and the lower of the left and right toes. Alternatively, the comparison length calculation unit 733 calculates, based on the obtained information, the difference in the Y-axis direction (the vertical direction in FIG. 16) between the forehead and the lower of the left and right toes. In this way, for example, the comparison length calculation unit 733 calculates the height based on the positions of the head and the feet. Note that the comparison length calculation unit 733 may be configured to calculate the hand length, the second height (the in-image distance from the head to the pelvis), or the like as the comparison length by similar processing.
By referring to the skeleton information 723, the foot component calculation unit 734 calculates at least one of the inter-part lengths such as the X component, which is the difference in the X-axis direction between the positions of the ankle and the toe, and the Y component, which is the difference in the Y-axis direction between the positions of the ankle and the toe. The foot component calculation unit 734 then stores the calculated foot components in the storage unit 720 as the foot component information 725.
For example, FIG. 17 shows an example of the Y component. Referring to FIG. 17, the foot component calculation unit 734 refers to the skeleton information 723 to obtain information indicating the positions of the ankle and the toe. Based on the obtained information, the foot component calculation unit 734 then calculates the difference between the ankle and the toe in the Y-axis direction (the vertical direction in FIG. 17). In this way, for example, the foot component calculation unit 734 calculates the Y component based on the positions of the ankle and the toe.
FIG. 18 shows an example of the X component. Referring to FIG. 18, the foot component calculation unit 734 refers to the skeleton information 723 to obtain information indicating the positions of the ankle and the toe. Based on the obtained information, the foot component calculation unit 734 then calculates the difference between the ankle and the toe in the X-axis direction (the horizontal direction in FIG. 18). In this way, for example, the foot component calculation unit 734 calculates the X component based on the positions of the ankle and the toe.
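The two foot components might be computed as in the following sketch; the (x, y) keypoint layout and the use of absolute differences are assumptions made for illustration.

```python
# Minimal sketch: X and Y components between the ankle and toe positions.
# Assumed image coordinates: x increases rightward, y increases downward.
def foot_components(ankle, toe):
    x_component = abs(toe[0] - ankle[0])  # difference in the X-axis direction
    y_component = abs(toe[1] - ankle[1])  # difference in the Y-axis direction
    return x_component, y_component

ankle = (120.0, 300.0)
toe = (150.0, 290.0)  # toe ahead of the ankle and slightly raised
x_comp, y_comp = foot_components(ankle, toe)
```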
Note that the foot component calculation unit 734 may be configured, for example, to calculate the Y component and the X component for each of the left and right feet, or to calculate them only for the lower of the left and right feet. The foot component calculation unit 734 may also be configured to determine whether to calculate the foot components using the contact timing determination result described in the first embodiment. For example, the foot component calculation unit 734 can be configured to calculate the Y component and the X component based on the image data determined to correspond to the contact timing. The foot component calculation unit 734 may also be configured to calculate the Y component and the X component based on a predetermined number of frames of image data starting from the image data determined to correspond to the contact timing.
Based on the comparison length indicated by the comparison length information 724 and the Y component indicated by the foot component information 725, the toe data calculation unit 735 calculates the index value indicating the rise of the toes. Based on the comparison length indicated by the comparison length information 724 and the X component indicated by the foot component information 725, the toe data calculation unit 735 also calculates the index value indicating the direction of the toes. The toe data calculation unit 735 then stores the calculated index values for the rise and direction of the toes in the storage unit 720 as the toe data information 726.
For example, the toe data calculation unit 735 calculates the index value indicating the rise of the toes by dividing the Y component by the comparison length. That is, the toe data calculation unit 735 calculates the index value indicating the rise of the toes as (Y component) / (comparison length). For example, the toe data calculation unit 735 can calculate this index value using the Y component and the comparison length calculated for each piece of image data.
Similarly, for example, the toe data calculation unit 735 calculates the index value indicating the direction of the toes by dividing the X component by the comparison length. That is, the toe data calculation unit 735 calculates the index value indicating the direction of the toes as (X component) / (comparison length). For example, the toe data calculation unit 735 can calculate this index value using the X component and the comparison length calculated for each piece of image data.
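The point of dividing by the comparison length is that both foot components and the comparison length scale together with the person's distance from the camera, so the quotients are scale-free. A minimal sketch under the same illustrative assumptions:

```python
# Minimal sketch: toe index values normalized by the comparison length.
def toe_rise_index(y_component, comparison_length):
    return y_component / comparison_length  # (Y component) / (comparison length)

def toe_direction_index(x_component, comparison_length):
    return x_component / comparison_length  # (X component) / (comparison length)

# The same posture seen at half the distance: every in-image length doubles,
# but the normalized index values are unchanged.
far = (toe_rise_index(10.0, 200.0), toe_direction_index(30.0, 200.0))
near = (toe_rise_index(20.0, 400.0), toe_direction_index(60.0, 400.0))
```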
Note that the toe data calculation unit 735 may use the contact timing determination result described in the first embodiment to identify the image data for which the index values indicating the rise and direction of the toes are to be calculated. For example, the toe data calculation unit 735 may be configured to calculate the index values indicating the rise and direction of the toes using the comparison length together with the Y component and the X component acquired from the image data determined to correspond to the contact timing. The toe data calculation unit 735 may also be configured to calculate these index values using the comparison length together with the Y component and the X component acquired from a predetermined number of frames of image data starting from the image data determined to correspond to the contact timing. By performing the calculation on image data identified based on the contact timing determination result in this way, it becomes possible to evaluate the rise and direction of the toes at the moment the foot touches the ground.
The output unit 736 outputs the comparison length information 724, the foot component information 725, the toe data information 726, and the like. For example, the output unit 736 can output at least one of the above pieces of information to an external device, the smartphone 200, or the like.
The above is a configuration example of the calculation device 700. Next, an operation example of the calculation device 700 will be described with reference to FIG. 19.
Referring to FIG. 19, the toe data calculation unit 735 acquires the comparison length indicated by the comparison length information 724 and the Y component and the X component indicated by the foot component information 725 (step S401).
The toe data calculation unit 735 calculates the toe data based on the comparison length and the Y component and the X component. For example, the toe data calculation unit 735 calculates the index value indicating the rise of the toes based on the comparison length and the Y component. Also, for example, the toe data calculation unit 735 calculates the index value indicating the direction of the toes based on the comparison length and the X component.
The above is an operation example of the calculation device 700.
As described above, the calculation device 700 has the toe data calculation unit 735. With this configuration, the calculation device 700 can calculate the index values indicating the rise and direction of the toes based on the acquired comparison length and the Y component and the X component. That is, with the above configuration, index values indicating states of the toes, such as their rise and direction, can be calculated based on image data.
 Note that when the target of interest is an arbitrary part other than the foot, the calculation device 700 may be configured to calculate, as the Y component and the X component, the lengths between parts existing in the target of interest other than the ankle and the toe.
 The calculation device 700 can also have the functions of the determination device 300 described in the first embodiment, and can have modifications similar to those of the determination device 300. Similarly, the calculation device 700 can have the functions of the calculation device 500 described in the second embodiment, and can have modifications similar to those of the calculation device 500.
[Fourth embodiment]
 Next, a fourth embodiment of the present disclosure will be described with reference to FIGS. 20 to 22. In the fourth embodiment of the present disclosure, an overview of the configuration of a calculation device 800, which is an information processing device, will be described.
 FIG. 20 shows a hardware configuration example of the calculation device 800. Referring to FIG. 20, the calculation device 800 has, as an example, the following hardware configuration.
- CPU (Central Processing Unit) 801 (arithmetic device)
- ROM (Read Only Memory) 802 (storage device)
- RAM (Random Access Memory) 803 (storage device)
- Program group 804 loaded into the RAM 803
- Storage device 805 that stores the program group 804
- Drive device 806 that reads from and writes to a recording medium 810 external to the information processing device
- Communication interface 807 that connects to a communication network 811 external to the information processing device
- Input/output interface 808 that inputs and outputs data
- Bus 809 that connects the components
 The calculation device 800 realizes the functions of the height calculation unit 821 and the bend calculation unit 822 shown in FIG. 21 by the CPU 801 acquiring and executing the program group 804. The program group 804 is, for example, stored in advance in the storage device 805 or the ROM 802, and the CPU 801 loads it into the RAM 803 and executes it as necessary. The program group 804 may be supplied to the CPU 801 via the communication network 811, or may be stored in advance in the recording medium 810 and read out by the drive device 806 and supplied to the CPU 801.
 Note that FIG. 20 shows one example of the hardware configuration of the calculation device 800; the hardware configuration of the calculation device 800 is not limited to this. For example, the calculation device 800 may be configured from only part of the configuration described above, such as a configuration without the drive device 806.
 The height calculation unit 821 calculates the height of a person in image data based on the image data.
 The bend calculation unit 822 acquires the hand length, which is the length connecting the left and right hands of the person in the image data. The bend calculation unit 822 then calculates an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit 821.
 In this way, the calculation device 800 has the height calculation unit 821 and the bend calculation unit 822, and the bend calculation unit 822 is configured to acquire the hand length. With such a configuration, the bend calculation unit 822 can calculate an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit 821. As a result, an index value corresponding to the bend of the waist can be calculated based on the image data.
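Appendix 2 below states that in one embodiment the index value is the hand length divided by the height. A minimal sketch of that computation, with hypothetical keypoint names standing in for whatever the skeleton-recognition step outputs:

```python
import math

# Keypoint coordinates as (x, y) pixels; the joint names here are
# hypothetical stand-ins for the skeleton-recognition output.
keypoints = {
    "head":       (105.0, 40.0),
    "left_hand":  (80.0, 150.0),
    "right_hand": (140.0, 150.0),
    "left_foot":  (100.0, 200.0),
}

def person_height(kp):
    # Height as the vertical extent from head to foot in the image.
    return abs(kp["left_foot"][1] - kp["head"][1])

def hand_length(kp):
    # Length connecting the left and right hands of the person.
    lx, ly = kp["left_hand"]
    rx, ry = kp["right_hand"]
    return math.hypot(rx - lx, ry - ly)

def waist_bend_index(kp):
    # Per Appendix 2: divide the hand length by the height. A stooped
    # posture lowers the apparent height while the hand length stays
    # roughly constant, so the ratio grows as the waist bends.
    return hand_length(kp) / person_height(kp)

print(waist_bend_index(keypoints))  # 60 / 160 = 0.375
```

The ratio is dimensionless, so it does not depend on how far the person stands from the camera, which is one motivation for comparing two lengths measured in the same image.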
 An information processing device such as the calculation device 800 described above can be realized by installing a predetermined program in the information processing device. Specifically, a program according to another aspect of the present invention is a program for causing an information processing device to execute processing of: calculating the height of a person in image data based on the image data; acquiring a hand length, which is a length connecting the left and right hands of the person in the image data; and calculating an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the calculated height.
 In the calculation method executed by the information processing device described above, the information processing device calculates the height of a person in image data based on the image data, acquires a hand length, which is a length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the calculated height.
 Even an invention of a program (or a recording medium) or a calculation method having the above configuration achieves the same object as the calculation device 800 having the bend calculation unit 822 described above, because it has the same actions and effects.
 The calculation device 800 may also be configured to realize the functions of the component calculation unit 831, the comparison length calculation unit 832, and the index value calculation unit 833 shown in FIG. 22 by the CPU 801 acquiring and executing the program group 804.
 The component calculation unit 831 calculates, based on image data, an inter-part length between parts existing in a target of interest in a person in the image data.
 The comparison length calculation unit 832 calculates a comparison length to be used for comparison based on the image data.
 The index value calculation unit 833 calculates an index value indicating the state of the target of interest based on the inter-part length calculated by the component calculation unit 831 and the comparison length calculated by the comparison length calculation unit 832.
 In this way, the calculation device 800 can have the component calculation unit 831, the comparison length calculation unit 832, and the index value calculation unit 833. With such a configuration, the index value calculation unit 833 can calculate an index value indicating the state of the target of interest based on the inter-part length calculated by the component calculation unit 831 and the comparison length calculated by the comparison length calculation unit 832. As a result, an index value corresponding to the state of the target of interest can be calculated based on the image data.
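The three units just described form a generic pattern: measure a length between two parts of the target of interest, measure a comparison length, and relate the two. The sketch below makes that pattern concrete; the class and method names are hypothetical, and using a ratio to relate the lengths is an illustrative choice (the waist-bend embodiment uses a ratio, but the text does not mandate it for every target of interest):

```python
class IndexCalculator:
    """Sketch mirroring units 831-833: component calculation,
    comparison-length calculation, and index-value calculation."""

    def inter_part_length(self, part_a, part_b):
        # Component calculation (unit 831): Euclidean length between
        # two parts of the target of interest, given as (x, y) points.
        ax, ay = part_a
        bx, by = part_b
        return ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5

    def index_value(self, part_a, part_b, comparison_length):
        # Index-value calculation (unit 833): relate the inter-part
        # length to the comparison length; a ratio is one choice.
        return self.inter_part_length(part_a, part_b) / comparison_length

calc = IndexCalculator()
# A 3-4-5 triangle gives an inter-part length of 5; with a comparison
# length of 10 the index is 0.5.
print(calc.index_value((0.0, 0.0), (3.0, 4.0), 10.0))  # 0.5
```

The toe-up index and the waist-bend index described earlier are both instances of this pattern, with different part pairs and different comparison lengths.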
 An information processing device such as the calculation device 800 described above can be realized by installing a predetermined program in the information processing device. Specifically, a program according to another aspect of the present invention is a program for causing an information processing device to execute processing of: calculating, based on image data, an inter-part length between parts existing in a target of interest in a person in the image data; calculating a comparison length to be used for comparison based on the image data; and calculating an index value indicating the state of the target of interest based on the calculated inter-part length and the calculated comparison length.
 In the calculation method executed by the information processing device described above, the information processing device calculates, based on image data, an inter-part length between parts existing in a target of interest in a person in the image data, calculates a comparison length to be used for comparison based on the image data, and calculates an index value indicating the state of the target of interest based on the calculated inter-part length and the calculated comparison length.
 Even an invention of a program (or a recording medium) or a calculation method having the above configuration achieves the same object as the calculation device 800 having the index value calculation unit 833 described above, because it has the same actions and effects.
<Appendix>
 Some or all of the above embodiments may also be described as in the following appendices. An outline of the calculation device and the like according to the present invention is given below. However, the present invention is not limited to the following configurations.
(Appendix 1)
 A calculation device comprising:
 a height calculation unit that calculates the height of a person in image data based on the image data; and
 a bend calculation unit that acquires a hand length, which is a length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit.
(Appendix 2)
 The calculation device according to Appendix 1, wherein the bend calculation unit calculates the index value by dividing the hand length by the height.
(Appendix 3)
 The calculation device according to Appendix 1 or 2, further comprising a hand length calculation unit that calculates the hand length of the person in the image data based on the image data, wherein the bend calculation unit calculates the index value based on the hand length calculated by the hand length calculation unit and the height calculated by the height calculation unit.
(Appendix 4)
 The calculation device according to Appendix 3, wherein the hand length calculation unit calculates the hand length based on information indicating the coordinates of each part of the person in the image data recognized based on the image data.
(Appendix 5)
 The calculation device according to Appendix 3 or 4, wherein the bend calculation unit calculates the index value based on a hand length selected according to a predetermined criterion from among the hand lengths calculated by the hand length calculation unit and the height calculated by the height calculation unit.
(Appendix 6)
 The calculation device according to Appendix 1 or 2, wherein the bend calculation unit acquires the hand length stored in advance in a storage device and calculates the index value based on the acquired hand length and the height calculated by the height calculation unit.
(Appendix 7)
 The calculation device according to any one of Appendices 1 to 6, wherein the bend calculation unit acquires ground-contact timing information indicating the timing at which a foot of the person in the image data contacts the ground, and calculates the index value based on the height calculated from the image data specified based on the ground-contact timing information.
(Appendix 8)
 The calculation device according to any one of Appendices 1 to 7, wherein the height calculation unit calculates the height based on information indicating the coordinates of each part of the person in the image data recognized based on the image data.
(Appendix 9)
 A calculation method in which an information processing device:
 calculates the height of a person in image data based on the image data;
 acquires a hand length, which is a length connecting the left and right hands of the person in the image data; and
 calculates an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the calculated height.
(Appendix 10)
 A program for causing an information processing device to execute processing of:
 calculating the height of a person in image data based on the image data;
 acquiring a hand length, which is a length connecting the left and right hands of the person in the image data; and
 calculating an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the calculated height.
 The programs described in the above embodiments and appendices are stored in a storage device or recorded on a computer-readable recording medium. The recording medium is, for example, a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
 Although the present invention has been described with reference to the above embodiments, the present invention is not limited to the embodiments described above. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
 The present invention claims the benefit of priority based on Japanese patent application No. 2021-055313 filed in Japan on March 29, 2021, the entire contents of which are incorporated herein by reference.
100 determination system
200 smartphone
300 determination device
310 communication I/F unit
320 storage unit
321 trained model
322 image information
323 skeleton information
324 ground-contact determination information
325 ground-contact timing information
326 program
330 arithmetic processing unit
331 image acquisition unit
332 skeleton recognition unit
333 ground-contact determination unit
334 ground-contact timing determination unit
335 output unit
400 calculation system
500 calculation device
510 communication I/F unit
520 storage unit
521 trained model
522 image information
523 skeleton information
524 hand length information
525 height information
526 waist bend value information
527 program
530 arithmetic processing unit
531 image acquisition unit
532 skeleton recognition unit
533 hand length calculation unit
534 height calculation unit
535 bend calculation unit
536 output unit
600 calculation system
700 calculation device
710 communication I/F unit
720 storage unit
721 trained model
722 image information
723 skeleton information
724 comparison length information
725 foot component information
726 toe data information
727 program
730 arithmetic processing unit
731 image acquisition unit
732 skeleton recognition unit
733 comparison length calculation unit
734 foot component calculation unit
735 toe data calculation unit
736 output unit
800 calculation device
801 CPU
802 ROM
803 RAM
804 program group
805 storage device
806 drive device
807 communication interface
808 input/output interface
809 bus
810 recording medium
811 communication network
821 height calculation unit
822 bend calculation unit
831 component calculation unit
832 comparison length calculation unit
833 index value calculation unit

Claims (10)

  1. A calculation device comprising:
     a height calculation unit that calculates the height of a person in image data based on the image data; and
     a bend calculation unit that acquires a hand length, which is a length connecting the left and right hands of the person in the image data, and calculates an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the height calculated by the height calculation unit.
  2. The calculation device according to claim 1, wherein the bend calculation unit calculates the index value by dividing the hand length by the height.
  3. The calculation device according to claim 1 or 2, further comprising a hand length calculation unit that calculates the hand length of the person in the image data based on the image data, wherein the bend calculation unit calculates the index value based on the hand length calculated by the hand length calculation unit and the height calculated by the height calculation unit.
  4. The calculation device according to claim 3, wherein the hand length calculation unit calculates the hand length based on information indicating the coordinates of each part of the person in the image data recognized based on the image data.
  5. The calculation device according to claim 3 or 4, wherein the bend calculation unit calculates the index value based on a hand length selected according to a predetermined criterion from among the hand lengths calculated by the hand length calculation unit and the height calculated by the height calculation unit.
  6. The calculation device according to claim 1 or 2, wherein the bend calculation unit acquires the hand length stored in advance in a storage device and calculates the index value based on the acquired hand length and the height calculated by the height calculation unit.
  7. The calculation device according to any one of claims 1 to 6, wherein the bend calculation unit acquires ground-contact timing information indicating the timing at which a foot of the person in the image data contacts the ground, and calculates the index value based on the height calculated from the image data specified based on the ground-contact timing information.
  8. The calculation device according to any one of claims 1 to 7, wherein the height calculation unit calculates the height based on information indicating the coordinates of each part of the person in the image data recognized based on the image data.
  9. A calculation method in which an information processing device: calculates the height of a person in image data based on the image data; acquires a hand length, which is a length connecting the left and right hands of the person in the image data; and calculates an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the calculated height.
  10. A computer-readable recording medium recording a program for causing an information processing device to execute processing of: calculating the height of a person in image data based on the image data; acquiring a hand length, which is a length connecting the left and right hands of the person in the image data; and calculating an index value indicating the bend of the waist of the person in the image data based on the acquired hand length and the calculated height.

PCT/JP2022/004471 2021-03-29 2022-02-04 Calculation device WO2022209287A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023510576A JPWO2022209287A1 (en) 2021-03-29 2022-02-04

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-055313 2021-03-29
JP2021055313 2021-03-29

Publications (1)

Publication Number Publication Date
WO2022209287A1 true WO2022209287A1 (en) 2022-10-06

Family

ID=83458826

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004471 WO2022209287A1 (en) 2021-03-29 2022-02-04 Calculation device

Country Status (2)

Country Link
JP (1) JPWO2022209287A1 (en)
WO (1) WO2022209287A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017170832A1 (en) * 2016-03-31 2017-10-05 Necソリューションイノベータ株式会社 Gait analyzing device, gait analyzing method, and computer-readable recording medium
WO2018131630A1 (en) * 2017-01-13 2018-07-19 三菱電機株式会社 Operation analysis device and operation analysis method
WO2020261404A1 (en) * 2019-06-26 2020-12-30 日本電気株式会社 Person state detecting device, person state detecting method, and non-transient computer-readable medium containing program


Also Published As

Publication number Publication date
JPWO2022209287A1 (en) 2022-10-06


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22779516

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023510576

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22779516

Country of ref document: EP

Kind code of ref document: A1