CN117136377A - computing device - Google Patents

Computing device

Info

Publication number
CN117136377A
Authority
CN
China
Prior art keywords
length
image data
information
calculation unit
inter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280025797.3A
Other languages
Chinese (zh)
Inventor
寺岛宏纪
永井克幸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Solution Innovators Ltd
Original Assignee
NEC Solution Innovators Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Solution Innovators Ltd filed Critical NEC Solution Innovators Ltd
Publication of CN117136377A publication Critical patent/CN117136377A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107: Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Abstract

The computing device (800) has: a component calculation unit (831) that calculates, based on image data, an inter-part length between parts belonging to an object of interest on a person in the image data; a comparison length calculation unit (832) that calculates, based on the image data, a comparison length to be used for comparison; and an index value calculation unit (833) that calculates an index value indicating the state of the object of interest, based on the inter-part length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.

Description

Computing device
Technical Field
The invention relates to a computing device, a computing method, and a storage medium.
Background
It is known to conduct walking analysis of a user based on data.
Patent document 1, for example, describes walking analysis of a person. It discloses a walking analysis device provided with: a data acquisition unit that acquires two types of image data from a depth sensor; a bone information creation unit that creates bone information based on the image data acquired by the data acquisition unit; a correction processing unit that corrects the bone information created by the bone information creation unit; and an analysis processing unit that analyzes the walking of the user using the corrected bone information.
Prior art literature
Patent literature
Patent document 1: international publication No. 2017/170832
Disclosure of Invention
Examples of an index to be calculated in walking analysis include an index value corresponding to the state of an object of interest, such as how far the toe is lifted or how the toe is oriented. However, when the analysis is based on image data acquired without a depth sensor of the type described in patent document 1, no information on the depth direction is available, and it is therefore difficult to calculate an index value indicating the state of the object of interest.
Accordingly, an object of the present invention is to provide a computing device, a computing method, and a storage medium that solve the above problem that it is difficult to calculate an index value corresponding to the state of an object of interest based on image data.
To achieve this object, a computing device according to one aspect of the present disclosure is configured to have:
a component calculation unit that calculates, based on image data, an inter-part length between parts belonging to an object of interest on a person in the image data;
a comparison length calculation unit that calculates, based on the image data, a comparison length to be used for comparison; and
an index value calculation unit that calculates an index value indicating the state of the object of interest based on the inter-part length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
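As an illustrative sketch only (not the patent's implementation), the three units above can be modeled as functions over 2D keypoint coordinates. The keypoint names, the chain of parts, and the Euclidean-distance and ratio formulations are assumptions:

```python
import math

def inter_part_length(keypoints, part_chain):
    """Sum of in-image distances along a chain of parts of the object of interest."""
    total = 0.0
    for a, b in zip(part_chain, part_chain[1:]):
        (ax, ay), (bx, by) = keypoints[a], keypoints[b]
        total += math.hypot(bx - ax, by - ay)
    return total

def comparison_length(keypoints):
    """In-image height (head to foot), used here as the comparison length."""
    return abs(keypoints["head"][1] - keypoints["foot"][1])

def index_value(inter_len, comp_len):
    """Index value comparing the inter-part length with the comparison length."""
    return inter_len / comp_len

# Hypothetical 2D keypoints (pixel coordinates) for a person with arms spread.
kp = {"head": (100, 20), "foot": (100, 220),
      "left_hand": (20, 120), "left_elbow": (60, 110),
      "right_elbow": (140, 110), "right_hand": (180, 120)}
chain = ["left_hand", "left_elbow", "right_elbow", "right_hand"]
print(round(index_value(inter_part_length(kp, chain), comparison_length(kp)), 3))
```

A ratio near 1 would suggest the two lengths agree, mirroring the comparison described in embodiment 2.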
In addition, a computing method as another aspect of the present disclosure is configured such that an information processing apparatus:
calculates, based on image data, an inter-part length between parts belonging to an object of interest on a person in the image data,
calculates, based on the image data, a comparison length to be used for comparison, and
calculates an index value representing the state of the object of interest based on the calculated inter-part length and the calculated comparison length.
In addition, a storage medium as another aspect of the present disclosure is a computer-readable storage medium storing a program for causing an information processing apparatus to:
calculate, based on image data, an inter-part length between parts belonging to an object of interest on a person in the image data,
calculate, based on the image data, a comparison length to be used for comparison, and
calculate an index value representing the state of the object of interest based on the calculated inter-part length and the calculated comparison length.
Effects of the invention
According to each of the above-described configurations, an index value corresponding to the state of the object of interest can be calculated based on image data.
Drawings
Fig. 1 is a diagram showing a configuration example of a determination system according to embodiment 1 of the present disclosure.
Fig. 2 is a diagram for explaining an example of image data acquired in the determination system.
Fig. 3 is a block diagram showing a configuration example of the determination device shown in fig. 1.
Fig. 4 is a diagram showing an example of bone information.
Fig. 5 is a diagram showing an example of the ground determination information.
Fig. 6 is a diagram for explaining a determination example by the ground determination unit.
Fig. 7 is a diagram for explaining a determination example by the ground timing determination unit.
Fig. 8 is a flowchart showing an example of the processing performed by the ground determination unit.
Fig. 9 is a flowchart showing an example of the processing performed by the ground timing determination unit.
Fig. 10 is a diagram showing a configuration example of a computing system of embodiment 2 of the present disclosure.
Fig. 11 is a block diagram showing a configuration example of the computing device shown in fig. 10.
Fig. 12 is a diagram showing an example of the length and height between hands.
Fig. 13 is a flowchart showing an example of the operation of the computing device.
Fig. 14 is a diagram showing a configuration example of a computing system of embodiment 3 of the present disclosure.
Fig. 15 is a block diagram showing a configuration example of the computing device shown in fig. 14.
Fig. 16 is a diagram showing an example of the height.
Fig. 17 is a diagram showing an example of the components.
Fig. 18 is a diagram showing another example of the components.
Fig. 19 is a flowchart showing an example of the operation of the computing device.
Fig. 20 is a diagram showing an example of a hardware configuration of a computing device according to embodiment 4 of the present disclosure.
Fig. 21 is a block diagram showing a configuration example of a computing device.
Fig. 22 is a block diagram showing another configuration example of the computing device.
Detailed Description
[ embodiment 1 ]
Embodiment 1 of the present disclosure will be described with reference to fig. 1 to 9. Fig. 1 is a diagram showing a configuration example of a determination system 100. Fig. 2 is a diagram for explaining an example of image data acquired in the determination system 100. Fig. 3 is a block diagram showing a configuration example of the determination device 300. Fig. 4 is a diagram showing an example of the bone information 323. Fig. 5 is a diagram showing an example of the ground determination information 324. Fig. 6 is a diagram for explaining a determination example by the ground determination unit 333. Fig. 7 is a diagram for explaining a determination example by the ground timing determination unit 334.
Fig. 8 is a flowchart showing an example of the processing performed by the ground determination unit 333. Fig. 9 is a flowchart showing an example of the processing performed by the ground timing determination unit 334.
In embodiment 1 of the present disclosure, a determination system 100 is described that determines, based on image data acquired with an imaging device such as a smartphone 200, the timing at which the foot of a walking person contacts the ground. The determination system 100 acquires time-series image data showing a person walking from the far side of the screen toward the near side. As described later, the determination system 100 identifies the bone positions in each of the plural pieces of image data, and then determines the timing of foot ground contact based on the degree of change of the identified positions.
The ground contact timing determined by the determination system 100 can be applied to various kinds of walking analysis, for example the angles of the joints at the moment of ground contact, an index of whether or not the toe is lifted from the ground, the step length, the stride, and one walking cycle. The ground contact timing determined by the determination system 100 may also be used for purposes other than these examples.
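As an illustration (not part of the patent's disclosure), a quantity such as one walking cycle can be derived from the determined ground contact timings. The event format and function name below are assumptions, with one walking cycle taken as the interval between two consecutive ground contacts of the same foot:

```python
def walking_cycles(timings):
    """Durations between successive ground contacts of the same foot.

    timings: list of (time in seconds, foot) ground-contact events in time order.
    One walking cycle is the interval between two consecutive contacts of one foot.
    """
    last_seen = {}
    cycles = []
    for t, foot in timings:
        if foot in last_seen:
            cycles.append(t - last_seen[foot])
        last_seen[foot] = t
    return cycles

# Hypothetical contact events alternating between the two feet.
events = [(0.0, "right"), (0.55, "left"), (1.1, "right"), (1.65, "left")]
print(walking_cycles(events))
```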
Fig. 1 shows an example of the configuration of a determination system 100. Referring to fig. 1, the determination system 100 includes, for example, a smart phone 200 and a determination device 300. As shown in fig. 1, the smartphone 200 and the determination device 300 are connected, for example, by wireless or wired, so as to be able to communicate with each other.
The smartphone 200 functions as an imaging device that captures a person walking. The smartphone 200 has a camera function for acquiring image data, a touch panel for displaying a screen, various sensors such as a GPS sensor and an acceleration sensor, and other general smartphone functions.
In the present embodiment, as shown in fig. 2, the smartphone 200 captures a person walking in the depth direction, from the far side of the screen toward the near side. In other words, the smartphone 200 acquires time-series image data showing a person walking in the depth direction. The smartphone 200 then transmits the acquired image data to the determination device 300. The smartphone 200 may associate information indicating the date and time at which each piece of image data was acquired with that image data and transmit them together to the determination device 300.
The determination device 300 is an information processing apparatus that determines, based on the image data acquired by the smartphone 200, the timing at which the foot of the walking person contacts the ground. For example, the determination device 300 is a server device or the like. The determination device 300 may be a single information processing apparatus, or may be implemented in the cloud, for example.
Fig. 3 shows an example of the configuration of the determination device 300. Referring to fig. 3, the determination device 300 includes, for example, a communication I/F unit 310, a storage unit 320, and an arithmetic processing unit 330 as main components.
The communication I/F section 310 includes a data communication circuit. The communication I/F unit 310 performs data communication with external devices, such as the smartphone 200, connected via a communication line.
The storage unit 320 is a storage device such as a hard disk or a memory. The storage unit 320 stores the processing information and the program 326 necessary for various kinds of processing in the arithmetic processing unit 330. The program 326 realizes various processing units by being read and executed by the arithmetic processing unit 330. The program 326 is read in advance from an external device or a storage medium via a data input/output function such as the communication I/F unit 310 and stored in the storage unit 320. The main information stored in the storage unit 320 includes, for example, the learned model 321, the image information 322, the bone information 323, the ground determination information 324, and the ground timing information 325.
The learned model 321 is a model used for bone recognition by the bone recognition unit 332. The learned model 321 is generated in advance, for example by machine learning on an external device using teacher data such as image data annotated with bone coordinates, and is acquired from the external device via the communication I/F unit 310 or the like and stored in the storage unit 320. The learned model 321 may be generated by a known method, and may be updated by relearning with additional teacher data.
The image information 322 includes time-series image data acquired by a camera provided in the smartphone 200. The image information 322 is generated and updated by the image acquisition section 331 acquiring image data or the like.
For example, in the image information 322, identification information for identifying image data is associated with the image data. The image information 322 may include information other than the above examples, such as information indicating the date and time at which the image data was acquired by the smartphone 200.
The bone information 323 includes information indicating coordinates of each part of the person recognized by the bone recognition unit 332. The bone information 323 includes, for example, information indicating coordinates of each part of each image data of the time series. For example, the bone information 323 is generated and updated as a result of processing by the bone recognition portion 332.
Fig. 4 shows an example of the bone information 323. Referring to fig. 4, in the bone information 323, identification information is associated with positional information of each part. Here, the identification information identifies the image data from which the bone information was recognized, and may correspond to the identification information used to identify the image data in the image information 322. The positional information of each part contains the coordinates of that part in the image data, such as the position of the pelvis.
The parts included in the positional information correspond to the learned model 321. For example, fig. 4 illustrates the pelvis, the center of the spine, …, the right knee, the left knee, …, the right ankle, the left ankle, and so on. The positional information covers, for example, about 30 parts, including parts not illustrated such as the right shoulder and the left elbow. In the present embodiment, the positional information includes at least the coordinates of the right ankle and the left ankle. Parts other than those illustrated in fig. 4 may also be included.
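For illustration, the bone information 323 of fig. 4 might be held in memory as follows. The field names and coordinate values are assumptions, not the patent's schema:

```python
# Hypothetical in-memory form of the bone information 323:
# identification information -> per-part (x, y) coordinates in the image data.
bone_info = {
    "124": {"pelvis": (312, 410), "right_knee": (330, 560), "left_knee": (290, 555),
            "right_ankle": (335, 650), "left_ankle": (288, 648)},
    "125": {"pelvis": (313, 412), "right_knee": (331, 562), "left_knee": (292, 560),
            "right_ankle": (335, 651), "left_ankle": (290, 655)},
}

# The ground determination only needs the ankle coordinates.
right_ankle_ys = [parts["right_ankle"][1] for parts in bone_info.values()]
print(right_ankle_ys)
```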
The ground determination information 324 includes information indicating a determination result based on the ground determination unit 333. For example, the ground determination information 324 is generated and updated as a determination result based on the ground determination section 333.
Fig. 5 shows an example of the ground determination information 324. As shown in fig. 5, in the ground determination information 324 the identification information is associated with a ground determination result. Here, the identification information identifies the image data for which the ground determination was made, and may correspond to the identification information used to identify the image data in the image information 322. The ground determination result shows the outcome of the ground determination: for example, it takes values such as "left foot", indicating that the left foot was determined to be grounded, "right foot", indicating that the right foot was determined to be grounded, and a value indicating that no ground determination was made. The ground determination result may take values other than these examples.
The ground timing information 325 includes information indicating a determination result based on the ground timing determination section 334. For example, the ground timing information 325 is generated and updated as a determination result of the ground timing determination section 334.
For example, the ground timing information 325 includes information indicating the timing of the foot ground. Here, the information indicating the timing of the foot grounding refers to, for example, at least one of identification information indicating the image data determined to be grounded, information indicating the acquisition date and time of the image data determined to be grounded, and the like. The ground timing information 325 may be, for example, information in which information indicating the grounded foot is associated with information indicating the timing of the foot ground.
The arithmetic processing unit 330 includes an arithmetic device such as a CPU and peripheral circuits thereof. The arithmetic processing unit 330 reads and executes the program 326 from the storage unit 320, and realizes various processing units by making the above-described hardware cooperate with the program 326. Examples of the main processing units implemented by the arithmetic processing unit 330 include an image acquisition unit 331, a bone recognition unit 332, a ground determination unit 333, a ground timing determination unit 334, and an output unit 335.
The image acquisition unit 331 acquires image data acquired by the smartphone 200 from the smartphone 200 via the communication I/F unit 310. For example, the image acquisition unit 331 acquires time-series image data from the smartphone 200. The image acquisition unit 331 stores the acquired image data as image information 322 in the storage unit 320.
The bone recognition unit 332 uses the learned model 321 to recognize, in the image data, the bones of the person who is the object of the walking measurement. For example, the bone recognition unit 332 recognizes bones in each piece of the time-series image data. The bone recognition unit 332 may also be configured to recognize bones only in image data extracted from the time-series image data under a predetermined condition. For example, the bone recognition unit 332 recognizes the upper spine, the right shoulder, the left shoulder, the right elbow, the left elbow, the right wrist, the left wrist, the right hand, the left hand, and so on, and calculates the coordinates of each recognized part in the image data. The bone recognition unit 332 then associates the recognition and calculation results with identification information identifying the image data and stores them in the storage unit 320 as the bone information 323. In this way, the bone recognition unit 332 acquires information indicating the positions of the parts of the person in the image data.
The parts recognized by the bone recognition unit 332 depend on the learned model 321 (that is, on the teacher data used when training the learned model 321). The bone recognition unit 332 may therefore recognize parts other than those in the above example, according to the learned model 321.
The ground determination unit 333 determines which foot is grounded based on the bones of the person shown in the bone information 323. For example, the ground determination unit 333 determines which of the walking person's right foot and left foot is grounded based on the degree of change of the right-ankle and left-ankle coordinates included in the bone information 323. Concretely, the ground determination unit 333 refers to the bone information 323 and acquires positional information such as the ankle coordinates; when the degree of change of an ankle's coordinates satisfies a predetermined condition, it determines that the foot with that ankle is grounded. The ground determination unit 333 stores the result of the determination in the storage unit 320 as the ground determination information 324.
Fig. 6 shows an example of changes in the Y coordinates (coordinates in the up-down direction of the image data) of the right ankle and the left ankle. In fig. 6, the X axis represents a value corresponding to time, and the Y axis represents the coordinate in the up-down direction within the image data. In the case of fig. 6, the larger the value on the Y axis, the lower the right ankle and the left ankle are positioned in the image data (that is, the closer they are to the near side of the screen).
Generally, when a person walks, the following motions repeat alternately: one of the right foot and the left foot advances while the other remains in contact with the ground, then both feet are grounded, then the other foot advances while the first remains in contact with the ground, and then both feet are grounded again. That is, in the present embodiment, the following cycle is assumed to repeat: the Y coordinate of the right ankle stays fixed while the Y coordinate of the left ankle increases; both Y coordinates stay fixed; the Y coordinate of the left ankle stays fixed while the Y coordinate of the right ankle increases; and both Y coordinates stay fixed again.
The ground determination unit 333 therefore determines which foot is grounded based on the degree of change of the Y coordinates of the right ankle and the left ankle: when the Y coordinate of an ankle is fixed, the ground determination unit 333 determines that the foot with that ankle is grounded. Specifically, the ground determination unit 333 compares the Y coordinate of the ankle in the image data to be determined with the Y coordinate of the same ankle in the immediately preceding image data of the time series. When the displacement (amount of change) from the Y coordinate of the ankle in the preceding image data is equal to or smaller than a predetermined value, the ground determination unit 333 regards the Y coordinate as fixed and determines that the foot with that ankle is grounded. The ground determination unit 333 may regard the Y coordinate as fixed by a method other than this example, and the range of displacement within which the Y coordinate is regarded as fixed may be set arbitrarily.
The ground determination unit 333 may also be configured to output a result indicating that no ground determination is made when, based on the Y coordinate of its ankle, the foot not determined to be grounded is judged to be moving. For example, in normal walking, the left foot moves after the right foot is grounded. The ground determination unit 333 may therefore decide that no ground determination is made at the timing when the Y coordinate of the left ankle starts to increase after the Y coordinate of the right ankle becomes fixed. Specifically, for example, when the Y coordinate of the right ankle is fixed and the displacement of the left-ankle Y coordinate from the preceding image data exceeds the displacement of the right-ankle Y coordinate from the preceding image data, it can be decided that the left foot is moving and that no ground determination is made.
In addition, when a person walks, the newly grounding foot typically lands ahead of the previously grounded foot. That is, in the present embodiment, the Y coordinate of the ankle of the newly grounding foot becomes larger than the Y coordinate of the ankle of the previously grounded foot. The ground determination unit 333 may therefore be configured to examine only whichever of the right ankle and the left ankle has the larger Y coordinate when determining whether a foot is grounded.
The ground determination unit 333 may also refer to the coordinates of the knees, in addition to the coordinates of the ankles, when performing the ground determination. For example, the ground determination unit 333 may output a result indicating that no ground determination is made when the change of the knee coordinates satisfies a predetermined condition. Specifically, for example, the ground determination unit 333 may refrain from determining that the foot in question is grounded when the displacement of the knee's Y coordinate from the preceding image data of the time series is 0 or less.
As described above, the ground determination unit 333 performs the ground determination based on the degree of change of the ankle's Y coordinate. The ground determination unit 333 may also be configured to perform the ground determination based on the degree of change of bones other than in the above example, for instance using the Y coordinate of the toe.
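A minimal sketch of the Y-coordinate rule described above, assuming a pixel threshold and simple dictionaries of ankle coordinates; a faithful implementation would also apply the knee condition and the moving-foot check:

```python
FIXED_THRESHOLD = 3  # assumed maximum Y displacement (pixels) still regarded as "fixed"

def determine_grounded(prev, curr, threshold=FIXED_THRESHOLD):
    """Return 'right foot', 'left foot', or 'none' from consecutive ankle Y coordinates.

    prev/curr: dicts with 'right_ankle_y' and 'left_ankle_y' for consecutive frames.
    Only the ankle with the larger Y coordinate (nearer the front of the screen)
    is examined, as described for the ground determination unit.
    """
    foot = "right" if curr["right_ankle_y"] >= curr["left_ankle_y"] else "left"
    displacement = abs(curr[f"{foot}_ankle_y"] - prev[f"{foot}_ankle_y"])
    if displacement <= threshold:  # Y coordinate regarded as fixed -> foot grounded
        return f"{foot} foot"
    return "none"

print(determine_grounded({"right_ankle_y": 650, "left_ankle_y": 600},
                         {"right_ankle_y": 651, "left_ankle_y": 610}))
```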
The ground timing determination unit 334 determines the timing at which a foot contacts the ground based on the foot-grounding state determined from the bones of the person shown in the bone information 323. For example, the ground timing determination unit 334 determines the timing of ground contact based on changes in the grounded foot shown by the ground determination information 324. The ground timing determination unit 334 then stores the result of the determination in the storage unit 320 as the ground timing information 325.
For example, the ground timing determination unit 334 determines that ground contact occurs at the timing at which the grounded foot indicated by the ground determination result switches to the other foot. Referring to fig. 7, for identification information "124" and "125" the left foot is determined to be grounded, and for "126" and "127" no ground determination is made. Then, for identification information "128", the right foot is determined to be grounded. That is, in the case of fig. 7, the grounded foot switches from the left foot to the right foot at the timing of identification information "128". The ground timing determination unit 334 therefore determines that the right foot contacts the ground at the timing of identification information "128".
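The switch-detection logic illustrated with fig. 7 can be sketched as follows; the tuple layout and function name are assumptions, and note that this simple sketch also reports the first grounded frame as a timing:

```python
def ground_timings(results):
    """Collect (identification information, foot) at frames where the grounded foot switches.

    results: list of (identification information, ground determination result)
    in time order, where the result is 'left foot', 'right foot', or 'none'.
    """
    timings = []
    last_foot = None
    for ident, result in results:
        if result == "none":          # no ground determination for this frame
            continue
        if result != last_foot:       # grounded foot switched -> ground contact timing
            timings.append((ident, result))
            last_foot = result
    return timings

# The example sequence from fig. 7.
results = [("124", "left foot"), ("125", "left foot"),
           ("126", "none"), ("127", "none"), ("128", "right foot")]
print(ground_timings(results))
```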
The output unit 335 outputs the ground timing information 325, the ground determination information 324, and the like. For example, the output unit 335 can output at least one of these pieces of information to an external device, the smartphone 200, or the like.
The above is a configuration example of the determination device 300. Next, the operation of the determination device 300 will be described with reference to fig. 8 and 9.
First, an operation example of the ground determination unit 333 will be described with reference to fig. 8. Referring to fig. 8, the ground determination unit 333 compares the Y coordinate of the knee in the image data to be determined with the Y coordinate of the knee in the immediately preceding image data of the time series (step S101). When the displacement from the knee's Y coordinate in the preceding image data exceeds 0 (yes in step S101), the ground determination unit 333 checks whether the displacement of the ankle satisfies the condition (step S102).
When the displacement of the ankle satisfies the condition (yes in step S102), the ground determination unit 333 determines that the foot satisfying the condition is grounded (step S103). On the other hand, when the displacement of the knee's Y coordinate is 0 or less (no in step S101), or when the displacement of the ankle does not satisfy the condition (no in step S102), the ground determination unit 333 determines that the foot not satisfying the condition is not grounded.
The above is an example of the operation of the ground determination unit 333. For example, the ground determination unit 333 can decide that the displacement of an ankle satisfies the condition when the degree of change of its Y coordinate is small enough for the Y coordinate to be regarded as fixed, when the other foot is not judged to be moving, and when the change of the knee coordinates satisfies the condition. The ground determination unit 333 may also first select the target ankle based on the Y coordinate and perform the above checks only for the selected ankle.
Next, an operation example of the ground timing determination unit 334 will be described with reference to fig. 9. Referring to fig. 9, the ground timing determination unit 334 refers to the ground determination information 324 and checks whether the grounded foot shown by the ground determination result has changed (step S201).
When the grounded foot has changed (yes in step S201), the ground timing determination unit 334 determines that the timing of the change is the ground contact timing (step S202). On the other hand, when the grounded foot shown by the determination result has not changed (no in step S201), the ground timing determination unit 334 does not make the above determination.
The above is a processing example of the ground timing determination unit 334.
As described above, the determination device 300 includes the bone recognition unit 332, the ground determination unit 333, and the ground timing determination unit 334. According to such a configuration, the ground timing determination unit 334 can determine the timing of the foot ground based on the determination result of the ground determination unit 333 on the grounded foot using the coordinates of the bone recognized by the bone recognition unit 332. That is, according to the above configuration, the timing of foot grounding can be grasped based on the image data.
The determination device 300 may be configured to determine the grounding timing based on the result of bone recognition by an external device that performs bone recognition. In the case of such a configuration, the determination device 300 may not have a function as the bone recognition unit 332.
[ embodiment 2 ]
Next, embodiment 2 of the present disclosure will be described with reference to fig. 10 to 13. Fig. 10 is a diagram showing a structural example of the computing system 400. Fig. 11 is a block diagram showing a configuration example of the computing device 500. Fig. 12 is a diagram showing an example of the length and height between hands. Fig. 13 is a flowchart illustrating an example of the operation of the computing device 500.
In embodiment 2 of the present disclosure, a calculation system 400 that calculates an index value indicating the bending of the waist of a person based on image data acquired using an imaging device such as the smartphone 200 will be described. In the computing system 400, as in embodiment 1, image data representing a situation in which a person walks from the far side toward the near side of the screen is acquired, and the positions of the bones are identified from the image data. The computing system 400 then calculates an index value showing the curvature of the waist based on the identified result.
As described later, the computing system 400 described in this embodiment acquires the inter-hand length, which is the sum of the intra-image distances of the bone lengths connecting the parts present between the left hand and the right hand, and the height, which is the intra-image distance from the head to the foot. The computing system 400 then calculates an index value representing the curvature of the waist based on the acquired inter-hand length and height. When the waist bends, the height changes with the bending, whereas the inter-hand length does not change even when the waist is bent. In addition, the inter-hand length with both arms spread left and right is generally about equal to the height. The computing system 400 calculates the index value representing the curvature of the waist by using these properties. Furthermore, it is known that the ratio of the inter-hand length to the height gradually changes with age, height, and the like. Accordingly, the calculation system 400 may be configured to add a predetermined correction value to the inter-hand length according to person attributes such as age, height, and sex, and to calculate the index value indicating the bending of the waist based on the corrected inter-hand length and height.
The computing system 400 described in this embodiment can have a function as the determination system 100 described in embodiment 1. That is, the computing system 400 may have a structure for making a determination of the ground timing.
Fig. 10 shows an example of the structure of a computing system 400. Referring to fig. 10, a computing system 400, for example, has a smartphone 200 and a computing device 500. As shown in fig. 10, the smartphone 200 and the computing device 500 are connected, for example, by wireless or wired, so as to be able to communicate with each other.
The structure of the smartphone 200 is the same as that of embodiment 1. The description of the smartphone 200 is therefore omitted.
The calculation device 500 is an information processing device that calculates an index value indicating the bending of the waist based on the image data acquired by the smartphone 200. For example, the computing device 500 is a server device or the like. The computing device 500 may be a single information processing device, or may be implemented in the cloud, for example.
Fig. 11 shows a configuration example of a computing device 500. Referring to fig. 11, a computing device 500 includes, for example, a communication I/F unit 510, a storage unit 520, and a calculation processing unit 530 as main components.
The communication I/F section 510 includes a data communication circuit. The communication I/F unit 510 performs data communication with an external device, the smartphone 200, or the like connected via a communication line.
The storage unit 520 is a storage device such as a hard disk or a memory. The storage unit 520 stores processing information and programs 527 necessary for various processes in the arithmetic processing unit 530. The program 527 realizes various processing units by being read and executed by the arithmetic processing unit 530. The program 527 is read from an external device or a storage medium via a data input/output function such as the communication I/F unit 510, and stored in the storage unit 520. The main information stored in the storage unit 520 includes, for example, a learned model 521, image information 522, bone information 523, hand length information 524, height information 525, and bending value information 526 of the waist.
The learned model 521, the image information 522, and the bone information 523 are the same information as the learned model 321, the image information 322, and the bone information 323 described in embodiment 1.
The inter-hand length information 524 shows the inter-hand length, which is the sum of the intra-image distances of the bone lengths connecting the parts present between the left hand and the right hand, such as the palm, wrist, elbow, and shoulder. The inter-hand length information 524 is, for example, calculated in advance based on image data acquired in a state where the arms are extended downward so that the inter-hand length is longest, acquired from an external device or the like via the communication I/F unit 510 or the like, and stored in the storage unit 520. The inter-hand length information 524 may also be generated and updated as a result of the calculation process of the inter-hand length calculation section 533.
For example, the inter-hand length information 524 includes information representing the inter-hand length. The inter-hand length information 524 may include information representing the inter-hand length acquired for each image data of the time series. For example, the inter-hand length information 524 may be information in which identification information representing image data used in calculating the inter-hand length is associated with information representing the inter-hand length.
The height information 525 indicates the height of the person in the image data, that is, the intra-image distance connecting the head (forehead) and the toe (for example, the lower of the left and right toes). The height information 525 is, for example, generated and updated as a result of the calculation process of the height calculating section 534.
For example, the height information 525 includes information indicating a height acquired for each image data of the time series. The height information 525 may be information or the like associating identification information representing image data used in calculating the height with information representing the height. The height information 525 may be information or the like in which identification information indicating image data satisfying a predetermined condition is associated with information indicating a height.
The waist bending value information 526 shows a waist bending value as an index value indicating bending of the waist. The bending value information 526 of the waist is generated and updated as a result of the calculation process of the bending calculation section 535, for example.
For example, the bending value information 526 of the waist includes information indicating the bending value of the waist. The bending value information 526 of the waist may include information indicating the bending value of the waist obtained for each image data of the time series. The bending value information 526 of the waist may be information or the like in which identification information representing image data used when calculating the bending value of the waist is associated with information representing the bending value of the waist. The bending value information 526 of the waist may be information or the like in which identification information indicating image data satisfying a predetermined condition is associated with information indicating the bending value of the waist.
The arithmetic processing unit 530 includes an arithmetic device such as a CPU and peripheral circuits thereof. The arithmetic processing unit 530 reads and executes the program 527 from the storage unit 520, and realizes various processing units by making the above-described hardware cooperate with the program 527. Examples of the main processing units implemented by the arithmetic processing unit 530 include an image acquisition unit 531, a bone recognition unit 532, an inter-hand length calculation unit 533, a height calculation unit 534, a curvature calculation unit 535, and an output unit 536.
The image acquisition unit 531 and the bone recognition unit 532 can perform the same processing as the image acquisition unit 331 and the bone recognition unit 332 described in embodiment 1.
The inter-hand length calculation unit 533 calculates an inter-hand length, which is the sum of intra-image distances of bone lengths connecting the left hand and the right hand, based on the bone information 523. Then, the inter-hand length calculation unit 533 saves the calculated inter-hand length as the inter-hand length information 524 in the storage unit 520.
For example, fig. 12 shows an example of the inter-hand length. Referring to fig. 12, the inter-hand length calculation unit 533 refers to the bone information 523 to acquire information showing the positions of the parts present between the left and right hands, such as the palm, the wrist, the elbow, and the shoulder. Then, the inter-hand length calculation unit 533 calculates, based on the acquired information, the sum of the distances between the respective parts when adjacent parts are connected by straight lines. In this way, the inter-hand length calculation unit 533 calculates the inter-hand length based on the positions of the parts present between the left and right hands.
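The summation described above can be sketched as follows. The joint names, the dictionary layout, and the (x, y) coordinate convention are illustrative assumptions; the disclosure only specifies that adjacent parts between the left and right hands are connected by straight lines and the distances summed.

```python
import math

# Hypothetical sketch of the inter-hand length: the sum of straight-line
# distances between adjacent parts from the left palm to the right palm.
def inter_hand_length(joints):
    """joints: dict mapping a part name to its (x, y) image coordinates."""
    chain = ["l_palm", "l_wrist", "l_elbow", "l_shoulder",
             "r_shoulder", "r_elbow", "r_wrist", "r_palm"]
    total = 0.0
    for a, b in zip(chain, chain[1:]):
        (ax, ay), (bx, by) = joints[a], joints[b]
        total += math.hypot(bx - ax, by - ay)  # length of one bone segment
    return total
```

The chain of parts used here is one plausible choice; a skeleton model with different joints would simply use a different `chain` list.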
The inter-hand length calculation unit 533 can perform the above-described inter-hand length calculation process on each image data of the time series. That is, the inter-hand length calculation unit 533 may be configured to calculate the inter-hand length in the image data for each image data. The inter-hand length calculation unit 533 may be configured to calculate the inter-hand length in image data extracted from time-series image data under an arbitrary condition.
The inter-hand length calculation unit 533 may calculate the inter-hand length using parts other than those in the above example. Alternatively, the information indicating the inter-hand length may be calculated in advance. When the information indicating the inter-hand length is acquired in advance, the computing device 500 need not include the inter-hand length calculating unit 533.
The height calculating unit 534 calculates the height, which is the distance in the image from the head to the foot, based on the bone information 523. Then, the height calculating unit 534 stores the calculated height as height information 525 in the storage unit 520.
For example, fig. 12 shows an example of the height. Referring to fig. 12, the height calculating unit 534 refers to the bone information 523 to obtain information indicating the positions of the forehead, which is the highest point, and the left and right toes. Then, the height calculating unit 534 calculates the distance connecting, with a straight line, the forehead and the lower of the left and right toes, based on the acquired information. Alternatively, the height calculating unit 534 calculates the difference in the Y-axis direction (the up-down direction in fig. 12) between the forehead and the lower of the left and right toes, based on the acquired information. In this way, the height calculation unit 534 calculates the height based on the positions of the head and the foot.
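The Y-axis-difference variant of the height calculation can be sketched as follows. The use of image-style coordinates where Y increases downward is an assumption for this sketch, as are the function and argument names.

```python
# Hypothetical sketch of the height as the Y-axis difference between the
# forehead (highest point) and the lower of the left and right toes.
# Assumes image coordinates where Y increases downward.
def height_in_image(forehead, l_toe, r_toe):
    """Each argument is an (x, y) image coordinate."""
    # With Y increasing downward, the lower toe has the larger Y value.
    lower_toe = l_toe if l_toe[1] >= r_toe[1] else r_toe
    return lower_toe[1] - forehead[1]  # difference in the Y-axis direction
```

Under the opposite convention (Y increasing upward), the comparison and the subtraction would simply be reversed.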
The height calculating unit 534 can perform the height calculation process described above on each image data of the time series, similarly to the inter-hand length calculating unit 533. That is, the height calculating unit 534 may be configured to calculate, for each image data, the height in that image data. The height calculating unit 534 may also be configured to calculate the height in image data extracted from the time-series image data under an arbitrary condition.
The height calculation unit 534 may calculate the height using parts other than those described above. For example, the height calculating unit 534 may calculate, as the height, a length between parts that includes the waist, such as the length between the head and the ankle.
The curve calculating unit 535 calculates a curve value of the waist, which is an index value indicating the curve of the waist, based on the inter-hand length indicated by the inter-hand length information 524 and the height indicated by the height information 525. Then, the bending calculation unit 535 stores the calculated bending value of the waist in the storage unit 520 as bending value information 526 of the waist.
For example, the bending calculation section 535 calculates the bending value of the waist by dividing the inter-hand length by the height. That is, the bending calculation section 535 calculates the bending value of the waist by calculating (inter-hand length)/(height). For example, the bending calculation unit 535 can calculate the bending value of the waist using the inter-hand length and the height calculated for each image data.
Further, the inter-hand length may be shortened by arm swing or the like during walking. Accordingly, the bending calculation unit 535 may calculate the bending value of the waist using an inter-hand length selected from the inter-hand length information 524 based on a predetermined reference, such as the longest inter-hand length among those included in the inter-hand length information 524. In other words, the inter-hand length used when the bending calculation unit 535 calculates the bending value of the waist may be one selected, based on a predetermined reference, from among the inter-hand lengths included in the inter-hand length information 524. The selection reference, such as using the longest inter-hand length, may be set arbitrarily. The inter-hand length information 524 may also include only one inter-hand length acquired in advance, so the inter-hand length used when the bending calculation unit 535 calculates the bending value of the waist may be a fixed value, for example.
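The calculation of (inter-hand length)/(height) combined with the longest-length selection rule can be sketched as follows; the function name and the list-based interface are assumptions for illustration.

```python
# Hypothetical sketch of the waist bending value: the longest inter-hand
# length in the series (one possible predetermined reference) divided by
# the height, i.e. (inter-hand length) / (height).
def waist_bend_value(inter_hand_lengths, height):
    """inter_hand_lengths: per-frame values; height: in-image height."""
    longest = max(inter_hand_lengths)  # selection by the "longest" reference
    return longest / height
```

A value near 1 would then correspond to an inter-hand length about equal to the height, while larger values would indicate that the in-image height has decreased relative to the inter-hand length, e.g. due to bending of the waist.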
The curve calculation unit 535 may determine image data or the like to be calculated as the curve value of the waist using the determination result of the grounding timing described in embodiment 1. For example, the curve calculating unit 535 may be configured to calculate the curve value of the waist using the height acquired based on the image data determined to be the ground contact timing. The curve calculating unit 535 may be configured to calculate the curve value of the waist using the height obtained based on the image data of the predetermined frame number from the image data determined as the grounding timing. By calculating the bending value of the waist based on the image data (based on the height) determined based on the determination result of the grounding timing in this way, the bending value of the waist more suitable as an index to be compared with other people can be calculated. The curve calculation unit 535 may be configured to calculate a curve value of the waist for each image data of the time series, and calculate an average value or the like of the calculated curve values of the waist.
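The variant that restricts the calculation to frames determined to be ground timings and then averages can be sketched as follows. The per-frame data layout, the frame indices, and the use of a single fixed inter-hand length are illustrative assumptions.

```python
# Hypothetical sketch: compute the waist bending value only at frames
# determined to be ground timings, then average the results.
def mean_bend_at_ground_timings(heights, inter_hand_length, ground_frames):
    """heights: per-frame in-image heights; ground_frames: ground-timing indices."""
    values = [inter_hand_length / heights[f] for f in ground_frames]
    return sum(values) / len(values)  # average over the selected frames
```

Restricting the calculation in this way evaluates posture at comparable moments of the gait cycle, which is why the resulting value is described as better suited for comparison between people.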
The output unit 536 outputs the inter-hand length information 524, the height information 525, the bending value information 526 of the waist, and the like. For example, the output unit 536 can output at least one of the above information to an external device, the smartphone 200, or the like.
The above is a configuration example of the computing device 500. Next, an operation example of the computing device 500 will be described with reference to fig. 13.
Referring to fig. 13, the bending calculation unit 535 acquires the inter-hand length shown by the inter-hand length information 524 and the height shown by the height information 525 (step S301).
The curve calculating unit 535 calculates a curve value of the waist, which is an index value indicating the curve of the waist, based on the inter-hand length and the height (step S302). For example, the bending calculation section 535 calculates the bending value of the waist by dividing the inter-hand length by the height.
The above is an example of the operation of the computing device 500.
As such, the computing device 500 has the bending computing section 535. According to such a configuration, the computing device 500 can calculate the bending value of the waist based on the acquired inter-hand length and height. That is, according to the above configuration, an index value corresponding to the curvature of the waist can be calculated based on the image data.
The computing device 500 may have the function as the determination device 300 described in embodiment 1. The computing device 500 may have the same modification as the determination device 300 described in embodiment 1.
[ embodiment 3 ]
Next, embodiment 3 of the present disclosure will be described with reference to fig. 14 to 19. Fig. 14 is a diagram showing a configuration example of the computing system 600. Fig. 15 is a block diagram showing a configuration example of the computing device 700. Fig. 16 is a view showing an example of the height. Fig. 17 and 18 are diagrams showing an example of the foot component. Fig. 19 is a flowchart showing an example of the operation of the computing device 700.
In embodiment 3 of the present disclosure, a description will be given of a calculation system 600 that calculates an index value indicating a state of an object of interest based on image data acquired using an imaging device such as the smartphone 200. In the computing system 600, as in embodiment 1 and embodiment 2, image data indicating that a person walks from the far side toward the near side of the screen is acquired, and the position of the bone is identified from the image data. Then, the computing system 600 calculates an index value representing the state of the object based on the recognized result.
For example, the calculation system 600 described in the present embodiment focuses on the foot and calculates index values indicating the state of the toe, such as the lifting of the toe and the orientation of the toe. Specifically, for example, the computing system 600 acquires the difference in the X-axis direction and the difference in the Y-axis direction between the positions of the ankle and the toe, which are parts present on the foot as the target of interest, together with a comparison length serving as the comparison target. Then, the computing system 600 calculates an index value indicating the orientation of the toe based on the difference in the X-axis direction and the comparison length, and calculates an index value indicating the lifting of the toe based on the difference in the Y-axis direction and the comparison length. The closer a person is to the camera serving as the imaging device, the larger the person is captured. Therefore, even when the toe is lifted to the same degree or oriented in the same direction, the length in the image data differs according to the distance of the person from the imaging device. The calculation system 600 can calculate index values that avoid this problem by comparing the difference in the X-axis direction or the difference in the Y-axis direction between the ankle and toe positions with the comparison length, which is the length of the comparison target.
In the present embodiment, a case where the foot is the target of interest will be described as an example. However, the target of interest may be any part other than the foot. That is, the computing system 600 may use a difference in the X-axis direction or a difference in the Y-axis direction between the positions of parts of a target of interest other than the ankle and the toe. The comparison length is a length between parts of the person that is physically fixed, although its length in the image data varies according to the distance between the person and the camera serving as the imaging device at the time of image data acquisition. For example, the comparison length may be one of the height, which is the intra-image distance from the head to the foot (assuming the bending of the waist is not considered), the inter-hand length described in embodiment 2, and the 2nd height, which is the intra-image distance from the head to the pelvis. The comparison length may also be a length between parts other than the above examples.
The computing system 600 described in this embodiment may have functions as the determination system 100 described in embodiment 1 and the computing system 400 described in embodiment 2. That is, the computing system 600 may have a structure for making a determination of the grounding timing, a structure for calculating the bending value of the waist, or the like.
Fig. 14 shows an example of the structure of a computing system 600. Referring to fig. 14, a computing system 600 has, for example, a smartphone 200 and a computing device 700. As shown in fig. 14, the smartphone 200 and the computing device 700 are connected, for example, by wireless or wired, so as to be able to communicate with each other.
The structure of the smartphone 200 is the same as that of embodiment 1. The description of the smartphone 200 is therefore omitted.
The calculation device 700 is an information processing device that calculates index values indicating the lifting of the toe and the orientation of the toe based on the image data acquired by the smartphone 200. For example, the computing device 700 is a server device or the like. The computing device 700 may be a single information processing device, or may be implemented in the cloud or the like. The calculation device 700 may also calculate only one of the index value indicating the lifting of the toe and the index value indicating the orientation of the toe.
Fig. 15 shows a configuration example of a computing device 700. Referring to fig. 15, the computing device 700 includes, for example, a communication I/F unit 710, a storage unit 720, and a calculation processing unit 730 as main components.
The communication I/F section 710 includes a data communication circuit. The communication I/F unit 710 performs data communication with an external device, the smartphone 200, or the like connected via a communication line.
The storage unit 720 is a storage device such as a hard disk or a memory. The storage unit 720 stores processing information and programs 727 necessary for various processes in the arithmetic processing unit 730. The program 727 is read and executed by the arithmetic processing unit 730 to realize various processing units. The program 727 is read from an external device or a storage medium via a data input/output function such as the communication I/F unit 710, and stored in the storage unit 720. The main information stored in the storage unit 720 includes, for example, a learned model 721, image information 722, bone information 723, comparison length information 724, foot component information 725, toe data information 726, and the like.
The learned model 721, the image information 722, and the bone information 723 are the same information as the learned model 321, the image information 322, and the bone information 323 described in embodiment 1.
The comparison length information 724 indicates a comparison length, which is a length between parts of the person that is physically fixed, although its length in the image data varies according to the distance of the person from the camera serving as the imaging device at the time of image data acquisition. For example, the comparison length information 724 indicates one of the height, which is the intra-image distance from the head to the foot, the inter-hand length, the 2nd height, which is the intra-image distance from the head to the pelvis, and the like. The comparison length information 724 is, for example, generated and updated as a result of the calculation process of the comparison length calculation section 733.
For example, the comparison length information 724 includes information indicating a comparison length acquired for each image data of the time series. For example, the comparison length information 724 may be information or the like in which identification information representing image data used in calculating the comparison length is associated with information representing the comparison length.
The foot component information 725 indicates inter-part lengths such as an X component, which is the difference in the X-axis direction between the positions of the ankle and the toe, and a Y component, which is the difference in the Y-axis direction between those positions. The foot component information 725 is, for example, generated and updated as a result of the calculation process of the foot component calculating section 734. The foot component information 725 may include only one of the X component and the Y component.
For example, the foot component information 725 includes information representing the X component and the Y component acquired for each image data of the time series. For example, the foot component information 725 may be information in which identification information indicating the image data used in calculating the X component and the Y component is associated with information representing the X component and the Y component.
The toe data information 726 shows an index value indicating the lifting of the toe and an index value indicating the orientation of the toe. The toe data information 726 is generated and updated as a result of calculation processing by the toe data calculation section 735, for example.
For example, the toe data information 726 includes an index value indicating the lifting of the toe and an index value indicating the orientation of the toe. The toe data information 726 may include these index values acquired for each piece of time-series image data. The toe data information 726 may be information in which identification information indicating the image data used in calculating the index values is associated with information representing those index values. The toe data information 726 may also be information in which identification information indicating image data satisfying a predetermined condition is associated with information representing those index values.
The arithmetic processing unit 730 includes an arithmetic device such as a CPU and peripheral circuits thereof. The arithmetic processing unit 730 reads and executes the program 727 from the storage unit 720, and realizes various processing units by making the hardware cooperate with the program 727. Examples of the main processing units implemented by the arithmetic processing unit 730 include an image acquisition unit 731, a bone recognition unit 732, a comparison length calculation unit 733, a foot component calculation unit 734, a toe data calculation unit 735, and an output unit 736.
The image acquisition unit 731 and the bone recognition unit 732 can perform the same processing as the image acquisition unit 331 and the bone recognition unit 332 described in embodiment 1.
The comparison length calculation unit 733 calculates, based on the bone information 723, a comparison length that is one of the height, which is the intra-image distance from the head to the foot, the inter-hand length, the 2nd height, which is the intra-image distance from the head to the pelvis, and the like. Then, the comparison length calculation unit 733 stores the calculated comparison length in the storage unit 720 as the comparison length information 724.
For example, fig. 16 shows the height as an example of the comparison length. Referring to fig. 16, the comparison length calculation unit 733 refers to the bone information 723 to obtain information indicating the positions of the forehead, which is the highest point, and the left and right toes. Then, the comparison length calculation unit 733 calculates the distance connecting, with a straight line, the forehead and the lower of the left and right toes, based on the acquired information. Alternatively, the comparison length calculation unit 733 calculates the difference in the Y-axis direction (the up-down direction in fig. 16) between the forehead and the lower of the left and right toes, based on the acquired information. In this way, the comparison length calculation unit 733 calculates the height based on the positions of the head and the foot. The comparison length calculation unit 733 may also calculate, as the comparison length, the inter-hand length, the 2nd height, which is the intra-image distance from the head to the pelvis, or the like by similar processing.
The foot component calculation unit 734 refers to the bone information 723 and calculates at least one of the following inter-part lengths: the X component, which is the difference in the X-axis direction between the positions of the ankle and the toe, and the Y component, which is the difference in the Y-axis direction between those positions. Then, the foot component calculation unit 734 stores the calculated foot component as the foot component information 725 in the storage unit 720.
For example, fig. 17 shows an example of the Y component. Referring to fig. 17, the foot component calculating unit 734 refers to the bone information 723 to acquire information indicating the positions of the ankle and the toe. Then, the foot component calculating unit 734 calculates the difference in the Y-axis direction (the up-down direction in fig. 17) between the ankle and the toe based on the acquired information. In this way, the foot component calculation unit 734 calculates the Y component based on the positions of the ankle and the toe.
Fig. 18 shows an example of the X component. Referring to fig. 18, the foot component calculating unit 734 refers to the bone information 723 to acquire information indicating the positions of the ankle and the toe. Then, the foot component calculating unit 734 calculates the difference in the X-axis direction (the left-right direction in fig. 18) between the ankle and the toe based on the acquired information. In this way, the foot component calculating unit 734 calculates the X component based on the positions of the ankle and the toe.
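The two coordinate differences can be sketched as follows. Returning absolute differences, and the function and argument names, are assumptions made here to stay neutral about the sign convention, which the description leaves open.

```python
# Hypothetical sketch of the foot components: the X component and Y component
# are the coordinate differences between the ankle and toe positions.
def foot_components(ankle, toe):
    """ankle, toe: (x, y) image coordinates; returns (x_component, y_component)."""
    x_component = abs(toe[0] - ankle[0])  # difference in the X-axis direction
    y_component = abs(toe[1] - ankle[1])  # difference in the Y-axis direction
    return x_component, y_component
```

A signed version could instead distinguish, for example, a toe lifted above the ankle from one below it, if the skeleton coordinate convention is fixed.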
The foot component calculation unit 734 may be configured to calculate, for example, the Y component and the X component of each of the left and right feet, or may be configured to calculate only the Y component or the X component of the foot located below among the left and right feet. The foot component calculation unit 734 may be configured to determine whether to calculate the foot component using the determination result of the grounding timing described in embodiment 1. For example, the foot component calculation unit 734 may be configured to calculate the Y component and the X component based on the image data determined to be the ground timing. The foot component calculation unit 734 may be configured to calculate the Y component and the X component based on image data of a predetermined frame number from the image data determined to be the ground contact timing.
The toe data calculation unit 735 calculates an index value indicating the lifting of the toe based on the comparison length indicated by the comparison length information 724 and the Y component indicated by the foot component information 725, and calculates an index value indicating the orientation of the toe based on the comparison length and the X component indicated by the foot component information 725. The toe data calculation unit 735 then stores the calculated index values as the toe data information 726 in the storage unit 720.
For example, the toe data calculation unit 735 calculates the index value indicating the lifting of the toe by dividing the Y component by the comparison length, that is, by calculating (Y component)/(comparison length). The toe data calculation unit 735 can calculate this index value using the Y component calculated for each piece of image data and the comparison length.
Similarly, the toe data calculation unit 735 calculates the index value indicating the orientation of the toe by dividing the X component by the comparison length, that is, by calculating (X component)/(comparison length). The toe data calculation unit 735 can calculate this index value using the X component calculated for each piece of image data and the comparison length.
The toe data calculation unit 735 may determine the image data for which the index value indicating the lifting of the toe and the index value indicating the orientation of the toe are calculated, using the determination result of the ground-contact timing described in embodiment 1. For example, the toe data calculation unit 735 may be configured to calculate the two index values using the Y component, the X component, and the comparison length acquired from the image data determined to correspond to the ground-contact timing, or from image data within a predetermined number of frames from that image data. By performing the calculation on image data selected according to the ground-contact determination result, the toe lift and toe orientation at the moment the foot touches the ground can be evaluated.
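The two divisions described above can be written down directly. In the sketch below, the function names and example values are illustrative only; the formulas themselves, (Y component)/(comparison length) and (X component)/(comparison length), follow the passage above.

```python
def toe_lift_index(y_component, comparison_length):
    """(Y component) / (comparison length): index of toe lift."""
    return y_component / comparison_length


def toe_orientation_index(x_component, comparison_length):
    """(X component) / (comparison length): index of toe orientation."""
    return x_component / comparison_length


# Dividing by the comparison length normalizes the raw pixel differences
# for the person's apparent size in the image, so the index values can be
# compared across frames and across subjects.
lift = toe_lift_index(y_component=20.0, comparison_length=400.0)
orientation = toe_orientation_index(x_component=30.0, comparison_length=400.0)
print(lift, orientation)  # 0.05 0.075
```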
The output unit 736 outputs the comparison length information 724, the foot component information 725, the toe data information 726, and the like. For example, the output unit 736 can output at least one of these pieces of information to an external device, the smartphone 200, or the like.
The above is a configuration example of the computing device 700. Next, an operation example of the computing device 700 will be described with reference to fig. 19.
Referring to fig. 19, the toe data calculation unit 735 acquires the comparison length indicated by the comparison length information 724 and the Y component and the X component indicated by the foot component information 725 (step S401).
The toe data calculation unit 735 calculates the toe data based on the comparison length, the Y component, and the X component. For example, the toe data calculation unit 735 calculates the index value indicating the lifting of the toe based on the comparison length and the Y component, and calculates the index value indicating the orientation of the toe based on the comparison length and the X component.
The above is an example of the operation of the computing device 700.
In this way, the computing device 700 has the toe data calculation unit 735. With this configuration, the computing device 700 can calculate the index value indicating the lifting of the toe and the index value indicating the orientation of the toe based on the acquired comparison length, Y component, and X component. That is, with the above configuration, index values indicating the state of the toe, such as its lift and orientation, can be calculated based on the image data.
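Restricting the calculation to ground-contact frames, as suggested earlier, can be pictured as a per-frame filter. Everything in the sketch below — the frame-record layout, the set of contact frame ids, and the example values — is an assumption made for illustration; the disclosure itself does not specify a data format.

```python
def indices_at_ground_contact(frames, contact_frame_ids, comparison_length):
    """Compute toe-lift and toe-orientation indices only for frames
    judged to be ground-contact timing.

    `frames` maps frame id -> (x_component, y_component);
    `contact_frame_ids` holds the frame ids judged as ground contact.
    """
    results = {}
    for frame_id, (x_c, y_c) in frames.items():
        if frame_id in contact_frame_ids:
            results[frame_id] = (y_c / comparison_length,   # toe lift
                                 x_c / comparison_length)   # toe orientation
    return results


# Made-up per-frame components; frames 0 and 2 are contact frames.
frames = {0: (30.0, 20.0), 1: (10.0, 5.0), 2: (40.0, 8.0)}
out = indices_at_ground_contact(frames, contact_frame_ids={0, 2},
                                comparison_length=400.0)
print(out)  # {0: (0.05, 0.075), 2: (0.02, 0.1)}
```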
In addition, when the object of interest is an arbitrary part other than the foot, the computing device 700 may be configured to calculate, as the Y component and the X component, the lengths between sites other than the ankle and the toe that are present in that object of interest.
The computing device 700 may have the functions of the determination device 300 described in embodiment 1, and may incorporate the same modifications as the determination device 300. Similarly, the computing device 700 may have the functions of the computing device 500 described in embodiment 2, and may incorporate the same modifications as the computing device 500.
[ embodiment 4 ]
Next, embodiment 4 of the present disclosure will be described with reference to fig. 20 to 22. In embodiment 4 of the present disclosure, an outline of the configuration of a computing device 800 as an information processing device will be described.
Fig. 20 shows an example of a hardware configuration of the computing device 800. Referring to fig. 20, as an example, a computing device 800 has the following hardware configuration.
CPU 801 (Central Processing Unit)
ROM 802 (Read Only Memory) (storage device)
RAM 803 (Random Access Memory) (storage device)
Program group 804 loaded into RAM803
Storage 805 for saving program group 804
Drive device 806 for reading from and writing to storage medium 810 external to information processing apparatus
Communication interface 807 connected to communication network 811 outside the information processing apparatus
Input/output interface 808 for inputting/outputting data
Bus 809 connecting the constituent elements
Further, the computing device 800 can realize the functions of the height calculation unit 821 and the bending calculation unit 822 shown in fig. 21 by the CPU 801 acquiring and executing the program group 804. The program group 804 is stored in advance in, for example, the storage device 805 or the ROM 802, and the CPU 801 loads it into the RAM 803 and executes it as necessary. The program group 804 may also be supplied to the CPU 801 via the communication network 811, or may be stored in advance on the storage medium 810, from which the drive device 806 reads the program and supplies it to the CPU 801.
Fig. 20 shows an example of the hardware configuration of the computing device 800; the hardware configuration is not limited to this case. For example, the computing device 800 may be configured with only part of the above-described components, such as without the drive device 806.
The height calculation unit 821 calculates the height of the person in the image data based on the image data.
The bending calculation unit 822 acquires the inter-hand length, which is the length connecting the left and right hands of the person in the image data. The bending calculation unit 822 then calculates an index value indicating the curvature of the waist of the person in the image data based on the acquired inter-hand length and the height calculated by the height calculation unit 821.
As described above, the computing device 800 includes the height calculation unit 821 and the bending calculation unit 822, and the bending calculation unit 822 is configured to acquire the inter-hand length. With this configuration, the bending calculation unit 822 can calculate an index value indicating the curvature of the waist of the person in the image data based on the acquired inter-hand length and the calculated height. As a result, an index value corresponding to the curvature of the waist can be calculated based on the image data.
An information processing apparatus such as the computing device 800 can be realized by installing a predetermined program in the information processing apparatus. Specifically, a program according to another aspect of the present invention causes an information processing apparatus to: calculate the height of the person in the image data based on the image data, acquire the inter-hand length, which is the length connecting the left and right hands of the person in the image data, and calculate an index value indicating the curvature of the waist of the person in the image data based on the acquired inter-hand length and the calculated height.
A calculation method executed by the above-described information processing apparatus is as follows: the information processing apparatus calculates the height of the person in the image data based on the image data, acquires the inter-hand length, which is the length connecting the left and right hands of the person in the image data, and calculates an index value indicating the curvature of the waist of the person in the image data based on the acquired inter-hand length and the calculated height.
An invention in the form of such a program (or a storage medium recording it) or such a calculation method has the same operation and effects as described above, and can therefore achieve the same object as the computing device 800 having the above-described bending calculation unit 822.
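The summary above does not state the exact formula relating the inter-hand length to the height. The sketch below is therefore purely an assumption: it computes the inter-hand length as a Euclidean distance and, mirroring the division-based normalization used for the toe indices elsewhere in this document, divides it by the calculated height. The actual index used by the bending calculation unit 822 may differ.

```python
import math


def inter_hand_length(left_hand, right_hand):
    """Euclidean length connecting the left and right hand positions."""
    lx, ly = left_hand
    rx, ry = right_hand
    return math.hypot(rx - lx, ry - ly)


def waist_bend_index(left_hand, right_hand, height):
    """Hypothetical waist-curvature index: inter-hand length / height.

    Division by height is assumed here only to mirror the normalization
    pattern used elsewhere in the document; it is not taken from the
    disclosure itself.
    """
    return inter_hand_length(left_hand, right_hand) / height


# Made-up pixel coordinates: hands 60 px apart in X and 80 px in Y.
idx = waist_bend_index(left_hand=(100.0, 300.0),
                       right_hand=(160.0, 380.0), height=400.0)
print(idx)  # 0.25
```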
The computing device 800 may also realize the functions of the component calculation unit 831, the comparison length calculation unit 832, and the index value calculation unit 833 shown in fig. 22 by the CPU 801 acquiring and executing the program group 804.
The component calculation unit 831 calculates, based on the image data, the inter-site length between sites existing in the object of interest in the person in the image data.
The comparison length calculation unit 832 calculates, based on the image data, a comparison length to serve as the object of comparison.
The index value calculation unit 833 calculates an index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit 831 and the comparison length calculated by the comparison length calculation unit 832.
As described above, the computing device 800 may include the component calculation unit 831, the comparison length calculation unit 832, and the index value calculation unit 833. With this configuration, the index value calculation unit 833 can calculate an index value indicating the state of the object of interest based on the calculated inter-site length and comparison length. As a result, an index value corresponding to the state of the object of interest can be calculated based on the image data.
This form of the computing device 800 can likewise be realized by installing a predetermined program in an information processing apparatus. Specifically, a program according to another aspect of the present invention causes an information processing apparatus to: calculate, based on image data, the inter-site length between sites existing in the object of interest in the person in the image data, calculate a comparison length to serve as the object of comparison based on the image data, and calculate an index value indicating the state of the object of interest based on the calculated inter-site length and the calculated comparison length.
The calculation method performed by the information processing apparatus is as follows: the information processing apparatus calculates, based on the image data, the inter-site length between sites existing in the object of interest in the person in the image data, calculates a comparison length to serve as the object of comparison based on the image data, and calculates an index value indicating the state of the object of interest based on the calculated inter-site length and the calculated comparison length.
An invention in the form of such a program (or a storage medium recording it) or such a calculation method has the same operation and effects as described above, and can therefore achieve the same object as the computing device 800 having the above-described index value calculation unit 833.
< additional notes >
Some or all of the above embodiments can also be described as in the following supplementary notes. An outline of the computing device and the like according to the present invention is given below. However, the present invention is not limited to the following configurations.
(Supplementary note 1)
A computing device, having:
a component calculation unit that calculates, based on image data, an inter-site length between sites existing in an object of interest in a person in the image data;
a comparison length calculation unit that calculates, based on the image data, a comparison length to serve as the object of comparison; and
an index value calculation unit that calculates an index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
(Supplementary note 2)
The computing device according to supplementary note 1, wherein
the index value calculation unit calculates the index value by dividing the inter-site length by the comparison length.
(Supplementary note 3)
The computing device according to supplementary note 1 or 2, wherein
the component calculation unit calculates, as the inter-site length, a difference in the X-axis direction between the sites existing in the object of interest, and
the index value calculation unit calculates the index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
(Supplementary note 4)
The computing device according to any one of supplementary notes 1 to 3,
the component calculation unit calculates, as the inter-site length, a difference in the Y-axis direction between the sites existing in the object of interest, and
the index value calculation unit calculates the index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
(Supplementary note 5)
The computing device according to any one of supplementary notes 1 to 4,
the component calculation unit calculates the inter-part length based on information indicating coordinates of each part of the person in the image data identified based on the image data.
(Supplementary note 6)
The computing device according to any one of supplementary notes 1 to 5,
the comparison length calculation unit calculates the comparison length based on information indicating the coordinates of each part of the person identified from the image data.
(Supplementary note 7)
The computing device according to any one of supplementary notes 1 to 6,
the index value calculation unit acquires ground contact timing information indicating a timing at which a foot of a person in the image data touches the ground, and calculates the index value based on the inter-site length and the comparison length calculated based on the image data determined based on the ground contact timing information.
(additionally noted 8)
The computing device according to any one of supplementary notes 1 to 7,
the comparison length calculation unit calculates, as the comparison length, the height from the head to the toe of the person in the image data.
(Supplementary note 9)
A calculation method, wherein:
an information processing apparatus calculates, based on image data, an inter-site length between sites existing in an object of interest in a person in the image data;
the information processing apparatus calculates, based on the image data, a comparison length to serve as the object of comparison; and
the information processing apparatus calculates an index value indicating the state of the object of interest based on the calculated inter-site length and the calculated comparison length.
(Supplementary note 10)
A program for causing an information processing apparatus to realize the following processing:
calculating, based on image data, an inter-site length between sites existing in an object of interest in a person in the image data;
calculating, based on the image data, a comparison length to serve as the object of comparison; and
calculating an index value indicating the state of the object of interest based on the calculated inter-site length and the calculated comparison length.
The programs described in the above embodiments are stored in a storage device or recorded on a computer-readable storage medium. The storage medium is, for example, a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
Although the present application has been described above with reference to the above embodiments, the present application is not limited to them. Various changes that those skilled in the art can understand may be made to the configuration and details of the present application within its scope.
This application claims the benefit of priority from Japanese patent application No. 2021-055314, filed on March 29, 2021, the entire contents of which are incorporated herein by reference.
Description of the reference numerals
100: a judgment system; 200: a smart phone; 300: a judging device; 310: a communication I/F section; 320: a storage unit; 321: a model is learned; 322: image information; 323: bone information; 324: the grounding judgment information; 325: ground timing information; 326: a program; 330: an arithmetic processing unit; 331: an image acquisition unit; 332: a bone recognition unit; 333: a grounding judgment unit; 334: a grounding timing determination unit; 335: an output unit; 400: a computing system; 500: a computing device; 510: a communication I/F section; 520: a storage unit; 521: a model is learned; 522: image information; 523: bone information; 524: the length information of the hand; 525: height information; 526: information of the bending value of the waist; 527: a program; 530: an arithmetic processing unit; 531: an image acquisition unit; 532: a bone recognition unit; 533: a hand length calculation unit; 534: a height calculating section; 535: a bending calculation unit; 536: an output unit; 600: a computing system; 700: a computing device; 710: a communication I/F section; 720: a storage unit; 721: a model is learned; 722: image information; 723: bone information; 724: comparing the length information; 725: foot component information; 726: toe data information; 727: a program; 730: an arithmetic processing unit; 731: an image acquisition unit; 732: a bone recognition unit; 733: a comparison length calculation unit; 734: a foot component calculation unit; 735: a toe data calculation unit; 736: an output unit; 800: a computing device; 801: a CPU;802: a ROM;803: a RAM;804: a program group; 805: a storage device; 806: a driving device; 807: a communication interface; 808: an input/output interface; 809: a bus; 810: a storage medium; 811: a communication network; 821: a height calculating section; 822: a bending calculation unit; 831: a component calculation unit; 832: a comparison length calculation unit; 833: an index value calculation unit.

Claims (10)

1. A computing device, having:
a component calculation unit that calculates, based on image data, an inter-site length between sites existing in an object of interest in a person in the image data;
a comparison length calculation unit that calculates, based on the image data, a comparison length to serve as the object of comparison; and
an index value calculation unit that calculates an index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
2. The computing device according to claim 1, wherein
the index value calculation unit calculates the index value by dividing the inter-site length by the comparison length.
3. The computing device according to claim 1 or 2, wherein
the component calculation unit calculates, as the inter-site length, a difference in the X-axis direction between the sites existing in the object of interest, and
the index value calculation unit calculates the index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
4. The computing device according to any one of claims 1 to 3, wherein
the component calculation unit calculates, as the inter-site length, a difference in the Y-axis direction between the sites existing in the object of interest, and
the index value calculation unit calculates the index value indicating the state of the object of interest based on the inter-site length calculated by the component calculation unit and the comparison length calculated by the comparison length calculation unit.
5. The computing device according to any one of claims 1 to 4, wherein
the component calculation unit calculates the inter-site length based on information indicating the coordinates of each part of the person identified from the image data.
6. The computing device according to any one of claims 1 to 5, wherein
the comparison length calculation unit calculates the comparison length based on information indicating the coordinates of each part of the person identified from the image data.
7. The computing device according to any one of claims 1 to 6, wherein
the index value calculation unit acquires ground contact timing information indicating the timing at which the feet of the person in the image data contact the ground, and calculates the index value based on the inter-site length calculated from the image data determined based on the ground contact timing information and the comparison length.
8. The computing device according to any one of claims 1 to 7, wherein
the comparison length calculation unit calculates, as the comparison length, the height from the head to the toe of the person in the image data.
9. A calculation method, wherein:
an information processing apparatus calculates, based on image data, an inter-site length between sites existing in an object of interest in a person in the image data;
the information processing apparatus calculates, based on the image data, a comparison length to serve as the object of comparison; and
the information processing apparatus calculates an index value indicating the state of the object of interest based on the calculated inter-site length and the calculated comparison length.
10. A computer-readable storage medium recording a program for causing an information processing apparatus to realize:
calculating, based on image data, an inter-site length between sites existing in an object of interest in a person in the image data;
calculating, based on the image data, a comparison length to serve as the object of comparison; and
calculating an index value indicating the state of the object of interest based on the calculated inter-site length and the calculated comparison length.
CN202280025797.3A 2021-03-29 2022-02-04 computing device Pending CN117136377A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2021055314 2021-03-29
JP2021-055314 2021-03-29
PCT/JP2022/004472 WO2022209288A1 (en) 2021-03-29 2022-02-04 Calculation device

Publications (1)

Publication Number Publication Date
CN117136377A true CN117136377A (en) 2023-11-28

Family

ID=83458824

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280025797.3A Pending CN117136377A (en) 2021-03-29 2022-02-04 computing device

Country Status (3)

Country Link
JP (1) JPWO2022209288A1 (en)
CN (1) CN117136377A (en)
WO (1) WO2022209288A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11089977B2 (en) * 2016-03-31 2021-08-17 Nec Solution Innovators, Ltd. Gait analyzing device, gait analyzing method, and computer-readable recording medium
WO2018131630A1 (en) * 2017-01-13 2018-07-19 三菱電機株式会社 Operation analysis device and operation analysis method
JP2020188860A (en) * 2019-05-20 2020-11-26 オムロン株式会社 Mountain climber support device, system, method and program
WO2020261404A1 (en) * 2019-06-26 2020-12-30 日本電気株式会社 Person state detecting device, person state detecting method, and non-transient computer-readable medium containing program

Also Published As

Publication number Publication date
WO2022209288A1 (en) 2022-10-06
JPWO2022209288A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
JP5403699B2 (en) Finger shape estimation device, finger shape estimation method and program
EP2904472B1 (en) Wearable sensor for tracking articulated body-parts
Ye et al. A depth camera motion analysis framework for tele-rehabilitation: Motion capture and person-centric kinematics analysis
EP3437557B1 (en) Gait analyzing device, gait analyzing method, and computer-readable recording medium
KR101930652B1 (en) Gait analysis system and computer program recorded on recording medium
US10839526B2 (en) Information processing device, information processing method, and recording medium
JP6733995B2 (en) Work analysis device, work analysis method, and program
US20220222975A1 (en) Motion recognition method, non-transitory computer-readable recording medium and information processing apparatus
CN110910426A (en) Action process and action trend identification method, storage medium and electronic device
US10470688B2 (en) Measurement apparatus, method and non-transitory computer-readable recording medium
CN117136377A (en) computing device
JP6635848B2 (en) Three-dimensional video data generation device, three-dimensional video data generation program, and method therefor
JP2018161397A (en) Diagnostic image processing apparatus, evaluation support method, and program
CN116246343A (en) Light human body behavior recognition method and device
CN108885087B (en) Measuring apparatus, measuring method, and computer-readable recording medium
CN117119958A (en) Determination device
WO2022209287A1 (en) Calculation device
WO2022137450A1 (en) Information processing device, information processing method, and program
JP6170696B2 (en) Image processing apparatus and image processing method
CN110781857A (en) Motion monitoring method, device, system and storage medium
CN113836991A (en) Motion recognition system, motion recognition method, and storage medium
CN113544736A (en) Lower limb muscle strength estimation system, lower limb muscle strength estimation method, and program
JP2020173494A (en) Handwritten character recognizing device and handwritten character recognizing method
US20220061700A1 (en) Human body portion tracking method and human body portion tracking system
JP7307698B2 (en) Work management system and work management method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination