CN117617971A - Physical and mental state evaluation system and physical and mental state evaluation method - Google Patents


Info

Publication number
CN117617971A
CN117617971A (application number CN202311078827.XA)
Authority
CN
China
Prior art keywords
psychological
reproducibility
subject
state
evaluation value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311078827.XA
Other languages
Chinese (zh)
Inventor
阿部悟
高岛慎吾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asics Corp
Original Assignee
Asics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2023110567A external-priority patent/JP2024035087A/en
Application filed by Asics Corp filed Critical Asics Corp
Publication of CN117617971A publication Critical patent/CN117617971A/en
Pending legal-status Critical Current


Abstract

The invention provides a physical and mental state evaluation system and a physical and mental state evaluation method, which are techniques for easily grasping the relationship between a psychological state and athletic performance and for improving that relationship. In a physical and mental state evaluation system (100), an action information acquisition unit (70) acquires information indicating the action state of a subject who repeatedly performs a predetermined physical action. A reproducibility evaluation unit (94) determines a reproducibility evaluation value based on an analysis result concerning the reproducibility of the predetermined physical action. A psychological information acquisition unit (80) acquires information indicating the psychological state of the subject. A psychological assessment unit (96) determines a psychological assessment value based on an analysis result relating to the psychological state of the subject. An evaluation value storage unit (98) stores the reproducibility evaluation value in association with the psychological evaluation value. A result output unit (99) outputs information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value.

Description

Physical and mental state evaluation system and physical and mental state evaluation method
Technical Field
The present invention relates to a physical and mental state evaluation system and a physical and mental state evaluation method, and more particularly, to a technique for evaluating the relationship between physical actions and mental states.
Background
The athletic performance of a player is generally determined by various factors. In particular, when a player falls into a slump or suffers an injury or disorder, instability of the player's psychological state is considered to be an influence in many cases. A technique for quantitatively evaluating the emotion of a user is known (for example, refer to Patent Document 1).
[ Prior Art literature ]
[ patent literature ]
[ Patent document 1 ] Japanese Patent Laid-Open No. 2021-110781
Disclosure of Invention
[ problem to be solved by the invention ]
Even if a player himself or herself recognizes that poor performance stems from his or her psychological state, it is not easy to find a suitable expert, and even when counseling is available, the player must try it to see whether it is effective. Confirming a clear effect of counseling is also expected to require a relatively long time. It is therefore not easy to achieve such an improvement in mental state.
The present invention has been made in view of such circumstances, and an object thereof is to provide a technique capable of easily grasping and improving the relationship between a psychological state and an athletic performance.
[ means of solving the problems ]
In order to solve the above problems, a physical and mental state evaluation system according to an aspect of the present invention includes: a reproducibility evaluation unit that determines a reproducibility evaluation value based on an analysis result concerning the reproducibility of a predetermined physical motion performed by a subject; a psychological assessment unit that determines a psychological assessment value based on an analysis result concerning the psychological state of the subject; an evaluation value storage unit that stores the reproducibility evaluation value in association with the psychological evaluation value; and a result output unit that outputs information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value.
Another aspect of the invention is a physical and mental state evaluation method. The method includes: a step of determining a reproducibility evaluation value based on an analysis result concerning the reproducibility of a predetermined physical motion performed by a subject; a step of determining a psychological evaluation value based on an analysis result related to the psychological state of the subject; a step of storing the reproducibility evaluation value in association with the psychological evaluation value; and a step of outputting information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value.
Any combination of the above structural elements, and expressions of the present invention converted among methods, apparatuses, systems, computer programs, data structures, recording media, and the like, are also effective as aspects of the present invention.
[ Effect of the invention ]
According to the invention, the relationship between a psychological state and athletic performance can be easily grasped, and the psychological state and athletic performance can be improved.
Drawings
Fig. 1 is a diagram showing a schematic configuration of a physical and mental state evaluation system that acquires and analyzes information indicating an operation state of a subject.
Fig. 2 is a functional block diagram showing the respective configurations of the physical and mental state estimation system.
Fig. 3 is a diagram illustrating a procedure of estimating the positions of anatomical feature points from an image obtained when the subject performs a putting motion of golf.
Fig. 4 (a) and 4 (b) are diagrams showing examples of the positions, in two-dimensional coordinates, of a plurality of feature points when the same putting motion is repeated three times.
Fig. 5 (a) and 5 (b) are graphs showing time history characteristics of motion features when the same putting motion is repeated three times.
Fig. 6 is a diagram showing an example of a screen that compares and displays motion reproducibility in the case of a good psychological state with motion reproducibility in the case of a poor psychological state.
Fig. 7 is a diagram illustrating a process of estimating the position of an anatomical feature point from an image obtained when a subject performs a running motion.
Fig. 8 is a diagram illustrating a procedure of estimating the positions of anatomical feature points from images obtained when the subject performs a trial motion.
[ description of symbols ]
10: subject(s)
30: action information acquisition unit
40: psychological information acquisition unit
51: result output unit
70: action information acquisition unit
80: psychological information acquisition unit
94: reproducibility evaluation unit
96: psychological assessment unit
98: evaluation value storage unit
99: result output unit
100: physical and mental state evaluation system
Detailed Description
Hereinafter, the present invention will be described by way of example with reference to the drawings, based on a preferred embodiment of the present invention. In the embodiment and modifications, the same or equivalent components are denoted by the same reference numerals, and overlapping descriptions are omitted as appropriate.
(first embodiment)
Fig. 1 shows a schematic configuration of a physical and mental state evaluation system that acquires and analyzes information indicating the motion state of a subject and information indicating the mental state. In the example of the present embodiment, the putting motion of golf performed by the subject 10 is captured using the camera function of the information terminal 12, and the captured image is transmitted to the physical and mental state evaluation server 50 as information indicating the motion state. A measurement device 20 such as a smart watch, which acquires biological information indicating a psychological state, such as heart rate, blood oxygen concentration, and pupil reaction, and information indicating a motion state, such as arm motion, is worn by the subject 10, for example, on the wrist. Information indicating the psychological state and the motion state of the subject 10 is transmitted from the measurement device 20 to the physical and mental state evaluation server 50 via a prescribed communication path. For example, the measurement device 20 can be linked to a predetermined program on the information terminal 12 by short-range wireless communication, and the information can be transmitted from the information terminal 12 to the physical and mental state evaluation server 50.
The physical and mental state evaluation server 50 is a server on a network that quantifies and evaluates the reproducibility of the motion of the subject 10 and the mental state of the subject 10 based on the information representing the motion state and the information representing the mental state. The information terminal 12 and the physical and mental state evaluation server 50 are connected via a network such as wireless communication or the Internet. The physical and mental state evaluation server 50 receives information from a plurality of subjects 10, and determines evaluation values and manages the information for each subject 10. The physical and mental state evaluation server 50 stores the reproducibility evaluation value in association with the psychological evaluation value, and outputs information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value to the information terminal 12. By browsing the results displayed on the information terminal 12, the subject 10 can grasp the relationship between the reproducibility of the motion and the psychological state. The correlation between the quality of the psychological state and the quality of motion reproducibility can be visualized, and reproducibility evaluation values under various conditions with different psychological states can be displayed for comparison. Therefore, when athletic performance declines, for example, the subject 10 can search on his or her own for clues to improving the psychological state.
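For illustration only (not part of the disclosed embodiment), the correlation between stored pairs of reproducibility evaluation values and psychological evaluation values could be quantified as a Pearson correlation coefficient; all numerical values below are hypothetical:

```python
from statistics import mean, stdev

def pearson_correlation(xs, ys):
    # Pearson correlation coefficient between two equal-length series
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

# hypothetical stored pairs: one reproducibility evaluation value and one
# psychological evaluation value per practice session
reproducibility = [0.82, 0.75, 0.91, 0.60, 0.88]
psychological = [4.1, 3.6, 4.5, 2.9, 4.3]

r = pearson_correlation(reproducibility, psychological)
print(f"correlation coefficient: {r:.3f}")
```

A coefficient near 1.0 would indicate that better psychological states accompany more reproducible motion, which is one simple way the server could render the association for display.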
Fig. 2 is a functional block diagram showing the respective configurations of the physical and mental state evaluation system 100. The figure depicts blocks focused on functions; these functional blocks may be implemented in various forms by hardware, software, or a combination of both. The information terminal 12 and the physical and mental state evaluation server 50 may each be a portable terminal or a computer mainly including a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a random access memory (Random Access Memory, RAM), a read-only memory (Read Only Memory, ROM), an auxiliary storage device, a display device, a communication device, a camera module, and the like, together with programs stored therein.
The physical and mental state evaluation system 100 in the present embodiment includes the measurement device 20, the information terminal 12, and the physical and mental state evaluation server 50. The physical and mental state evaluation system 100 can be implemented by various hardware or software configurations. For example, the configuration of the physical and mental state evaluation system 100 need not include the measuring device 20; various general-purpose devices may be used as the measuring device 20, or the information terminal 12 may serve as the measuring device 20 by exploiting its various functions. On the premise that the motion state or the psychological state of the subject is measured using a general-purpose device or the functions of the information terminal 12 as the measuring device 20, the physical and mental state evaluation system 100 may be constituted by only the information terminal 12 and the physical and mental state evaluation server 50.
Alternatively, by providing the information terminal 12 with both its own functions and the characteristic functions of the physical and mental state evaluation server 50, the system may consist essentially of the information terminal 12 alone, or of the combination of the measuring device 20 and the information terminal 12. Alternatively, the physical and mental state evaluation system 100 may be constituted essentially by the physical and mental state evaluation server 50 alone, by having the server 50 carry all the functions characteristic of the present embodiment and using general-purpose devices as the information terminal 12 and the measuring device 20. Regardless of the hardware configuration, the physical and mental state evaluation system 100 includes at least the software configuration of the information terminal 12 and the physical and mental state evaluation server 50 shown in the present figure.
The measuring device 20 includes a communication section 21, a heartbeat detection section 22, a sound detection section 24, and an operation detection section 26. The measurement device 20 is, for example, a wearable device such as a smart watch worn on the wrist of the subject or smart glasses worn on the eyes like glasses or sunglasses. The heartbeat detection unit 22 is a function for detecting the heart rate of the subject to which the measurement device 20 is attached, and is configured as hardware including, for example, an optical heart rate meter. The sound detection unit 24 is a function of inputting ambient sound, and is configured to include, as hardware, a sound processing device including a microphone, for example. The motion detection unit 26 is a function for detecting a motion of the subject wearing the measuring device 20 on the body, and is configured by hardware including, for example, a motion sensor such as a nine-axis sensor for detecting an arm motion, a camera module for detecting a line of sight of the subject 10, a pressure sensor for detecting a foot pressure or a grip force of the subject 10, and the like.
The information terminal 12 includes an action information acquisition unit 30, a psychological information acquisition unit 40, a result output unit 51, and a communication unit 52. The information terminal 12 is a portable terminal such as a smart phone or a tablet terminal, for example, capable of capturing moving images, displaying images, recording, inputting/outputting sound, detecting operation, and transmitting/receiving information.
The operation information acquisition unit 30 acquires information indicating the operation state of the subject 10 repeatedly performing a predetermined physical operation. The motion information acquisition unit 30 includes a moving image acquisition unit 32, a sound acquisition unit 34, and a motion detection unit 36.
The moving image acquisition unit 32 has the function of acquiring images obtained by capturing the subject 10 repeatedly performing a predetermined physical motion, and is configured to include, for example, a camera module as hardware. For example, as shown in fig. 1, the moving image acquisition unit 32 acquires an image capturing the subject 10 repeatedly performing a putting motion or a swing motion of golf.
The sound acquisition unit 34 has the function of inputting ambient sound, and is configured to include, as hardware, a sound processing device including a microphone, for example. The sound acquisition section 34 can acquire, via the communication section 52, sound information input through the measurement device 20. For example, as shown in fig. 1, the sound acquisition unit 34 acquires a hitting sound, a swing sound, or the like when the subject 10 performs a golf putt or swing.
The motion detection unit 36 has the function of detecting motion of a subject carrying the information terminal 12, and is configured to include a motion sensor such as a nine-axis sensor as hardware. The motion detection section 36 may acquire motion information detected by the measurement device 20 via the communication section 52. In the case where the measuring apparatus 20 is smart glasses, information on the line-of-sight movement of the subject 10 detected by the measuring apparatus 20 is acquired as the motion information.
The psychological information acquiring section 40 acquires information indicating the psychological state of the subject 10. The psychological information acquisition section 40 further acquires disturbance information indicating the degree of disturbance stimulus that may affect the psychological state of the subject 10. The psychological information acquiring unit 40 may acquire information indicating the psychological state of the subject 10 at a timing when a predetermined physical action is performed.
The psychological information acquiring section 40 includes a biological information acquiring section 41, a visual line acquiring section 42, an interference information acquiring section 43, a question answer acquiring section 44, a question result acquiring section 45, and an action information acquiring section 46. The biological information acquisition unit 41 acquires information indicating a psychological state including biological information such as a heart rate or a blood oxygen concentration from the measurement device 20 via the communication unit 52. The sight line acquisition unit 42 detects a sight line movement of the subject from the moving image acquired by the operation information acquisition unit 30. Or the sight line acquisition section 42 may acquire information of the sight line movement of the subject detected by the measuring device 20 such as smart glasses.
The disturbance information acquisition unit 43 acquires disturbance information indicating the degree of a disturbing stimulus that may affect the psychological state of the subject 10 when the subject performs the predetermined physical motion. The disturbance information is input by the subject 10 through a touch panel operation. The disturbing stimulus is, for example, an auditory stimulus such as noise, or a visual stimulus such as something that draws the subject's attention. The disturbing stimulus may be information indicating the external environment when the subject 10 performs the predetermined physical motion, or may be an external influence intentionally applied in order to measure its effect on the mental state. Examples of information indicating the external environment include objective values such as weather, air temperature, humidity, date, time, day of the week, season, grass type, latitude, longitude, and altitude. A disturbing stimulus to the subject's concentration is, for example, a stimulus that distracts the subject's consciousness, such as a memorization task or a calculation task. A disturbing stimulus to the subject's attention is, for example, a stimulus that disturbs the subject's sight, hearing, touch, or the like. The disturbance information may be input, for example, by the subject 10 selecting an item of external stimulus, subjectively judging the degree of the stimulus, and entering a numerical value.
The question-answer acquiring unit 44 acquires answers to questions related to psychological information via input by the subject 10 through touch panel operations. The questions related to psychological information are, for example, question items of a questionnaire for estimating a psychological state. Examples of question items for consciously evaluating psychological conditions include "Q. I can concentrate." and "Q. I am in a good mood." As answers to these questions, a five-grade scale such as "1. Strongly disagree", "2. Disagree", "3. Neither agree nor disagree", "4. Agree", and "5. Strongly agree" can be used.
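As an illustrative sketch only (the question items and the averaging rule below are assumptions, not part of the disclosed embodiment), such five-grade questionnaire answers could be collapsed into a single provisional score:

```python
# hypothetical question items and five-grade scale, following the
# questionnaire format described above
SCALE = {1: "Strongly disagree", 2: "Disagree",
         3: "Neither agree nor disagree", 4: "Agree", 5: "Strongly agree"}
QUESTIONS = ["I can concentrate.", "I am in a good mood."]

def questionnaire_score(answers):
    # average the 1-5 answers into one provisional psychological score
    if any(a not in SCALE for a in answers):
        raise ValueError("answers must be integers from 1 to 5")
    return sum(answers) / len(answers)

print(questionnaire_score([4, 5]))  # → 4.5
```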
The task result acquisition unit 45 acquires the result of a cognitive task for evaluating concentration or attention via input by the subject 10 through touch panel operations. The result of such a cognitive task is, for example, a score such as a correct-answer rate or a response time calculated from the input of the subject 10 in a test that measures how easily concentration or attention is disturbed. An example of such a cognitive task is the following test: characters representing color names such as "yellow", "blue", "green", and "red" are displayed in a mixed arrangement, some displayed in the same color as the color their characters denote and some intentionally displayed in a different color. The subject 10 selects the characters whose meaning matches their display color, and the correct-answer rate and the time taken until all answers are correct are recorded as the test result. With such a test, which readily induces errors at first glance, the concentration or attention of the subject 10 can be evaluated and the psychological state at that point in time can be estimated.
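Purely as an illustration (the scoring rule is an assumption, not the claimed method), the correct-answer rate of the color-word matching test described above could be computed as follows:

```python
def stroop_score(items, selected_indices):
    # items: list of (word, display_color) pairs; selected_indices: the set
    # of indices the subject judged as "word matches its display color".
    # Returns the correct-answer rate over all items.
    matches = {i for i, (word, color) in enumerate(items) if word == color}
    n = len(items)
    correct = len(matches & selected_indices) + (n - len(matches | selected_indices))
    return correct / n

items = [("red", "red"), ("blue", "green"),
         ("yellow", "yellow"), ("green", "red")]
print(stroop_score(items, {0, 2}))  # all matches found → 1.0
print(stroop_score(items, {0, 1}))  # one miss, one wrong pick → 0.5
```

Response time, the other score mentioned in the text, would simply be measured alongside this rate.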
The action information acquiring section 46 acquires, via input by the subject 10 through touch panel operations, information on actions or measures taken by the subject 10 to improve or maintain physical actions or mental states. The action information is, for example, an item name or a numerical value indicating the type of action, such as the amount of exercise or positive thinking.
The result output unit 51 displays on a screen the information indicating the motion state acquired by the motion information acquisition unit 30 and the information indicating the mental state acquired by the mental information acquisition unit 40, and transmits the information to the physical and mental state evaluation server 50 via the communication unit 52. The result output unit 51 also obtains, via the communication unit 52, the various evaluation values determined by the physical and mental state evaluation server 50 and displays them on the screen. The result output unit 51 may further have functions for posting information indicating the motion state, such as an acquired moving image, information indicating the psychological state, or evaluation values acquired from the physical and mental state evaluation server 50 to various social networking services (Social Networking Service, SNS), sharing them with a limited audience, or collecting and displaying such information from the SNS.
The physical and mental state evaluation server 50 includes a communication unit 62, an action information acquisition unit 70, a psychological information acquisition unit 80, an evaluation determination unit 90, an evaluation value storage unit 98, and a result output unit 99.
The operation information acquisition unit 70 is a function corresponding to the operation information acquisition unit 30 of the information terminal 12, and includes a moving image acquisition unit 72, a sound acquisition unit 74, and an operation acquisition unit 76. The moving image acquisition unit 72 acquires an image obtained by capturing the subject 10 repeatedly performing a predetermined body motion from the information terminal 12 via the communication unit 62. The sound acquiring unit 74 acquires sound information from the information terminal 12 via the communication unit 62. The operation acquisition unit 76 acquires information indicating an operation, such as a detection result of the operation sensor or a detection result of the line of sight, from the information terminal 12 via the communication unit 62.
The psychological information acquiring section 80 is a function corresponding to the psychological information acquiring section 40 of the information terminal 12, and includes a biological information acquiring section 81, a visual line acquiring section 82, an interference information acquiring section 83, a question answer acquiring section 84, a task result acquiring section 85, and an action information acquiring section 86. The biological information acquiring unit 81 acquires, from the information terminal 12 via the communication unit 62, information representing a psychological state including biological information such as the heart rate or blood oxygen concentration of the subject detected by the measuring device 20 such as a smart watch. The sight line acquisition unit 82 detects the line-of-sight movement of the subject from the moving image acquired from the information terminal 12 via the communication unit 62. Alternatively, the sight line acquisition section 82 may acquire information on the line-of-sight movement of the subject detected by the measuring device 20 such as smart glasses from the information terminal 12 via the communication section 62. In the present embodiment, the action information acquisition unit 30 and the psychological information acquisition unit 40 in the information terminal 12, and the action information acquisition unit 70 and the psychological information acquisition unit 80 in the physical and mental state evaluation server 50, are described as similar functions. However, the action information acquisition unit and the psychological information acquisition unit may be provided on only one of the information terminal 12 side and the physical and mental state evaluation server 50 side, or their functions may be distributed across both.
The disturbance information acquisition unit 83 acquires, from the information terminal 12 via the communication unit 62, disturbance information indicating the degree of a disturbing stimulus that may affect the psychological state of the subject 10 when the subject performs the predetermined physical motion. The question-answer acquiring unit 84 acquires answers to questions related to psychological information from the information terminal 12 via the communication unit 62. The task result acquisition unit 85 acquires the result of the cognitive task for evaluating concentration or attention from the information terminal 12 via the communication unit 62. The action information acquiring unit 86 acquires information on actions or measures taken by the subject 10 to improve or maintain physical actions or mental states from the information terminal 12 via the communication unit 62.
The evaluation determination unit 90 analyzes and quantifies information indicating the state of motion or information indicating the state of mind, determines an evaluation value, and stores the evaluation value in the evaluation value storage unit 98. The evaluation determination unit 90 includes a reproducibility evaluation unit 94, a psychological evaluation unit 96, and a model processing unit 97.
The reproducibility evaluation unit 94 determines a reproducibility evaluation value based on the analysis result concerning the reproducibility of the predetermined physical motion performed by the subject 10. The reproducibility evaluation unit 94 in the present embodiment analyzes the reproducibility of a putting motion or a swing motion from a moving image of the golf putting or swing motion of the subject 10, and determines the reproducibility evaluation value based on the analysis result.
The reproducibility evaluation unit 94 selects a predetermined motion feature quantity as a predetermined parameter related to reproducibility based on the information indicating the motion state, and quantifies the reproducibility by the degree of coincidence of the motion feature quantity over the lapse of time. The information indicating the motion state mentioned here is a moving image of the golf putting motion or swing motion of the subject 10. The reproducibility evaluation unit 94 estimates the three-dimensional coordinates of anatomical feature points from the image of the subject 10 included in the moving image. The anatomical feature points mentioned here include not only body parts such as the joints of the subject 10 but also feature points of shoes worn by the subject 10 or of props such as golf clubs held by the subject 10. The reproducibility evaluation unit 94 selects at least one of the motion feature quantities of the position, the trajectory, and the movement speed of the feature points based on their temporal change estimated in the moving image. The reproducibility evaluation unit 94 quantifies the reproducibility of the motion of the subject 10 by the degree of coincidence of the motion feature quantities over the lapse of time.
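As an illustrative sketch only (the spread metric and the mapping to a score are assumptions, not the patented computation), the degree of coincidence of feature-point trajectories across repetitions could be quantified like this:

```python
import math

def trajectory_reproducibility(trials):
    # trials: repetitions of the same motion, each a list of (x, y)
    # feature-point positions sampled at identical time steps. The mean
    # point-wise spread across repetitions is mapped to (0, 1];
    # 1.0 means the trajectories coincide perfectly.
    n_steps = len(trials[0])
    spread = 0.0
    for t in range(n_steps):
        pts = [trial[t] for trial in trials]
        cx = sum(p[0] for p in pts) / len(pts)
        cy = sum(p[1] for p in pts) / len(pts)
        spread += sum(math.hypot(px - cx, py - cy) for px, py in pts) / len(pts)
    spread /= n_steps
    return 1.0 / (1.0 + spread)

identical = [[(0, 0), (1, 1), (2, 0)]] * 3  # three perfectly repeated trials
print(trajectory_reproducibility(identical))  # → 1.0
```

A real implementation would operate on the three-dimensional coordinates estimated from the moving image, but the aggregation idea is the same.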
The motion feature quantities selected by the reproducibility evaluation unit 94 include not only feature quantities that are time-history characteristics, which may vary over the time course of the body motion as a whole, but also feature quantities that are instantaneous characteristics occurring at a specific instant in the body motion.
Examples of the feature quantity of the time history characteristic include a position and a trajectory of a feature point, a movement speed history, a joint angle, a joint angular velocity, a joint angular acceleration, an upper limb movement speed, an upper limb movement trajectory, a time history of a trunk posture, a standing posture width, and a face orientation. Other characteristic amounts as time history characteristics include the rhythm or velocity of the swing sound or the hitting sound, the face orientation obtained from the line of sight, the trajectory of the line of sight, the pupil diameter, the swing velocity detected by the motion sensor, the upper limb movement velocity, the upper limb movement trajectory, the hitting intensity, and the like. Examples of the characteristic amount of the instantaneous characteristic generated at a specific moment in the body motion include a maximum swing speed, a striking coefficient, a movable range, a swing sound, a striking sound, and the like.
The various motion feature quantities selected by the reproducibility evaluation unit 94 may be normalized by the standard deviation, or may be expressed as a variation coefficient value obtained by dividing the standard deviation by the average value. For example, when normalization reveals a trial whose reproducibility is markedly poor, such as one containing a clear mistake in the execution of the predetermined physical motion, that trial may be excluded from the evaluation targets.
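For illustration only (the median-based outlier rule and all numbers are assumptions; the text itself specifies only the variation coefficient and the idea of excluding clearly mistaken trials), the variation coefficient value and a simple trial-exclusion step could look like this:

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    # variation coefficient value: standard deviation divided by the mean
    return stdev(values) / mean(values)

def exclude_outlier_trials(values, thresh=3.0):
    # drop trials far from the median (robust against the outlier itself),
    # e.g. a trial containing a clear mistake in executing the motion
    s = sorted(values)
    med = s[len(s) // 2]
    mad = sorted(abs(v - med) for v in values)[len(values) // 2]
    if mad == 0:
        return list(values)
    return [v for v in values if abs(v - med) / mad <= thresh]

swing_speeds = [40.1, 39.8, 40.3, 40.0, 25.0]  # last trial: obvious mishit
kept = exclude_outlier_trials(swing_speeds)
print(kept)  # → [40.1, 39.8, 40.3, 40.0]
print(round(coefficient_of_variation(kept), 4))
```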
The reproducibility evaluation unit 94 may determine a reproducibility evaluation value for each type of feature amount by the degree of coincidence across a plurality of motions. The reproducibility evaluation unit 94 may determine an overall reproducibility evaluation value as the sum of the respective feature amounts or reproducibility evaluation values multiplied by weight coefficients corresponding to their importance. For example, a weight coefficient α is given to feature amounts that are instantaneous characteristics, and a weight coefficient β is given to feature amounts that are time history characteristics, with the weight coefficient α equal to or greater than the weight coefficient β (α ≥ β). In this way, feature amounts at the moments that directly affect athletic performance can be weighted more heavily than those at other moments.
In addition, when a plurality of types of feature amounts are evaluated as instantaneous characteristics, the feature amounts may be normalized by multiplying each type by an individual normalization coefficient so that the common weight coefficient α can be applied to all of them. Similarly, when a plurality of types of feature amounts are evaluated as time history characteristics, the feature amounts may be normalized by multiplying each type by an individual normalization coefficient so that the common weight coefficient β can be applied to all of them. Further, since the appropriate values of the weight coefficient α, the weight coefficient β, and the normalization coefficient for each feature amount differ between individuals, appropriate values may be calculated for each subject by machine learning on the various feature amounts.
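One reading of this weighting scheme is the following sketch of an overall reproducibility evaluation value (illustrative only; the variable names, the default coefficients, and the simple weighted sum are assumptions):

```python
def overall_reproducibility(instant_vals, history_vals,
                            instant_norms, history_norms,
                            alpha=2.0, beta=1.0):
    # Normalize each feature type by its individual coefficient, then apply the
    # common group weights, with alpha >= beta so that instantaneous
    # characteristics are weighted at least as heavily as time history ones.
    assert alpha >= beta
    instant = sum(v * c for v, c in zip(instant_vals, instant_norms))
    history = sum(v * c for v, c in zip(history_vals, history_norms))
    return alpha * instant + beta * history
```

In practice the coefficients would be tuned per subject, for example by the machine learning mentioned above.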
The psychological assessment unit 96 determines a psychological evaluation value based on the analysis result concerning the psychological state of the subject 10. The psychological assessment unit 96 in this embodiment performs an analysis that quantifies the psychological state by a predetermined parameter related to the tendency of the psychological state, based on biological information of the subject 10 or information indicating the psychological state such as answer information input by the subject 10. The biological information mentioned here is, for example, the heart rate or blood oxygen concentration measured by a wearable device such as a smart watch, or the line-of-sight movement detected by a wearable device such as smart glasses. As the predetermined parameter related to the tendency of the psychological state, the psychological assessment unit 96 quantifies the psychological state based on at least one of the tendency of fluctuation of the biological information, the tendency of the subject 10's answers to questions related to psychological information, and the result of a cognitive task for evaluating the concentration or attention of the subject 10. The psychological assessment unit 96 may also quantify the psychological state by the predetermined parameter related to the tendency of the psychological state according to the degree of a disturbance stimulus.
The psychological assessment unit 96 may calculate the psychological evaluation value as the sum of an evaluation value serving as a short-term index and an evaluation value serving as a medium- and long-term index. The evaluation value serving as the short-term index is, for example, a value obtained by multiplying the sum of a value representing the psychological state of concentration and a value representing the psychological state of attention of the subject by a predetermined weight coefficient. The evaluation value serving as the medium- and long-term index is a value obtained by multiplying a value representing the psychological state of the subject's motivation by a predetermined weight coefficient. Since the appropriate values of the weight coefficients given to the short-term index and the medium- and long-term index differ between individuals, appropriate values may be calculated for each subject by machine learning on the various feature amounts. The evaluation value serving as the short-term index may also be the absolute value of the difference between a value indicating the psychological state of concentration and attention in a state where reproducibility is good and a value indicating the psychological state of concentration and attention when the motion is performed under a disturbance stimulus. The evaluation value serving as the medium- and long-term index may also be the absolute value of the difference between a value indicating the psychological state of motivation in a state where reproducibility is good and a value indicating the psychological state of motivation when the exercise is performed in daily life.
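The sum of the short-term and medium- and long-term indices described above could be sketched as follows (the function name, argument names, and example weights are illustrative assumptions):

```python
def psychological_evaluation(concentration, attention, motivation,
                             w_short=1.0, w_mid_long=1.0):
    # Short-term index: (concentration + attention) times its weight coefficient.
    # Medium- and long-term index: motivation times its weight coefficient.
    short_term = w_short * (concentration + attention)
    mid_long_term = w_mid_long * motivation
    return short_term + mid_long_term
```

The weights `w_short` and `w_mid_long` would differ between individuals, as noted above.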
The model processing unit 97 performs machine learning using the reproducibility evaluation value and the psychological evaluation value as teaching data, generates a prediction model, and stores the generated prediction model in the evaluation value storage unit 98 as a personal characteristic of the subject 10. The model processing unit 97 may generate the prediction model in the form of a regression model having one of the reproducibility evaluation value and the psychological evaluation value as the explanatory variable and the other as the target variable. The model processing unit 97 can estimate, based on the prediction model corresponding to the subject 10, the psychological evaluation value corresponding to a reproducibility evaluation value indicating a state of good reproducibility.
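A regression-model form of the prediction model could look like the following sketch (the data values and the choice of a first-degree least-squares fit are assumptions for illustration):

```python
import numpy as np

# Hypothetical paired observations for one subject.
repro_values = np.array([10.0, 20.0, 30.0, 40.0])   # explanatory variable
psych_values = np.array([1.0, 2.0, 3.0, 4.0])       # target variable

# Fit a simple linear regression as one possible prediction model.
slope, intercept = np.polyfit(repro_values, psych_values, deg=1)

def predict_psychological(repro_value):
    # Estimate the psychological evaluation value for a reproducibility value.
    return slope * repro_value + intercept
```

The fitted model can then be queried with a reproducibility evaluation value that indicates a state of good reproducibility.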
As information on each subject 10, the evaluation value storage unit 98 stores the reproducibility evaluation value and the psychological evaluation value in association with each other. As information on each subject 10, the evaluation value storage unit 98 further stores the prediction model generated based on the reproducibility evaluation value and the psychological evaluation value.
The result output unit 99 outputs information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value. The result output unit 99 visualizes this correlation by, for example, displaying side by side the reproducibility evaluation value when the psychological evaluation value is low and the reproducibility evaluation value when it is high. The result output unit 99 also compares the reproducibility evaluation values obtained with and without a disturbance stimulus and displays them side by side, thereby visualizing the correlation between the quality of the reproducibility evaluation value and the presence or absence of the disturbance stimulus. The result output unit 99 further outputs the result of predicting, based on the prediction model, the psychological evaluation value corresponding to a reproducibility evaluation value indicating a state of good reproducibility.
Fig. 3 illustrates a process of estimating the positions of anatomical feature points from an image obtained when the subject performs a putting motion of golf. The subject 10 performing the putting motion is displayed in the image 110. By image processing performed on the image 110, the positions of the main joints and other anatomical feature points are estimated as three-dimensional coordinates from the image portion of the body of the subject 10. In Fig. 3, the coordinates of the estimated feature points are represented by a plurality of circles 112. The skeleton of the subject 10 is shown as a so-called stick figure by connecting the plurality of circles 112 representing the feature points with bold lines 114. The positions of the circles 112 and the bold lines 114 follow the movement of the feature points according to the motion of the subject 10 in the image 110. The reproducibility evaluation unit 94 estimates the position and movement of such feature points, and selects motion feature amounts such as the position, trajectory, and movement speed of the feature points based on the temporal change of the estimated feature points.
Fig. 4(a) and Fig. 4(b) show examples of the positions of a plurality of feature points, in two-dimensional coordinates, when the same putting motion is repeated three times. Fig. 4(a) is an image captured from the front of the subject 10, and Fig. 4(b) is an image captured from the left side of the subject 10. From these images, a stick figure 116 is drawn by connecting with bold lines 114 the plurality of circles 112 that are the anatomical feature points estimated by a known motion analysis technique. The stick figure 116 is arranged such that the center of the pelvis is located at the origin of the horizontal and vertical axes, and each feature point is represented by its position relative to the origin.
The dot group surrounded by the broken line 120 indicates the positions of the feature point of the face of the subject 10 across the repeated putting motions, and the distribution range of the dot group indicates the range over which the feature point moved. Similarly, the dot group surrounded by the broken line 121 indicates the position and movement of the right elbow of the subject 10, and the dot group surrounded by the broken line 122 indicates the position and movement of the left elbow. The dot group surrounded by the broken line 123 indicates the position and movement of the right wrist of the subject 10, and the dot group surrounded by the broken line 124 indicates the position and movement of the left wrist.
Fig. 5(a) and Fig. 5(b) are graphs showing the time history characteristics of a motion feature amount when the same putting motion is repeated three times. The graphs have the feature amount on the vertical axis and time on the horizontal axis. The motion feature amount mentioned here is not limited to the positional information of the anatomical feature points shown in Fig. 3, Fig. 4(a), and Fig. 4(b); it may be any other value as long as it is information indicating the motion, for example, the frequency or volume obtained by sound analysis, the pupil diameter or line-of-sight trajectory obtained by analyzing line-of-sight information, or the amplitude intensity of the motion velocity, motion acceleration, or stroke.
In the example of Fig. 5(a), motion reproducibility is evaluated by averaging the difference from the average data in each motion. Line 131 represents the feature amount in the first motion, line 132 the feature amount in the second motion, line 133 the feature amount in the third motion, and line 130 the average value of the motion feature amounts. The lines 131 to 133 representing the three motions are aligned on the time axis with reference to a predetermined timing, for example, the moment of impact of the putt.
The average value for one motion is calculated by averaging over time the absolute value of the difference between the feature amount at each time in that motion, indicated by lines 131 to 133, and the feature amount at each time in the average data, indicated by line 130. These per-motion averages are further averaged over the three motions, and the resulting value is used as the evaluation value of motion reproducibility. The smaller the evaluation value, the higher the motion reproducibility; the larger the evaluation value, the lower the motion reproducibility.
The smaller the deviation among the three motions, the smaller the evaluation value, and lines 131, 132, and 133 converge toward line 130 representing the average value, indicating high motion reproducibility. The larger the deviation among the three motions, the larger the evaluation value, and lines 131, 132, and 133 do not converge toward line 130 but are separated from one another, indicating low motion reproducibility.
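The averaging procedure of Fig. 5(a) can be expressed compactly as follows (a minimal sketch assuming the trials are already aligned at the reference timing and sampled at common time steps):

```python
import numpy as np

def reproducibility_by_mean_deviation(trials):
    # trials: shape (n_trials, n_timesteps), one feature-amount curve per motion.
    trials = np.asarray(trials, dtype=float)
    mean_curve = trials.mean(axis=0)                       # the average data (line 130)
    per_trial = np.abs(trials - mean_curve).mean(axis=1)   # time-averaged |difference|
    return float(per_trial.mean())                         # averaged over all motions
```

Identical curves yield 0 (highest reproducibility); the value grows as the curves spread apart.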
In addition, even within one body motion, there are cases where a portion with high motion reproducibility and a portion with low motion reproducibility are locally included over the passage of time. For example, on the time axis, portions where lines 131, 132, and 133 are narrowly spaced represent motion segments with relatively high reproducibility, and portions where they are widely spaced represent motion segments with relatively low reproducibility.
In the example of Fig. 5(b), motion reproducibility is evaluated by integrating the spread of the motion feature amounts over the time course. Line 131 represents the feature amount in the first motion, line 132 in the second motion, and line 133 in the third motion. In Fig. 5(b), hatching is applied to the range from the minimum to the maximum feature amount at each time indicated by lines 131 to 133. The larger the area of the hatched region, the lower the motion reproducibility; the smaller the area, the higher the motion reproducibility. The area of the hatched region is obtained by numerically integrating over time the difference between the maximum and minimum feature amounts at each time k, and is used as the evaluation value of motion reproducibility. The smaller the evaluation value, the higher the motion reproducibility; the larger the evaluation value, the lower the motion reproducibility.
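The area of the hatched region in Fig. 5(b) could be computed as follows (a sketch assuming aligned, uniformly sampled curves; trapezoidal integration is one possible choice of numerical integration):

```python
import numpy as np

def reproducibility_by_envelope_area(trials, dt=1.0):
    # trials: shape (n_trials, n_timesteps); the spread between the maximum and
    # minimum feature amount at each time is integrated over time.
    trials = np.asarray(trials, dtype=float)
    spread = trials.max(axis=0) - trials.min(axis=0)
    # Trapezoidal integration of the spread (the hatched area).
    return float(np.sum((spread[:-1] + spread[1:]) / 2.0) * dt)
```

As with the averaging method, a smaller value indicates higher motion reproducibility.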
Fig. 6 shows an example of a screen that compares the motion reproducibility when the psychological state is good with the motion reproducibility when the psychological state is poor. In the left column 140, a reproducibility evaluation value such as "Reproducibility 90" is displayed as text and as a circular chart, representing the motion reproducibility when the psychological state is good. In the right column 141, a reproducibility evaluation value such as "Reproducibility 30" is displayed as text and as a circular chart, representing the motion reproducibility when the psychological state is poor. The left column 140 displays as an animation the stick figure 116 showing the motion reproducibility when the psychological state is good, and the right column 141 displays as an animation the stick figure 116 showing the motion reproducibility when the psychological state is poor. The left column 140 displays annotations of feature amounts whose evaluation values are characteristic when the psychological state is good, and the right column 141 displays annotations of feature amounts whose evaluation values are characteristic when the psychological state is poor.
(second embodiment)
The second embodiment applies the present invention to the relationship between the reproducibility of the running form and the psychological state of the subject 10, unlike the first embodiment, which applies the present invention to the relationship between the psychological state and the reproducibility of the putting motion or swing motion of golf. Hereinafter, differences from the first embodiment will be mainly described, and description of common points will be omitted.
In the example of the second embodiment, the running motion of the subject 10 is captured using the camera function of the information terminal 12, and the captured image is transmitted to the physical and mental state evaluation server 50 as information indicating the motion state. A measurement device 20 such as a smart watch or a pedometer, which acquires biological information indicating the psychological state such as heart rate, blood oxygen concentration, or pupil response, or information indicating the motion state such as the rhythm or speed of arm swing, is worn on, for example, the wrist or foot of the subject 10.
Fig. 7 illustrates a process of estimating the positions of anatomical feature points from an image obtained when the subject performs a running motion. The subject 10 performing the running motion is displayed in the image 110. By image processing performed on the image 110, the positions of the main joints and other anatomical feature points are estimated as three-dimensional coordinates from the image portion of the body of the subject 10. In Fig. 7, the coordinates of the estimated feature points are represented by a plurality of circles 112. The skeleton of the subject 10 is shown as a so-called stick figure by connecting the plurality of circles 112 representing the feature points with bold lines 114. The positions of the circles 112 and the bold lines 114 follow the movement of the feature points according to the motion of the subject 10 in the image 110.
Reference is made again to Fig. 2. The measurement device 20 may be a wearable device such as a smart watch worn on the wrist of the subject, smart glasses worn over the eyes like eyeglasses or sunglasses, or a pedometer worn on the ankle or shoes. When the measurement device 20 is worn on the ankle or shoes, the motion detection unit 26 or the motion acquisition unit 76 acquires information such as the acceleration, pitch, and stride of the foot detected by a motion sensor such as a nine-axis sensor. The moving image acquisition unit 32 or the moving image acquisition unit 72 acquires an image obtained when the subject 10 performs the running motion. The sound acquisition unit 34 or the sound acquisition unit 74 acquires breathing sounds, landing sounds, and the like while the subject 10 performs the running motion.
The reproducibility evaluation unit 94 analyzes the reproducibility of the running form, pitch, stride, and the like based on the moving image of the running motion of the subject 10 or the motion information acquired by the motion detection unit 26 or the motion acquisition unit 76, and determines a reproducibility evaluation value based on the analysis result. The reproducibility evaluation unit 94 selects a predetermined motion feature amount as a predetermined parameter related to reproducibility based on the information indicating the motion state, and quantifies the reproducibility by the degree of coincidence of the motion feature amount with the lapse of time. The information indicating the motion state mentioned here is the moving image of the running motion of the subject 10 or the motion information acquired by the motion detection unit 26 or the motion acquisition unit 76. The reproducibility evaluation unit 94 estimates three-dimensional coordinates of anatomical feature points from the image of the subject 10 included in the moving image. The anatomical feature points mentioned here include not only body parts such as the joints of the subject 10 but also feature points of the shoes worn by the subject 10. The reproducibility evaluation unit 94 estimates the position and movement of the feature points as shown in Fig. 7, and selects motion feature amounts such as the position, trajectory, and movement speed of the feature points based on the temporal change of the estimated feature points.
The feature amounts that the reproducibility evaluation unit 94 selects as time history characteristics include, for example, the position and trajectory of a feature point, the movement speed history, the joint angle, joint angular velocity, joint angular acceleration, upper limb movement speed, upper limb movement trajectory, the time history of the trunk posture, the pitch, the stride, and the face orientation. Other feature amounts that are time history characteristics include the rhythm of arm swing or foot strike obtained from breathing sounds or landing sounds, the face orientation obtained from the line of sight, the trajectory of the line of sight, the pupil diameter, and the swing speed of the arms or feet, upper limb movement speed, upper limb movement trajectory, and running speed detected by a motion sensor. The running speed may be calculated based on the pitch and the stride, or may be calculated based on a travel history recording position information received by the measurement device 20 from a satellite positioning system such as the Global Positioning System (GPS). Examples of feature amounts that are instantaneous characteristics occurring at a specific moment in the body motion include the maximum swing speed of the arms or feet, the landing impact, the range of motion, the breathing sound, the landing sound, and the push-off intensity.
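The pitch-and-stride calculation of running speed mentioned above reduces to a one-line formula (units assumed here: steps per minute and meters per step):

```python
def running_speed_m_per_s(pitch_steps_per_min, stride_m):
    # Speed = steps per second times distance per step.
    return pitch_steps_per_min / 60.0 * stride_m
```

For example, a pitch of 180 steps/min with a 1.2 m stride corresponds to 3.6 m/s.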
In the second embodiment as well, the correlation between the quality of the psychological state and the quality of the motion reproducibility may be visualized, and reproducibility evaluation values corresponding to various conditions with different psychological states may be displayed for comparison. Therefore, the subject 10 can, for example, search by himself or herself for clues to improving the psychological state when running performance has declined.
(third embodiment)
The third embodiment applies the present invention to the correlation between the reproducibility of tricks on a skateboard and the psychological state of the subject 10, unlike the first and second embodiments, in which the present invention is applied to the correlation between the psychological state and the reproducibility of the putting motion or swing motion of golf or of the running motion. Hereinafter, differences from the first and second embodiments will be mainly described, and description of common points will be omitted.
In the example of the third embodiment, a trial in which the subject 10 performs a skill move called a skateboard trick is captured using the camera function of the information terminal 12, and the captured image is transmitted to the physical and mental state evaluation server 50 as information indicating the motion state. A measurement device 20 such as a smart watch, which acquires biological information indicating the psychological state such as heart rate, blood oxygen concentration, or pupil response, or information indicating the motion state such as rotation angle, height, speed, and airborne time, is worn on the wrist, helmet, ankle, skateboard, or the like of the subject 10.
Fig. 8 illustrates a process of estimating the positions of anatomical feature points from an image obtained when the subject performs a trial of a trick. The subject 10 performing the trick is displayed in the image 110. By image processing performed on the image 110, the positions of the main joints and other anatomical feature points are estimated as three-dimensional coordinates from the image portion of the body of the subject 10. In Fig. 8, the coordinates of the estimated feature points are represented by a plurality of circles 112. The skeleton of the subject 10 is shown as a so-called stick figure by connecting the plurality of circles 112 representing the feature points with bold lines 114. The positions of the circles 112 and the bold lines 114 follow the movement of the feature points according to the motion of the subject 10 in the image 110.
Reference is made again to Fig. 2. The measurement device 20 may be a wearable device such as a smart watch worn on the wrist of the subject, smart glasses worn over the eyes like eyeglasses or sunglasses, or a sensor worn on a helmet, knee pad, ankle, skateboard, or the like. When the measurement device 20 is worn on a helmet, knee pad, ankle, skateboard, or the like, the motion detection unit 26 or the motion acquisition unit 76 acquires information such as the rotation angle, height, speed, and airborne time detected by a motion sensor such as a nine-axis sensor. The moving image acquisition unit 32 or the moving image acquisition unit 72 acquires an image obtained when the subject 10 performs a trial of a trick. The sound acquisition unit 34 or the sound acquisition unit 74 acquires breathing sounds, riding sounds, landing sounds, and the like while the subject 10 performs the trick.
The reproducibility evaluation unit 94 analyzes the reproducibility of the trick based on the degree of deviation of the posture or motion from that of the trick in a specific trial, using the moving image of the trick performed by the subject 10 or the motion information acquired by the motion detection unit 26 or the motion acquisition unit 76, and determines a reproducibility evaluation value based on the analysis result. The specific trial mentioned here may be, for example, a moving image or motion information of a past trial performed by the subject 10 himself or herself. Alternatively, instead of a trial by the subject 10, for example, a moving image or motion information of a past trial performed by another person such as an Olympic athlete, or a moving image or motion information of a stick-figure motion prepared as a model, may be set manually.
The reproducibility evaluation unit 94 selects a predetermined motion feature amount as a predetermined parameter related to reproducibility based on the information indicating the motion state, and quantifies the reproducibility by the degree of coincidence of the motion feature amount with the lapse of time. A high degree of reproducibility of a trick can also be regarded as a high success rate of the trick. The information indicating the motion state mentioned here is the moving image of the trial by the subject 10 or the motion information acquired by the motion detection unit 26 or the motion acquisition unit 76. The reproducibility evaluation unit 94 estimates three-dimensional coordinates of anatomical feature points from the image of the subject 10 included in the moving image. The anatomical feature points mentioned here include not only body parts such as the joints of the subject 10 but also feature points of equipment such as the shoes worn by the subject 10 or the skateboard. The reproducibility evaluation unit 94 estimates the position and movement of the feature points as shown in Fig. 8, and selects motion feature amounts such as the position, trajectory, and movement speed of the feature points based on the temporal change of the estimated feature points.
The feature amounts that the reproducibility evaluation unit 94 selects as time history characteristics include, for example, the position and trajectory of a feature point, the movement speed, the movement direction, the body orientation, the posture, the rotation angle, the distance from the section, the height, the airborne time, the joint angle, joint angular velocity, joint angular acceleration, upper limb movement speed, upper limb movement trajectory, the time history of the trunk posture, and the face orientation. Other feature amounts that are time history characteristics include the timing of the trick obtained from riding sounds or landing sounds, the part of the skateboard in contact with the section, the face orientation obtained from the line of sight, the trajectory of the line of sight, the pupil diameter, and the direction and acceleration of the trick motion, the movement speed, and the upper limb movement trajectory detected by the motion sensor. Examples of feature amounts that are instantaneous characteristics occurring at a specific moment in the body motion include the speed or intensity of the jump, the landing impact, and the range of motion.
In the third embodiment as well, the correlation between the quality of the psychological state and the quality of the motion reproducibility may be visualized, and reproducibility evaluation values corresponding to various conditions with different psychological states may be displayed for comparison. Therefore, the subject 10 can, for example, search by himself or herself for clues to improving the psychological state when the success rate of a trick has declined.
The present invention is not limited to the above-described embodiments, and each structure may be appropriately modified within a range not departing from the gist of the present invention.
In the above embodiments, examples were described in which the invention is applied to the correlation between the psychological state and the reproducibility of the putting motion or swing motion of golf, the running motion, and skateboard tricks. In modified examples, the present invention can also be applied to the reproducibility of other body motions, for example, a baseball swing or a soccer kick, shooting motions in ball games such as basketball, motions in martial arts, scored sports such as figure skating, dance, and gymnastics, and athletics events such as jumping or throwing, as well as to the reproducibility of musical instrument performance such as playing the piano.
The following modes can be obtained by generalizing the above embodiments.
[Form 1]
A physical and mental state evaluation system, comprising:
a reproducibility evaluation unit that determines a reproducibility evaluation value based on an analysis result regarding reproducibility of a predetermined physical motion performed by the subject;
a psychological assessment unit that determines a psychological assessment value based on an analysis result concerning the psychological state of the subject;
an evaluation value storage unit that stores the reproducibility evaluation value in association with the psychological evaluation value; and a result output unit that outputs information indicating the relevance of the quality of the reproducibility evaluation value to the psychological evaluation value.
[Form 2]
The physical and mental state evaluation system according to Form 1, further comprising:
an operation information acquisition unit that acquires information indicating an operation state of the subject in which the predetermined body operation is repeatedly performed; and
a psychological information acquisition unit that acquires information indicating a psychological state of the subject,
the reproducibility evaluation unit performs analysis of quantifying the reproducibility by a predetermined parameter related to the reproducibility of the predetermined physical movement based on the information indicating the movement state, and decides the reproducibility evaluation value based on the analysis result,
The psychological assessment unit performs analysis for quantifying the psychological state based on information indicating the psychological state by a predetermined parameter related to the tendency of the psychological state, and decides the psychological assessment value based on the analysis result.
[Aspect 3]
The physical and mental state evaluation system according to Aspect 2, wherein the reproducibility evaluation unit selects a predetermined motion feature value as the predetermined parameter related to the reproducibility, based on the information indicating the motion state, and quantifies the reproducibility by the degree to which the motion feature value remains consistent over time.
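As a concrete illustration of quantifying reproducibility by the consistency of a motion feature value over repeated trials, the following Python sketch scores a series of repetitions by the coefficient of variation of one feature (here, an assumed peak speed per putt). The feature choice and the 1/(1 + CV) mapping are illustrative assumptions, not a formula specified in the embodiments.

```python
from statistics import mean, pstdev

def reproducibility_score(feature_values):
    """Score how consistently a selected motion feature repeats across trials.

    A lower coefficient of variation (std / |mean|) means the repeated
    motions agree more closely; mapping it through 1 / (1 + CV) yields a
    score in (0, 1] that approaches 1 for perfectly reproducible motion.
    """
    cv = pstdev(feature_values) / abs(mean(feature_values))
    return 1.0 / (1.0 + cv)

# Five putts with nearly identical peak speeds (m/s) score higher
# than five putts with scattered speeds.
consistent = reproducibility_score([2.0, 2.1, 2.0, 1.9, 2.0])
scattered = reproducibility_score([2.0, 2.8, 1.4, 2.5, 1.6])
```

Any feature the motion information acquisition unit can extract (joint angle, stride time, release velocity) could stand in for the peak-speed values above.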
[Aspect 4]
The physical and mental state evaluation system according to Aspect 2 or 3, wherein the psychological evaluation unit quantifies the psychological state based on at least one of a tendency of change in predetermined biological information of the subject, a tendency in the subject's answers to questions related to psychological information, and a result of a cognitive task that evaluates the subject's attention or concentration, as the predetermined parameter related to the tendency of the psychological state.
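The three indicator types named above could, for example, be combined into a single psychological evaluation value as in the sketch below. The specific indicators (heart-rate elevation over a resting baseline, a questionnaire score normalized to [0, 1], cognitive-task accuracy) and the weights are assumptions for illustration; the aspect only requires that at least one such tendency be quantified.

```python
def psychological_evaluation_value(resting_hr, current_hr,
                                   questionnaire_score, cognitive_accuracy,
                                   weights=(0.4, 0.3, 0.3)):
    """Combine three indicators of psychological state into one value in [0, 1]:
    biological information (heart rate), questionnaire answers, and a
    cognitive task probing attention/concentration.
    """
    # Calmness decreases as heart rate rises above the resting baseline,
    # clamped to [0, 1].
    calmness = min(1.0, max(0.0, 1.0 - (current_hr - resting_hr) / resting_hr))
    w_bio, w_q, w_cog = weights
    return w_bio * calmness + w_q * questionnaire_score + w_cog * cognitive_accuracy

# Slightly elevated heart rate, good questionnaire and task results.
value = psychological_evaluation_value(60, 66, 0.8, 0.9)
```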
[Aspect 5]
The physical and mental state evaluation system according to any one of Aspects 2 to 4, wherein the psychological information acquisition unit further acquires information indicating the degree of a disturbance stimulus that may affect the psychological state of the subject, and
the psychological evaluation unit quantifies the psychological state using the predetermined parameter related to the tendency of the psychological state in accordance with the degree of the disturbance stimulus.
[Aspect 6]
The physical and mental state evaluation system according to any one of Aspects 1 to 5, wherein the evaluation value storage unit further stores a prediction model generated based on the reproducibility evaluation value and the psychological evaluation value, and
the result output unit further outputs, based on the prediction model, a result of predicting the psychological evaluation value corresponding to a reproducibility evaluation value that indicates a state of good reproducibility.
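One minimal form such a prediction model could take is a least-squares line fitted over the stored (reproducibility, psychological) evaluation-value pairs, then evaluated at a reproducibility value deemed "good". The linear model and the 0.95 threshold are assumptions for illustration; the text only requires that some prediction model be generated from the stored pairs.

```python
def fit_prediction_model(reproducibility_values, psychological_values):
    """Fit a least-squares line psych ≈ a * repro + b over the stored pairs."""
    n = len(reproducibility_values)
    mx = sum(reproducibility_values) / n
    my = sum(psychological_values) / n
    cov = sum((x - mx) * (y - my)
              for x, y in zip(reproducibility_values, psychological_values))
    var = sum((x - mx) ** 2 for x in reproducibility_values)
    a = cov / var
    return a, my - a * mx

def predict_for_good_reproducibility(model, good_reproducibility=0.95):
    """Predict the psychological evaluation value expected at a
    reproducibility value regarded as good (0.95 is an assumed threshold)."""
    a, b = model
    return a * good_reproducibility + b

# Synthetic stored pairs: psychological value rises with reproducibility.
model = fit_prediction_model([0.5, 0.6, 0.7, 0.8], [0.4, 0.5, 0.6, 0.7])
predicted = predict_for_good_reproducibility(model)
```

In practice the evaluation value storage unit would supply the pairs, and any regression or machine-learning model could replace the line fit.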
[Aspect 7]
A physical and mental state evaluation method, comprising:
a process of determining a reproducibility evaluation value based on an analysis result regarding the reproducibility of a predetermined physical motion performed by a subject;
a process of determining a psychological evaluation value based on an analysis result regarding the psychological state of the subject;
a process of recording the reproducibility evaluation value in association with the psychological evaluation value; and
a process of outputting information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value.

Claims (7)

1. A physical and mental state evaluation system, comprising:
a reproducibility evaluation unit that determines a reproducibility evaluation value based on an analysis result regarding the reproducibility of a predetermined physical motion performed by a subject;
a psychological evaluation unit that determines a psychological evaluation value based on an analysis result regarding the psychological state of the subject;
an evaluation value storage unit that stores the reproducibility evaluation value in association with the psychological evaluation value; and
a result output unit that outputs information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value.
2. The physical and mental state evaluation system according to claim 1, further comprising:
a motion information acquisition unit that acquires information indicating the motion state of the subject repeatedly performing the predetermined physical motion; and
a psychological information acquisition unit that acquires information indicating the psychological state of the subject,
wherein the reproducibility evaluation unit performs an analysis that quantifies the reproducibility using a predetermined parameter related to the reproducibility of the predetermined physical motion, based on the information indicating the motion state, and determines the reproducibility evaluation value based on the analysis result, and
the psychological evaluation unit performs an analysis that quantifies the psychological state using a predetermined parameter related to a tendency of the psychological state, based on the information indicating the psychological state, and determines the psychological evaluation value based on the analysis result.
3. The physical and mental state evaluation system according to claim 2, wherein the reproducibility evaluation unit selects a predetermined motion feature value as the predetermined parameter related to the reproducibility, based on the information indicating the motion state, and quantifies the reproducibility by the degree to which the motion feature value remains consistent over time.
4. The physical and mental state evaluation system according to claim 2, wherein the psychological evaluation unit quantifies the psychological state based on at least one of a tendency of change in predetermined biological information of the subject, a tendency in the subject's answers to questions related to psychological information, and a result of a cognitive task that evaluates the subject's attention or concentration, as the predetermined parameter related to the tendency of the psychological state.
5. The physical and mental state evaluation system according to claim 2, wherein the psychological information acquisition unit further acquires information indicating the degree of a disturbance stimulus that may affect the psychological state of the subject, and
the psychological evaluation unit quantifies the psychological state using the predetermined parameter related to the tendency of the psychological state in accordance with the degree of the disturbance stimulus.
6. The physical and mental state evaluation system according to claim 1 or 2, wherein the evaluation value storage unit further stores a prediction model generated based on the reproducibility evaluation value and the psychological evaluation value, and
the result output unit further outputs, based on the prediction model, a result of predicting the psychological evaluation value corresponding to a reproducibility evaluation value that indicates a state of good reproducibility.
7. A physical and mental state evaluation method, comprising:
a process of determining a reproducibility evaluation value based on an analysis result regarding the reproducibility of a predetermined physical motion performed by a subject;
a process of determining a psychological evaluation value based on an analysis result regarding the psychological state of the subject;
a process of recording the reproducibility evaluation value in association with the psychological evaluation value; and
a process of outputting information indicating the correlation between the quality of the reproducibility evaluation value and the psychological evaluation value.
CN202311078827.XA 2022-08-31 2023-08-25 Physical and mental state evaluation system and physical and mental state evaluation method Pending CN117617971A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2022-137591 2022-08-31
JP2023110567A JP2024035087A (en) 2022-08-31 2023-07-05 Mental and physical condition evaluation system and mental and physical condition evaluation method
JP2023-110567 2023-07-05

Publications (1)

Publication Number Publication Date
CN117617971A true CN117617971A (en) 2024-03-01

Family

ID=90030983

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311078827.XA Pending CN117617971A (en) 2022-08-31 2023-08-25 Physical and mental state evaluation system and physical and mental state evaluation method

Country Status (1)

Country Link
CN (1) CN117617971A (en)

Similar Documents

Publication Publication Date Title
US11832971B2 (en) Wearable device utilizing flexible electronics
KR101687252B1 (en) Management system and the method for customized personal training
EP2451339B1 (en) Performance testing and/or training
Kranz et al. The mobile fitness coach: Towards individualized skill assessment using personalized mobile devices
WO2019114708A1 (en) Motion data monitoring method and system
JP2019000653A (en) Calculating pace and energy expenditure from athletic movement attributes
KR101963682B1 (en) Data management system for physical measurement data by performing sports contents based on augmented reality
US20130171596A1 (en) Augmented reality neurological evaluation method
CN107205661B (en) Energy consumption calculation using data from multiple devices
KR20160045833A (en) Energy expenditure device
Saponara Wearable biometric performance measurement system for combat sports
Ma et al. Basketball movements recognition using a wrist wearable inertial measurement unit
JP6560354B2 (en) Energy consumption calculation using data from multiple devices
JP2017000481A (en) Analysis system and analysis method
KR20190089568A (en) Taekwondo poomsae evaluation system using motion sensing technics based on wearable device
US20230355135A1 (en) Intelligent gait analyzing apparatus
CN117617971A (en) Physical and mental state evaluation system and physical and mental state evaluation method
EP4331484A1 (en) Mental/physical state evaluation system and mental/physical state evaluation method
JPWO2018179664A1 (en) Information processing apparatus, information processing method and program
JP2024035087A (en) Mental and physical condition evaluation system and mental and physical condition evaluation method
JP2005102773A (en) Student behavior management system
Lam Using Accelerometers to Score a Multi-Domain Return-to-Play Assessment for Youth Post-Concussion
US20230178233A1 (en) Biomechanics assessment system and biomechanical sensing device and biomechanical assessment platform thereof
US20220370853A1 (en) J-sleeve system
Sharma et al. Scope of Embedding Supervised learning Techniques on Sensors data in Sports Equipments

Legal Events

Date Code Title Description
PB01 Publication