WO2017130339A1 - Information processing method, information processing device, and program - Google Patents


Info

Publication number
WO2017130339A1
Authority
WO
WIPO (PCT)
Prior art keywords
training
predetermined
threshold
unit
signal
Application number
PCT/JP2016/052397
Other languages
French (fr)
Japanese (ja)
Inventor
晋 中村
Original Assignee
株式会社ジェイアイエヌ
Application filed by 株式会社ジェイアイエヌ
Priority to PCT/JP2016/052397 (WO2017130339A1)
Priority to JP2017563464A (JP6689889B2)
Publication of WO2017130339A1


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00: Training appliances or apparatus for special sports

Definitions

  • the present invention relates to an information processing method, an information processing apparatus, and a program.
  • since the conventional technique basically calculates the amount of exercise, it is not assumed to be applied to training that maintains a predetermined posture for a predetermined time, such as trunk training.
  • in such training, it is preferable to evaluate the stability of the posture, but there is no conventional technique for evaluating posture stability in trunk training.
  • the disclosed technique therefore aims to appropriately evaluate the user's posture in training that maintains a predetermined posture for a predetermined time.
  • An information processing method according to the disclosure is executed by a computer having a control unit and a storage unit. The control unit receives information indicating a predetermined training related to the trunk; specifies a threshold for the predetermined training based on threshold information that is stored in the storage unit and that associates a threshold with each trunk-related training; obtains a signal value detected by a sensor attached to the user to be evaluated; and evaluates the posture of the user during the predetermined training based on the result of comparing the signal value with the threshold.
  • FIG. 1 is a diagram illustrating an example of an information processing system 1 in the embodiment.
  • the information processing system 1 shown in FIG. 1 includes an external device 10 and eyewear 30, and the external device 10 and eyewear 30 are connected via a network so that data communication is possible.
  • the eyewear 30 mounts the processing device 20 on a temple portion, for example.
  • the processing device 20 includes a three-axis acceleration sensor and a three-axis angular velocity sensor (may be a six-axis sensor).
  • the eyewear 30 may have bioelectrodes 31, 33, and 35 at the pair of nose pads and the bridge portion, respectively.
  • electrooculogram signals acquired from these biological electrodes are transmitted to the processing device 20.
  • the installation position of the processing device 20 is not limited to a temple; it may be chosen in consideration of the weight balance when the eyewear 30 is worn.
  • the external device 10 is an information processing device having a communication function.
  • the external device 10 is a mobile communication terminal such as a mobile phone or a smartphone possessed by a user, a PC (Personal Computer), a tablet terminal, or the like.
  • the external device 10 evaluates the posture during training related to the trunk (hereinafter also referred to as trunk training) based on the sensor signal received from the processing device 20.
  • the external device 10 will be described as the information processing device 10.
  • FIG. 2 is a schematic configuration diagram illustrating a hardware configuration of the information processing apparatus 10 according to the embodiment.
  • a typical example of the information processing apparatus 10 is a mobile phone such as a smartphone.
  • the information processing apparatus 10 may also be a mobile terminal that can connect to a network wirelessly or by wire, or an electronic device including a touch panel, such as a tablet terminal.
  • in general, any general-purpose device that can communicate over a network, process data, and display a screen may serve as the information processing apparatus 10 in the embodiment.
  • the information processing apparatus 10 includes, for example, a rectangular thin housing (not shown), and a touch panel 102 is configured on one surface of the housing.
  • each component is connected to the main control unit 150.
  • the main control unit 150 is a processor, for example.
  • connected to the main control unit 150 are a mobile communication antenna 112, a mobile communication unit 114, a wireless LAN communication antenna 116, a wireless LAN communication unit 118, a storage unit 120, a speaker 104, a microphone 106, a hard button 108, a hard key 110, and a 6-axis sensor 111.
  • the main control unit 150 is further connected with a touch panel 102, a camera 130, and an external interface 140.
  • the external interface 140 includes an audio output terminal 142.
  • the touch panel 102 has functions of both a display device and an input device, and includes a display (display screen) 102A that bears the display function and a touch sensor 102B that bears the input function.
  • the display 102A is configured by a general display device such as a liquid crystal display or an organic EL (Electro Luminescence) display.
  • the touch sensor 102B includes an element for detecting a contact operation disposed on the upper surface of the display 102A and a transparent operation surface stacked on the element.
  • a contact detection method of the touch sensor 102B an arbitrary method among known methods such as a capacitance method, a resistance film method (pressure-sensitive method), and an electromagnetic induction method can be adopted.
  • the touch panel 102 as a display device displays an image of an application (hereinafter also referred to as an application) generated by the execution of the program 122 by the main control unit 150.
  • the touch panel 102 as an input device detects the action of a contact object (including the user's finger, a stylus, and the like; hereinafter a finger is described as a representative example), receives the operation input, and passes information on the contact position to the main control unit 150.
  • the movement of the finger is detected as coordinate information indicating the position or area of the contact point, and the coordinate information is represented as coordinate values on two axes of the short side direction and the long side direction of the touch panel 102, for example.
  • the information processing apparatus 10 is connected to the network N through the mobile communication antenna 112 and the wireless LAN communication antenna 116, and can perform data communication with the processing apparatus 20.
  • the storage unit 120 stores the program 122; the storage unit 120 may be separate from the information processing apparatus 10 and may be a recording medium such as an SD card or a CD-ROM.
  • FIG. 3 is a block diagram illustrating an example of the configuration of the processing device 20 according to the embodiment.
  • the processing device 20 includes a processing unit 202, a transmission unit 204, a 6-axis sensor 206, and a power supply unit 208.
  • each of the bioelectrodes 31, 33, and 35 is connected to the processing unit 202 using an electric wire via an amplification unit, for example.
  • each part of the processing apparatus 20 may be distributed in a pair of temples instead of being provided in one temple.
  • the 6-axis sensor 206 is a 3-axis acceleration sensor and a 3-axis angular velocity sensor. Each of these sensors may be provided separately.
  • the 6-axis sensor 206 outputs the detected sensor signal to the processing unit 202.
  • the processing unit 202 is, for example, a processor, processes the sensor signal obtained from the 6-axis sensor 206 as necessary, and outputs the processed signal to the transmission unit 204. For example, the processing unit 202 generates the following six signals using the sensor signals from the six-axis sensor 206, and acquires the signal value of each signal.
  • First signal: signal indicating acceleration in the X-axis direction
  • Second signal: signal indicating acceleration in the Y-axis direction
  • Third signal: signal indicating acceleration in the Z-axis direction
  • Fourth signal: signal indicating the pitch angle
  • Fifth signal: signal indicating the roll angle
  • Sixth signal: signal indicating the yaw angle
  • the first signal is a signal indicating acceleration in the longitudinal direction of the head
  • the second signal is a signal indicating acceleration in the lateral direction of the head
  • the third signal is a signal indicating acceleration in the vertical direction of the head.
  • the fourth signal (pitch angle) indicates, for example, the vertical tilt of the head
  • the fifth signal (roll angle) indicates, for example, rotation around the head's front-back axis
  • the sixth signal (yaw angle) indicates, for example, the left-right rotation of the head.
  • the fourth to sixth signals may be calculated using a known technique.
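As a concrete example of such a known technique, pitch and roll can be estimated from a static 3-axis acceleration sample using gravity as the vertical reference. The axis convention and the function name below are illustrative assumptions, not taken from this disclosure; note that yaw cannot be recovered from acceleration alone and would require the angular velocity data.

```python
import math

def pitch_roll_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate pitch and roll angles (degrees) from one static 3-axis
    acceleration sample, using gravity as the vertical reference.
    Axis convention (assumed): X front-back, Y left-right, Z up-down."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # forward-backward tilt
    roll = math.degrees(math.atan2(ay, az))                    # sideways tilt
    return pitch, roll

# A level head puts gravity entirely on the Z axis: no tilt on either angle.
print(pitch_roll_from_accel(0.0, 0.0, 1.0))
```

This static estimate only holds while the head is nearly motionless; during movement, sensor fusion with the gyroscope signals would be needed.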
  • the processing unit 202 may only amplify the sensor signal obtained from the 6-axis sensor 206.
  • the transmission unit 204 transmits the signal values of the first through sixth signals processed by the processing unit 202 to the information processing apparatus 10.
  • the transmission unit 204 transmits the sensor signal or each data to the information processing apparatus 10 by wireless communication such as Bluetooth (registered trademark) and wireless LAN, or wired communication.
  • the power supply unit 208 supplies power to the processing unit 202, the transmission unit 204, the 6-axis sensor 206, and the like.
  • FIG. 4 is a diagram illustrating an example of the configuration of the information processing apparatus 10 according to the embodiment.
  • the information processing apparatus 10 includes a storage unit 302, a communication unit 304, and a control unit 306.
  • the storage unit 302 can be realized by, for example, the storage unit 120 shown in FIG.
  • the storage unit 302 stores data related to the trunk training application in the embodiment, for example, data received from the processing device 20, training information about each training, threshold information used for determining the user's posture during training, purpose information about the purpose of training, result information of the posture determination, and screen information displayed on the screen.
  • the communication unit 304 can be realized by the mobile communication unit 114, the wireless LAN communication unit 118, and the like, for example.
  • the communication unit 304 receives data from the processing device 20, for example.
  • the communication unit 304 may transmit data processed in the information processing apparatus 10 to a server (not shown). That is, the communication unit 304 functions as a transmission unit and a reception unit.
  • the control unit 306 can be realized by the main control unit 150, for example.
  • the control unit 306 executes a trunk training application.
  • the trunk training application in the embodiment has functions of performing guidance for trunk training, evaluating the posture of a user during training, and automatically generating a training menu.
  • the control unit 306 includes a reception unit 312, a specification unit 314, an acquisition unit 316, an evaluation unit 318, a generation unit 320, and a display control unit 322.
  • the functions of the control unit 306 will be described separately for the posture determination function and the menu generation function.
  • the reception unit 312 receives information (hereinafter, also referred to as a training ID) indicating predetermined training related to the trunk based on a user operation. For example, when the user's pressing on a plurality of trainings displayed on the display screen is detected, the receiving unit 312 receives a training ID corresponding to the pressing. In addition, the reception unit 312 may receive a training ID from another application in the internal or external device.
  • the specifying unit 314 specifies the threshold for the predetermined training received by the reception unit 312, based on threshold information that is stored in the storage unit 302 and that associates a threshold with each trunk-related training. For example, the specifying unit 314 refers to the threshold information and specifies the threshold for the predetermined training indicated by the training ID received by the reception unit 312.
  • the number of specified thresholds is not particularly limited as long as one or a plurality of thresholds corresponding to the content of training is set.
  • the threshold information is associated with a threshold of a signal to be used for each training. The threshold information will be described later with reference to FIG.
  • the acquisition unit 316 acquires a signal value detected by a sensor attached to the user who is the evaluation target and who performs the predetermined training indicated by the training ID. For example, the acquisition unit 316 directly acquires the signal value of a signal used for posture determination from a sensor attached to the user (for example, the 6-axis sensor 206), or indirectly acquires a signal value temporarily stored in the storage unit 302.
  • the evaluation unit 318 evaluates the posture of the user during the predetermined training based on the result of comparing the signal value acquired by the acquisition unit 316 with the threshold specified by the specifying unit 314. For example, the evaluation unit 318 evaluates the posture stability based on whether the absolute value of the signal value is less than the threshold. More specifically, the evaluation unit 318 determines that the posture is stable if the signal value is within the predetermined range indicated by the threshold, and that the posture is not stable otherwise. The evaluation unit 318 can thereby evaluate the stability of the posture.
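The absolute-value comparison described above can be sketched minimally as follows; the function and variable names are illustrative, not from the disclosure.

```python
def evaluate_stability(signal_values, threshold):
    """Regard the posture as stable only while every sample stays
    strictly inside the range (-threshold, +threshold)."""
    return all(abs(v) < threshold for v in signal_values)

print(evaluate_stability([0.2, -0.5, 0.1], threshold=1.0))  # True: all samples in range
print(evaluate_stability([0.2, 1.8, 0.1], threshold=1.0))   # False: 1.8 leaves the range
```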
  • the display control unit 322 controls the evaluation result of the evaluation unit 318 to be displayed on the display screen. Thereby, evaluation with respect to the posture of the user during trunk training can be fed back to the user.
  • the specifying unit 314 may use, for posture determination, threshold information in which one or more comparison target signals are associated with the thresholds of those signals. For example, the specifying unit 314 specifies the comparison target signal and its threshold for the predetermined training received by the reception unit 312 based on the threshold information.
  • the acquiring unit 316 acquires the signal value of the comparison target signal specified by the specifying unit 314 from the sensor attached to the user.
  • the acquisition unit 316 may acquire all signals (first to sixth signals) from the sensor and acquire only the specified signal from the signals.
  • the acquisition unit 316 may notify the sensor of the specified signal and acquire only the specified signal from the sensor.
  • the specifying unit 314 may use threshold information in which a peak threshold related to a signal from the sensor is further associated with each posture for posture determination. For example, the specifying unit 314 further specifies the peak threshold value of the comparison target signal for the predetermined training based on the threshold information.
  • the acquisition unit 316 may also calculate a difference value between peaks of the comparison target signals.
  • FIG. 5 is a block diagram illustrating an example of the function of the acquisition unit 316.
  • the acquisition unit 316 illustrated in FIG. 5 includes a difference value calculation unit 3162.
  • the difference value calculation unit 3162 calculates a difference value between peaks of the posture signal repeated every predetermined time. For example, when the exercise of maintaining the same posture for a predetermined time is repeated a plurality of times, the difference value calculation unit 3162 obtains the peak value within this predetermined time and successively calculates the difference between this peak value and the peak value within the next predetermined time.
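The successive peak-difference computation can be sketched as below, assuming fixed-length repetition windows; the function name and the windowing scheme are illustrative assumptions.

```python
def peak_differences(signal, window_len):
    """Split the signal into consecutive windows, one per repetition of
    the posture, take the largest-magnitude sample of each window as its
    peak, and return the difference between each peak and the next."""
    peaks = [
        max(signal[i:i + window_len], key=abs)
        for i in range(0, len(signal) - window_len + 1, window_len)
    ]
    return [later - earlier for earlier, later in zip(peaks, peaks[1:])]

# Three repetitions with peaks 2, 3, 2 give successive differences 1 and -1.
print(peak_differences([0, 2, 1, 0, 3, 1, 0, 2, 1], window_len=3))  # [1, -1]
```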
  • FIG. 6 is a block diagram illustrating an example of the function of the evaluation unit 318.
  • the evaluation unit 318 illustrated in FIG. 6 includes a first determination unit 3181, a first evaluation unit 3183, a second evaluation unit 3185, a score calculation unit 3187, and a second determination unit 3189.
  • the first determination unit 3181 determines whether or not the signal value of the signal acquired during the predetermined training is included in the predetermined range represented by the threshold value. In the embodiment, for example, since training with relatively small movement is assumed, a signal value exceeding a predetermined range indicates that the movement is large.
  • the first determination unit 3181 performs the comparison determination using the signal value of the signal corresponding to the training and the threshold value of the signal.
  • the first determination unit 3181 outputs the comparison result to the first evaluation unit 3183.
  • the first evaluation unit 3183 evaluates that the posture is stable if the absolute value of the signal value is less than the threshold, and that the posture is unstable if the absolute value of the signal value exceeds the threshold.
  • the second evaluation unit 3185 may evaluate the posture stability based on whether the absolute value of the difference value calculated by the difference value calculation unit 3162 is less than the peak threshold stored in the storage unit 302. For example, if the absolute value of the difference between the peaks of the signals of the postures repeated every predetermined time is less than the threshold, the second evaluation unit 3185 determines that the repeated postures are the same and regards the posture as stable. If the absolute value of the difference between the peaks is equal to or greater than the threshold, it determines that the repeated postures differ and regards the posture as unstable. This makes it possible to appropriately evaluate the stability of the predetermined posture repeated during trunk training.
  • the peak threshold value may be set for each signal.
  • the score calculation unit 3187 may calculate a score indicating the stability of the posture according to the number of times that the absolute value of the signal value or the absolute value of the difference value is equal to or greater than the threshold value. For example, the score calculation unit 3187 may calculate the score by subtracting the number of times that the absolute value is equal to or greater than the threshold value from the full score of 100 points. Thereby, it is possible to provide the user with an evaluation index that can be easily grasped with respect to the user's posture.
  • the second determination unit 3189 uses the signal value from the sensor to determine whether the user is actually performing the predetermined training. For example, if the signal value stays close to 0 with little fluctuation, or does not move from around a predetermined value, even during the predetermined time in which a predetermined posture should be maintained, the second determination unit 3189 determines that training is not in progress.
  • the score calculation unit 3187 may reflect in the score the determination result by the second determination unit 3189 as to whether the user is performing the predetermined training. For example, when it is determined from the sensor signal that training is not being performed, the score calculation unit 3187 decreases the score from its current value by a predetermined amount. The score calculation unit 3187 may increase this amount as the period determined to be outside training becomes longer. Reflecting this determination in the score can, for example, give the user an incentive to continue the repeated training.
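The two scoring rules above, subtracting threshold violations from a full score of 100 and deducting an extra penalty for time judged to be outside training, can be sketched as follows. The exact arithmetic and the clamp at zero are assumptions, not stated in the disclosure.

```python
def posture_score(violation_count, not_training_penalty=0, full_score=100):
    """Subtract from the full score the number of times the absolute
    value of the signal (or of the peak difference) met or exceeded its
    threshold, plus an optional penalty for periods judged to be outside
    training; never drop below zero."""
    return max(0, full_score - violation_count - not_training_penalty)

print(posture_score(22))   # 78
print(posture_score(150))  # 0: the score is clamped rather than going negative
```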
  • each evaluation result by the first evaluation unit 3183 and the second evaluation unit 3185, and the score calculated by the score calculation unit 3187 are displayed on the display screen by the display control unit 322. Thereby, a more detailed evaluation result can be notified to the user.
  • the specifying unit 314 specifies one or more trainings for a predetermined attribute based on training information that is stored in the storage unit 302 and that associates one or more attributes with each training.
  • the type, rest time, required time, predetermined part, purpose, and the like are associated as attributes. The training information will be described later with reference to FIG.
  • the generation unit 320 generates a training menu by combining a predetermined number of trainings from the one or more trainings specified by the specifying unit 314. For example, when extracting trainings, the generation unit 320 uses the required time of each training so that the menu as a whole is shorter than a predetermined time, or uses the type of each training so that trainings of only one type are not extracted. Because the generation unit 320 thus generates a menu by appropriately combining a plurality of trainings, the user can continue training without getting bored.
  • FIG. 7 is a diagram illustrating an example of functions of the generation unit 320.
  • the generation unit 320 illustrated in FIG. 7 includes a first generation unit 3202 and a second generation unit 3204.
  • the first generation unit 3202 generates a menu from the received predetermined purpose
  • the second generation unit 3204 generates a menu from the received predetermined part.
  • the accepting unit 312 accepts information indicating a predetermined purpose related to training (hereinafter also referred to as purpose-specific ID).
  • the predetermined purpose is, for example, “to train the basic trunk”, “to make the walking posture clean”, “easy to lose weight”, or the like.
  • the storage unit 302 stores purpose information in which purpose-specific IDs and purpose names are associated (for example, see FIG. 10).
  • the specifying unit 314 specifies a plurality of trainings for the predetermined purpose indicated by the received purpose-specific ID based on the training information associated with the attributes including the purpose and type regarding the training.
  • the types of training include, for example, “stretch”, “strengthening”, “balance interlock”, and the like.
  • the first generation unit 3202 extracts each type of training from the plurality of trainings specified by the specifying unit 314 so that the ratio of each type becomes a predetermined rate, and performs each type of training extracted. Combine to generate a training menu.
  • the first generation unit 3202 extracts trainings so that the ratio of the "stretch", "strengthening", and "balance interlock" types is 3:2:1. More specifically, the first generation unit 3202 randomly extracts three "stretch" trainings, two "strengthening" trainings, and one "balance interlock" training from the specified trainings, and combines them to generate a menu.
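The 3:2:1 random extraction can be sketched as below. The candidate training names other than those mentioned in the text, and the function name, are illustrative placeholders.

```python
import random

def generate_menu(trainings_by_type,
                  ratio=(("stretch", 3), ("strengthening", 2), ("balance interlock", 1))):
    """Randomly draw trainings of each type in the given proportion,
    then shuffle the result into a menu."""
    menu = []
    for kind, count in ratio:
        menu.extend(random.sample(trainings_by_type[kind], count))
    random.shuffle(menu)
    return menu

candidates = {
    "stretch": ["Standing Rotation", "Agura Rotation", "Cat Stretch", "Side Bend"],
    "strengthening": ["Plank", "Lunge", "Side Plank"],
    "balance interlock": ["Single-Leg Stand", "Bird Dog"],
}
print(len(generate_menu(candidates)))  # 6 trainings: 3 + 2 + 1
```

A lottery exception flag, as in FIG. 9, could be honored by rejecting and redrawing any sample that contains two mutually exclusive trainings.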
  • because the training menu is automatically generated at random, the same menu is not produced every time even if the same purpose is selected, so the user can continue training without feeling bored.
  • the accepting unit 312 accepts information indicating a predetermined part related to the user's body (hereinafter also referred to as a part ID). For example, the reception unit 312 receives a part ID from another application or receives a part ID based on a user operation.
  • the predetermined part is, for example, the back, the abdomen, the ankle, or the like.
  • the other application is, for example, an application that supports running or walking. When this other application can detect a site that is a weak point of running or walking, the reception unit 312 receives a site ID from this other application.
  • Other applications may be installed in an external device or installed in the same device.
  • the specifying unit 314 specifies a plurality of trainings for a predetermined part indicated by the part ID based on training information associated with an attribute including one or a plurality of parts related to the user's body.
  • the identifying unit 314 refers to the training information stored in the storage unit 302 and identifies the training associated with a predetermined part (for example, “back”).
  • the second generation unit 3204 generates a training menu by combining a predetermined number of trainings from among the plurality of trainings specified by the specifying unit 314. For example, the second generation unit 3204 generates a menu by randomly extracting a predetermined number from a plurality of trainings associated with “back”. Thereby, a training menu can be automatically generated for a predetermined part that the user wants to train or a predetermined part notified from another application. Further, the first generation unit 3202 and the second generation unit 3204 may extract the training in consideration of the time required for each training and other attributes.
  • each menu generated by the first generation unit 3202 and the second generation unit 3204 is displayed on the display screen by the display control unit 322. Thereby, the user can grasp the generated training menu.
  • FIG. 8 is a diagram illustrating an example of threshold information.
  • training identification information (for example, a training ID)
  • a training name
  • parameters specifying the comparison targets (judgeFactor and judgePeak)
  • threshold values for the signals and their peaks
  • judgeFactor is represented by 6 digits of 0 or 1, corresponding, in order from the left, to the first signal (thr_accX), the second signal (thr_accY), ..., and the sixth signal. "0" indicates that the signal is not a comparison target, and "1" indicates that it is. For example, a judgeFactor of "000110" indicates that the fourth signal and the fifth signal are used as comparison targets.
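Decoding such a bitmask is straightforward. The sketch below assumes the six columns follow the thr_accX, thr_accY, ... naming pattern of FIG. 8; the roll and yaw column names are assumptions.

```python
# Column names for the six signals, left to right (roll/yaw names assumed).
SIGNALS = ["thr_accX", "thr_accY", "thr_accZ", "thr_pitch", "thr_roll", "thr_yaw"]

def comparison_targets(judge_factor):
    """Return the names of the signals flagged as comparison targets,
    reading the 6-digit judgeFactor string left to right."""
    return [name for name, flag in zip(SIGNALS, judge_factor) if flag == "1"]

print(comparison_targets("000110"))  # ['thr_pitch', 'thr_roll']: the 4th and 5th signals
```

The judgePeak mask described below can be decoded the same way against the peak columns.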
  • the judgeFactor is used by the first evaluation unit 3183. The numerical values set in each signal column shown in FIG. 8 indicate the thresholds. For example, the threshold for the first signal (thr_accX) of ID "4" is "2".
  • judgePeak is represented by 6 digits of 0 or 1, corresponding, in order from the left, to the peak of the first signal, the peak of the second signal, ..., and the peak of the sixth signal (thr_yaw_pk). "0" indicates that the peak is not a comparison target, and "1" indicates that it is. For example, a judgePeak of "010010" indicates that the peak of the second signal and the peak of the fifth signal are used as comparison targets. The judgePeak is used by the second evaluation unit 3185. The numerical values set in each peak column shown in FIG. 8 indicate the thresholds. For example, the threshold for the peak of the sixth signal of ID "3" is "2".
  • the threshold value of the fourth signal (thr_pitch) is “1”, and the threshold value between peaks is not applied.
  • the training name "Standing Rotation", the judgeFactor "111000", and the judgePeak "000001" are associated with training No. 3.
  • the threshold value of the first signal (thr_accX) is “1”
  • the threshold value of the second signal (thr_accY) is “1”
  • the threshold value of the third signal (thr_accZ) is “1”.
  • the specifying unit 314 can easily specify a signal to be compared and a threshold for training by referring to the judgeFactor and the judgePeak and the threshold.
  • FIG. 9 is a diagram showing an example of training information.
  • training identification information (for example, a training ID)
  • the training name
  • the type (effect)
  • a lottery exception flag
  • the rest time
  • the number of times
  • the required time
  • predetermined parts (for example, the back or the side of the trunk)
  • a purpose-specific ID
  • the type indicates the type or effect of training. Types include, for example, stretching, strengthening, and balance interlocking.
  • the lottery exception flag indicates the number of a training that is not to be executed in the same menu. For example, "Agura Rotation" with training ID "5" is not included in the training menu at the same time as "Standing Rotation" with training ID "3"; both are the same type of rotation exercise among the stretching trainings.
  • the rest time indicates how long the user holds the posture still during training. For example, a rest time of "3" indicates that the posture is maintained for 3 seconds.
  • the number of times indicates the number of times a predetermined posture is repeated in one training. For example, when the number of times is “3”, this indicates that the posture is repeated three times.
  • the required time indicates the time required for one training.
  • the predetermined part indicates a human part effective for the training.
  • the part name is used as the part ID, but a number or the like may be used.
  • one training may be effective for a plurality of parts. "○" is set for a part having an effect, and "×" for a part having no effect.
  • the purpose-specific ID is information for identifying the purpose of the training. As will be described later with reference to FIG. 10, purpose information associated with the purpose of training is stored in the storage unit 302 for each purpose ID. For example, the purpose-specific ID “1” is “train the basic trunk”, and the purpose-specific ID “2” is “clean walking posture”.
  • the specifying unit 314 can thus specify the predetermined training using the purpose-specific ID, and can also specify the predetermined training using the predetermined part.
  • FIG. 10 is a diagram showing an example of the purpose information.
  • a training purpose name is associated with each purpose ID.
  • the purpose ID “1” is associated with the purpose name “Train the basic trunk”.
  • FIG. 11 is a diagram showing an example of evaluation result information for the user's posture.
  • a score evaluated in each training, a signal value of each signal, and a threshold value of a peak value are associated with a predetermined user.
  • the user “AAA” has scored “78” for the training “Lunge”. As shown in FIG. 11, for each training, a score indicating an evaluation regarding the posture is associated.
  • FIG. 12 is a diagram illustrating a specific example of threshold comparison for a predetermined signal.
  • the fluctuation of one signal S1 among the first to sixth signals is shown.
  • T1: 1st set (right) guidance start, T2: 1st set (right) ready posture start, T3: 1st set (right) posture stationary start, T4: 1st set (right) posture stationary end, T5: 1st set (right) posture end and 1st set (left) guidance start, T6: 1st set (left) ready posture start, T7: 1st set (left) posture stationary start, T8: 1st set (left) posture stationary end, T9: 1st set (left) posture end and 2nd set (right) guidance start, T10: 2nd set (right) ready posture start, T11: 2nd set (right) posture stationary start, ...
  • the first determination unit 3181 determines whether or not the signal S1 is included in a predetermined range between the corresponding threshold value Th1 and threshold value Th2.
  • the threshold Th1 is the reference value plus a threshold set in the storage unit 302, and the threshold Th2 is the reference value minus that threshold.
  • the reference value is, for example, an average value of signal values within the moving time window.
  • the first determination unit 3181 determines that the values V1, V2, V3, and V4 are not included in the predetermined range in the section T3-T4. In this case, the first evaluation unit 3183 evaluates that the posture with respect to these values is not stable.
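The range determination described above can be sketched as follows. This is a minimal illustration assuming the reference value is the mean of the signal within the moving time window and the stored threshold is a single offset added to and subtracted from that reference; all function and variable names are illustrative, not taken from the publication.

```python
def reference_value(window):
    """Reference value: average of the signal values inside the moving time window."""
    return sum(window) / len(window)

def stability_check(samples, window, offset):
    """Return the samples falling outside the predetermined range [Th2, Th1].

    Th1 = reference + offset and Th2 = reference - offset; any sample
    outside that range corresponds to a posture judged as not stable.
    """
    ref = reference_value(window)
    th1, th2 = ref + offset, ref - offset
    return [v for v in samples if not (th2 <= v <= th1)]

window = [0.9, 1.0, 1.1, 1.0]        # recent signal values; reference = 1.0
samples = [1.02, 1.35, 0.95, 0.55]   # values observed during the stationary period
unstable = stability_check(samples, window, offset=0.2)  # -> [1.35, 0.55]
```

The values returned by `stability_check` play the role of V1..V4 in the description: they are the ones the first evaluation unit would evaluate as an unstable posture.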
  • the threshold Th1 and the threshold Th2 may be set using the threshold stored in the storage unit 302 for each training.
  • the determination of whether or not the value of the signal S1 falls within the predetermined range is not limited to the posture stationary time, and may be applied to, for example, the posture preparation time (T2-T3, etc.).
  • the difference value calculation unit 3162 calculates a difference value between the peak value P1 near T3 and the peak value P2 near T11 with respect to posture stability in repeated postures.
  • the first determination unit 3181 determines whether or not the difference value between the peaks (P1 − P2) is less than a threshold. If the difference is less than the threshold, the second evaluation unit 3185 evaluates that the repeated postures are the same and stable; if the difference is equal to or greater than the threshold, it evaluates that the repeated postures differ and are not stable.
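This second determination is a single comparison; here is a hedged sketch with illustrative numbers (the peak values and the threshold are not taken from the publication).

```python
def repeated_posture_stable(p1, p2, threshold):
    """Stable if the absolute difference between corresponding peaks
    of repeated postures (e.g. P1 near T3, P2 near T11) is under the threshold."""
    return abs(p1 - p2) < threshold

same_posture = repeated_posture_stable(0.82, 0.85, threshold=0.10)  # difference 0.03 -> stable
drifted      = repeated_posture_stable(0.82, 1.10, threshold=0.10)  # difference 0.28 -> not stable
```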
  • when the posture is evaluated as not stable, the score calculation unit 3187 may subtract a preset value from the current score.
  • FIG. 13 is a diagram illustrating an example of the setting screen.
  • the screen shown in FIG. 13 is a screen that is displayed, for example, in an initial stage after the application is started.
  • on this screen, the user selects either the “Training by purpose” button B10 or the “Training from another application” button B12.
  • when the button B10 is pressed, the reception unit 312 notifies the display control unit 322 of information indicating this button press, and the display control unit 322 performs control to display a screen on which the user can select a purpose (for example, see FIG. 14).
  • when the button B12 is pressed, the reception unit 312 notifies the display control unit 322 of information indicating this button press, and the display control unit 322 performs control to display a screen on which the user can select another application (for example, see FIG. 19).
  • FIG. 14 is a diagram showing an example of the purpose selection screen.
  • the screen shown in FIG. 14 is a screen on which buttons for selecting one purpose from one or a plurality of purposes are displayed. For example, a “Train the basic trunk” button B20 indicating training for training the basic trunk and an “I want to clean my walking posture” button B22 indicating training for improving the walking posture are displayed.
  • the receiving unit 312 detects the pressing of the button B20 and notifies the specifying unit 314 of the purpose-specific ID “1” for training the basic trunk.
  • the specifying unit 314 acquires the purpose-specific ID “1”, refers to the training information illustrated in FIG. 9, and specifies a plurality of trainings corresponding to the purpose-specific ID “1”.
  • the generation unit 320 extracts a predetermined number of trainings by random lottery using random numbers from the training specified by the specifying unit 314, for example, and generates a training menu.
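The random lottery can be sketched as follows. Honoring the lottery exception flag during the draw reflects the rule described for FIG. 9, while the data shapes and the function name are assumptions for illustration.

```python
import random

def draw_menu(candidates, exclusions, count, rng=random):
    """Randomly draw `count` trainings, skipping any training excluded
    by an already-drawn one (the lottery exception flag).

    candidates: list of training IDs; exclusions: dict mapping an ID to
    the set of IDs that must not appear in the same menu.
    """
    pool, menu = list(candidates), []
    rng.shuffle(pool)
    for tid in pool:
        if len(menu) == count:
            break
        if any(tid in exclusions.get(chosen, set()) for chosen in menu):
            continue
        menu.append(tid)
    return menu

exclusions = {3: {5}, 5: {3}}   # "Standing Rotation" vs "Agura Rotation"
menu = draw_menu([1, 2, 3, 4, 5], exclusions, count=4, rng=random.Random(0))
```

Because ID 5 lists ID 3 as a lottery exception and vice versa, a menu generated this way never contains both.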
  • FIG. 15 is a diagram showing an example of a training menu display screen.
  • four trainings of “stretch” TR10, “position” TR12, “lift” TR14, and “up” TR16 are extracted by the generation unit 320.
  • the screen shown in FIG. 15 displays a required time of 10 minutes when all four trainings are performed.
  • each training can be selected in a button format.
  • when a training is selected and the “Start training” button is pressed, the training starts.
  • the accepting unit 312 accepts that the user has pressed the “up” TR16 button, and then accepts that the “start training” button has been pressed.
  • the display control unit 322 displays training guidance corresponding to the “up” TR16 on the display screen, and the specifying unit 314 specifies the signal to be compared and the threshold corresponding to the “up” TR16 by referring to the threshold information shown in FIG. 8.
  • FIG. 16 is a diagram showing an example of a training guidance screen for the “up” TR16.
  • guidance for “stretching the iliopsoas muscle”, which is the training corresponding to “up”, is displayed.
  • the screen shown in FIG. 16 displays “30 seconds” as the required time and the number of sets “3set”.
  • the display control unit 322 controls to display the training screen according to a preset procedure.
  • the procedure is, for example: (1) guidance start (preparation posture), (2) count for the transition to the posture, (3) posture stationary, and (4) count for returning to the preparation posture.
  • the procedures (2) to (4) are repeated for the number of sets.
  • the evaluation unit 318 evaluates the posture of the user as described above by using the signal sensed by the 6-axis sensor 206 mounted on the eyewear 30 during (2) to (4).
  • the evaluation unit 318 may calculate a score as an evaluation index for posture.
  • FIG. 17 is a diagram showing an example of the score display screen.
  • the display control unit 322 performs control so that “76”, calculated by the score calculation unit 3187, is displayed on the screen as the score SC10 for the user's posture in the “iliopsoas muscle stretch” training.
  • FIG. 18 is a diagram illustrating an example of a training result screen.
  • a character is displayed, and a score is displayed for each part of the character.
  • for each part, the average of the scores of all the trainings performed by the user that are associated with that part is displayed.
  • for example, when trainings A and B are both associated with the abdomen, the score displayed for the abdomen is the average of the scores of trainings A and B.
  • the average score is calculated by the score calculation unit 3187.
  • FIG. 19 is a diagram illustrating an example of an application selection screen.
  • the screen shown in FIG. 19 is a screen on which buttons for selecting one application from one or a plurality of applications are displayed. For example, a “RUN” button B30 and a “WALK” button B32, each indicating another application, are displayed.
  • the accepting unit 312 detects the pressing of the button B30, and acquires a part ID indicating the part “back of the thigh” from the application corresponding to “RUN” using a URL scheme or the like.
  • FIG. 20 is a diagram illustrating an example of a screen in which a predetermined part is notified from another application.
  • the screen shown in FIG. 20 indicates that “back of thigh” is notified from the running application.
  • the part notified from the running application is shown using the character, and a “training” button B40 for starting training is displayed.
  • the accepting unit 312 notifies the specifying unit 314 of the part ID indicating the part “back of the thigh”.
  • the specifying unit 314 acquires the part ID, refers to the training information illustrated in FIG. 9, and specifies a plurality of trainings corresponding to the part “back of the thigh” indicated by the part ID.
  • the generation unit 320 extracts a predetermined number of trainings by random lottery using random numbers from the training specified by the specifying unit 314, for example, and generates a training menu.
  • FIG. 21 is a diagram showing an example of a training menu display screen.
  • here, six trainings, “Lunge stretch” TR20, “Hamstring stretch” TR22, “Iliopsoas muscle stretch” TR24, “Plank position (with knee)” TR26, “Back-of-thigh squat” TR28, and “One-leg one-hand raise” TR30, are extracted by the generation unit 320.
  • the screen shown in FIG. 21 displays a required time of 11 minutes when all six trainings are performed.
  • each training can be selected in a button format.
  • when a training is selected and the “Start training” button is pressed, the training starts.
  • the generation unit 320 may generate a training menu so that the types of training appear in a predetermined ratio. For example, when the ratio of stretch:strengthening:balance is 3:2:1, the generation unit 320 extracts three trainings of the type “stretch” (“Lunge stretch” TR20, “Hamstring stretch” TR22, and “Iliopsoas muscle stretch” TR24), two trainings of the type “strengthening” (“Plank position (with knee)” TR26 and “Back-of-thigh squat” TR28), and one training of the type “balance” (“One-leg one-hand raise” TR30), and automatically generates the menu.
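One way to realize the 3:2:1 extraction is sketched below in Python; the helper name, the grouping of trainings by type, and the training names are illustrative assumptions.

```python
import random

def extract_by_ratio(trainings_by_type, ratio, rng=random):
    """Pick trainings so the menu follows a preset type ratio,
    e.g. stretch:strengthen:balance = 3:2:1."""
    menu = []
    for kind, n in ratio.items():
        menu += rng.sample(trainings_by_type[kind], n)  # n random picks of this type
    return menu

trainings_by_type = {
    "stretch":    ["Lunge stretch", "Hamstring stretch", "Iliopsoas stretch", "Agura rotation"],
    "strengthen": ["Plank position (with knee)", "Back-of-thigh squat", "Side plank"],
    "balance":    ["One-leg one-hand raise", "Single-leg stand"],
}
menu = extract_by_ratio(trainings_by_type,
                        {"stretch": 3, "strengthen": 2, "balance": 1},
                        rng=random.Random(1))
# 6 trainings in total: 3 stretches, 2 strengthening, 1 balance
```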
  • a predetermined procedure is displayed on the screen as described above, and the user performs training according to the procedure.
  • the evaluation unit 318 evaluates the posture of the user during training.
  • FIG. 22 is a flowchart illustrating an example of overall processing of an application that executes trunk training in the embodiment.
  • the flowchart shown in FIG. 22 is a process performed when the user wearing the eyewear 30 operates the information processing apparatus 10 and touches the icon of the application described above.
  • the connection between the processing device 20 and the information processing apparatus 10 may be established in advance.
  • in step S102 shown in FIG. 22, the control unit 306 determines, based on the user operation, whether a purpose of training or another application has been selected. For example, the control unit 306 determines whether the user has touched the “Training by purpose” button B10 or the “Training from another application” button B12 shown in FIG. 13. If a purpose is selected, the process proceeds to step S104; if another application is selected, the process proceeds to step S108.
  • in step S104, the display control unit 322 displays a plurality of purposes on the screen and allows the user to select one purpose.
  • the accepting unit 312 accepts the selected purpose. For example, it detects which of the “Train the basic trunk” button B20 and the “I want to clean my walking posture” button B22 shown in FIG. 14 has been pressed by the user, and acquires information indicating the purpose corresponding to the pressed button.
  • in step S106, when the generation unit 320 acquires the information indicating the purpose from the accepting unit 312, the generation unit 320 refers to the training information illustrated in FIG. 9 and generates a menu by combining a predetermined number of trainings from among the trainings associated with the purpose. At this time, for example, the training menu screen shown in FIG. 15 is displayed. The menu generation process according to the purpose will be described later with reference to FIG. 23.
  • in step S108, the display control unit 322 displays a plurality of other applications on the screen and allows the user to select one application.
  • the accepting unit 312 accepts the selected application. For example, it detects which of the “RUN” button B30 and the “WALK” button B32 shown in FIG. 19 has been pressed by the user, and acquires information indicating the application corresponding to the pressed button.
  • in step S110, when the generation unit 320 acquires the information indicating the application from the accepting unit 312, the generation unit 320 acquires information indicating a predetermined part from that application using a URL scheme or the like. At this time, as shown in FIG. 20, the acquired predetermined part may be notified to the user.
  • the generation unit 320 refers to the training information illustrated in FIG. 9 and generates a menu by combining a predetermined number of trainings from among the trainings associated with the predetermined part acquired from the other application. At this time, for example, the training menu screen shown in FIG. 21 is displayed. The menu generation process according to the predetermined part will be described later with reference to FIG. 24.
  • in step S112, the control unit 306 determines whether training has started. For example, the control unit 306 determines whether the button for starting training displayed on the screen has been pressed by the user. If training has started (step S112: YES), the process proceeds to step S114; if training has not started (step S112: NO), the process returns to step S112.
  • in step S114, the control unit 306 starts the training process.
  • the control unit 306 evaluates the posture during training using the signal sensed by the six-axis sensor 206 mounted on the eyewear 30. This training process will be described later with reference to FIG. 25.
  • in step S116, the display control unit 322 performs control to display the training result on the screen. At this time, for example, the training result screen shown in FIG. 18 is displayed.
  • FIG. 23 is a flowchart showing an example of a menu generation process according to the purpose.
  • first, the identifying unit 314 refers to the training information illustrated in FIG. 9 and identifies one or a plurality of trainings based on the accepted purpose.
  • in step S204, the first generation unit 3202 extracts trainings so that the ratio of the types becomes a predetermined ratio, based on the types of the identified trainings. For example, when the types of training are A, B, and C, trainings are randomly extracted so that the ratio becomes 1:1:1.
  • the predetermined ratio can be set as appropriate.
  • in step S206, the first generation unit 3202 generates a menu by combining the extracted trainings.
  • at this time, the first generation unit 3202 may determine the order of the trainings according to a predetermined criterion, such as the time required for each training or the same training type not continuing.
  • in this way, a training menu suited to the purpose is generated automatically simply by the user selecting the purpose of training the trunk. Even if the same purpose is selected repeatedly, the trainings are randomly extracted, so the menu is not the same every time and the user can continue training without getting bored.
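The ordering criterion “the same training type not continuing” can be realized with a simple greedy pass; the sketch below is one possible realization under that assumption, not the method fixed by this publication.

```python
def order_no_repeat(menu):
    """Greedily order (name, type) pairs so the same training type
    never appears twice in a row, when that is avoidable."""
    remaining, ordered = list(menu), []
    while remaining:
        for i, (name, kind) in enumerate(remaining):
            if not ordered or ordered[-1][1] != kind:
                ordered.append(remaining.pop(i))
                break
        else:                         # only same-type items left; accept a repeat
            ordered.append(remaining.pop(0))
    return ordered

menu = [("Lunge stretch", "stretch"), ("Hamstring stretch", "stretch"),
        ("Plank", "strengthen"), ("One-leg raise", "balance")]
ordered = order_no_repeat(menu)
types = [kind for _, kind in ordered]  # no two consecutive equal types here
```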
  • FIG. 24 is a flowchart showing an example of menu generation processing corresponding to a predetermined part.
  • first, the reception unit 312 acquires information indicating a predetermined part from another application using a URL scheme or the like.
  • next, the identifying unit 314 refers to the training information illustrated in FIG. 9 and identifies one or more trainings based on the received predetermined part. For example, the identifying unit 314 identifies the trainings associated with the predetermined part as being effective for it.
  • in step S204, the second generation unit 3204 randomly extracts a predetermined number of trainings from the identified trainings, and generates a menu by combining the extracted trainings.
  • at this time, the second generation unit 3204 may determine the order of the trainings according to a predetermined criterion, such as the time required for each training or the same training type not continuing.
  • in this way, a training menu suited to the predetermined part notified from another application can be generated automatically. Even if the same part is notified repeatedly, the trainings are randomly extracted, so the menu is not the same every time and the user can continue training without getting bored.
  • the process shown in FIG. 24 may also be executed when the user inputs a part that he or she wants to train.
  • FIG. 25 is a flowchart showing an example of the training process.
  • first, the display control unit 322 performs control to display a screen that prompts the user to take the posture before training (also referred to as the preparation posture).
  • the evaluation unit 318 may also evaluate the preparation posture based on the signal from the sensor.
  • in step S404, the display control unit 322 performs control to display a screen that prompts the user to maintain the posture during training (also referred to as the stationary posture).
  • the evaluation unit 318 evaluates the stationary posture based on the signal from the sensor.
  • in step S406, the score calculation unit 3187 scores the posture during the training using the threshold for that training, and calculates a score.
  • by using only the signal corresponding to the training, processing of signals with low relevance can be omitted, which contributes to power saving.
  • in step S408, the control unit 306 determines whether all sets have been completed. If all sets have been completed (step S408: YES), the process proceeds to step S410; if not (step S408: NO), the process returns to step S402.
  • in step S410, the score calculation unit 3187 calculates the average score by summing the scores of the sets and dividing by the number of sets.
  • the calculated average score is displayed on the screen by the display control unit 322.
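The average-score computation of step S410 is a plain arithmetic mean over the sets; a minimal sketch with illustrative numbers:

```python
def average_score(set_scores):
    """Average score: total of the per-set scores divided by the number of sets."""
    return sum(set_scores) / len(set_scores)

avg = average_score([76, 80, 72])  # -> 76.0
```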
  • Each of the processing steps included in the processing flows described with reference to FIGS. 22 to 25 can be executed in any order or in parallel as long as no contradiction arises in the processing contents, and other steps may be added between them. Further, a step described as a single step for convenience can be executed as a plurality of divided steps, while steps described as divided into a plurality of steps for convenience can be grasped as a single step.
  • according to the embodiment, it is possible to appropriately evaluate the posture of the user in training in which a predetermined posture is maintained for a predetermined time.
  • in addition, by changing the signal used for each training, an evaluation suited to the training can be performed, and omitting unnecessary processing can further contribute to power saving.
  • furthermore, by automatically generating a training menu based on a predetermined purpose or part, a training menu suited to the user can be presented without burdening the user.
  • in the embodiment described above, the eyewear 30 has been described as glasses, but the eyewear is not limited to this. The eyewear may be any device related to the eye, and may be a face-worn or head-worn device such as glasses, sunglasses, goggles, or a head mounted display, or their frames.
  • the eyewear 30 may be provided with a biological electrode.
  • eye movement or blink may be detected based on an electrooculogram signal that can be obtained from the biological electrode.
  • each data that can be acquired from the 6-axis sensor 206 may be stored in association with the line-of-sight movement and the blink. This makes it possible to analyze blinks and line-of-sight movement during exercise.
  • in the embodiment, the detection data from the six-axis sensor 206 mounted on the eyewear 30 has been described, but the detection data from the six-axis sensor 111 mounted on the information processing apparatus 10 may be used instead to execute the application described in the example. That is, the six-axis sensor need not be mounted on the head and may be mounted at any position on the human body.
  • the 6-axis sensor is preferably attached to the trunk.
  • thereby, the posture during training relating to the trunk can be evaluated without making the user feel that wearing the sensor is bothersome.
  • the application described above has been explained using an example in which it is applied to trunk training, but the present invention can also be applied to other cases in which the effective sensor signal differs for each training. Similarly, the generation of training menus can be applied to any training that has a plurality of types or whose effective parts differ.

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Tools (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An information processing method according to disclosed technology is executed on a computer having a control unit and memory. The control unit executes: acceptance of information indicating prescribed training concerning the torso; identification of a threshold value with respect to the prescribed training on the basis of threshold information that is stored in the memory and in which a threshold value is associated with each training concerning the torso; acquisition of the value of a signal detected from a sensor mounted on a user to be evaluated; and evaluation of the posture of the user during the prescribed training on the basis of the result of comparing the signal value and the threshold value.

Description

Information processing method, information processing apparatus, and program

The present invention relates to an information processing method, an information processing apparatus, and a program.

Conventionally, a technique is known in which exercise content is determined using acceleration information from an acceleration sensor attached to the trunk, and the amount of exercise for each exercise content is calculated (see, for example, Patent Document 1).

JP 2006-271893 A

However, since the conventional technique basically calculates an amount of exercise, it is not assumed to apply it to training in which a predetermined posture is maintained for a predetermined time, such as training relating to the trunk. In addition, for training relating to the trunk it is preferable to evaluate the stability of the posture, but no conventional technique evaluates the stability of the posture in training relating to the trunk.

Therefore, the disclosed technique aims to enable an appropriate evaluation of a user's posture in training in which a predetermined posture is maintained for a predetermined time.

An information processing method according to an aspect of the disclosed technique is an information processing method executed by a computer having a control unit and a storage unit. The control unit executes: accepting information indicating a predetermined training relating to the trunk; specifying a threshold for the predetermined training based on threshold information stored in the storage unit, in which a threshold is associated with each training relating to the trunk; acquiring a signal value detected from a sensor worn by a user to be evaluated; and evaluating the posture of the user during the predetermined training based on a result of comparing the signal value with the threshold.

According to the disclosed technique, it is possible to appropriately evaluate the posture of the user in training in which a predetermined posture is maintained for a predetermined time.
FIG. 1 is a diagram showing an example of an information processing system in an example. FIG. 2 is a schematic configuration diagram showing a hardware configuration of an information processing apparatus in the example. FIG. 3 is a block diagram showing an example of a configuration of a processing device in the example. FIG. 4 is a block diagram showing an example of a configuration of the information processing apparatus in the example. FIG. 5 is a block diagram showing an example of functions of an acquisition unit in the example. FIG. 6 is a block diagram showing an example of functions of an evaluation unit in the example. FIG. 7 is a block diagram showing an example of a configuration of a generation unit in the example. FIG. 8 is a diagram showing an example of threshold information. FIG. 9 is a diagram showing an example of training information. FIG. 10 is a diagram showing an example of purpose information. FIG. 11 is a diagram showing an example of evaluation result information for a user's posture. FIG. 12 is a diagram showing a specific example of threshold comparison for a predetermined signal. FIG. 13 is a diagram showing an example of a setting screen. FIG. 14 is a diagram showing an example of a purpose selection screen. FIG. 15 is a diagram showing an example of a training menu display screen. FIG. 16 is a diagram showing an example of a training guidance screen. FIG. 17 is a diagram showing an example of a score display screen. FIG. 18 is a diagram showing an example of a training result screen. FIG. 19 is a diagram showing an example of an application selection screen. FIG. 20 is a diagram showing an example of a screen on which a predetermined part is notified from another application. FIG. 21 is a diagram showing an example of a training menu display screen. FIG. 22 is a flowchart showing an example of overall processing of an application that executes trunk training in the example. FIG. 23 is a flowchart showing an example of menu generation processing according to a purpose. FIG. 24 is a flowchart showing an example of menu generation processing according to a predetermined part. FIG. 25 is a flowchart showing an example of training processing.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. However, the embodiments described below are merely examples, and there is no intention to exclude various modifications and applications of techniques not explicitly described below. That is, the present invention can be implemented with various modifications without departing from its spirit. In the following description of the drawings, the same or similar parts are denoted by the same or similar reference numerals. The drawings are schematic and do not necessarily match actual dimensions, ratios, and the like. The drawings may also differ from one another in dimensional relationships and ratios.
[Example]

In the example, eyewear is taken as an example of an object on which an acceleration sensor and an angular velocity sensor are mounted. FIG. 1 is a diagram illustrating an example of an information processing system 1 in the example. The information processing system 1 shown in FIG. 1 includes an external device 10 and eyewear 30, which are connected via a network so that data communication is possible.
The eyewear 30 mounts the processing device 20 on a temple portion, for example. The processing device 20 includes a three-axis acceleration sensor and a three-axis angular velocity sensor (which may together be a six-axis sensor). The eyewear 30 may also have bioelectrodes 31, 33, and 35 on the pair of nose pads and the bridge portion, respectively. When bioelectrodes are provided on the eyewear 30, electrooculogram signals acquired from these bioelectrodes are transmitted to the processing device 20.

The installation position of the processing device 20 does not necessarily have to be the temple, but it may be positioned in consideration of the balance when the eyewear 30 is worn.

The external device 10 is an information processing device having a communication function. For example, the external device 10 is a portable communication terminal such as a mobile phone or smartphone carried by the user, a PC (Personal Computer), a tablet terminal, or the like. The external device 10 evaluates the posture during training relating to the trunk (hereinafter also referred to as trunk training) based on the sensor signals and the like received from the processing device 20. Hereinafter, the external device 10 will be referred to as the information processing apparatus 10.
<Hardware Configuration of Information Processing Apparatus 10>

FIG. 2 is a schematic configuration diagram showing the hardware configuration of the information processing apparatus 10 in the example. A typical example of the information processing apparatus 10 is a mobile phone such as a smartphone; in addition, general-purpose devices that can display a screen while processing data and communicating over a network, such as a portable terminal that can connect to a network wirelessly or by wire, or an electronic device equipped with a touch panel such as a tablet terminal, can also correspond to the information processing apparatus 10 in the embodiment.
The information processing apparatus 10 according to the embodiment includes, for example, a thin rectangular housing (not shown), and a touch panel 102 is provided on one surface of the housing. In the information processing apparatus 10, each component is connected to a main control unit 150. The main control unit 150 is, for example, a processor.
Connected to the main control unit 150 are a mobile communication antenna 112, a mobile communication unit 114, a wireless LAN communication antenna 116, a wireless LAN communication unit 118, a storage unit 120, a speaker 104, a microphone 106, a hard button 108, a hard key 110, and a six-axis sensor 111. A touch panel 102, a camera 130, and an external interface 140 are further connected to the main control unit 150. The external interface 140 includes an audio output terminal 142.
The touch panel 102 has the functions of both a display device and an input device, and is composed of a display (display screen) 102A responsible for the display function and a touch sensor 102B responsible for the input function. The display 102A is a general display device such as a liquid crystal display or an organic EL (Electro Luminescence) display. The touch sensor 102B comprises an element, disposed on the upper surface of the display 102A, for detecting contact operations, and a transparent operation surface stacked on that element. Any known contact detection method, such as a capacitive, resistive (pressure-sensitive), or electromagnetic induction method, can be adopted for the touch sensor 102B.
The touch panel 102, as a display device, displays images of an application (hereinafter also referred to as an app) generated by the main control unit 150 executing a program 122. The touch panel 102, as an input device, accepts operation input by detecting the movement of an object in contact with the operation surface (including a user's finger, a stylus, and the like; the case of a finger is described below as a representative example) and supplies information on the contact position to the main control unit 150. The movement of the finger is detected as coordinate information indicating the position or area of the contact point, and the coordinate information is expressed, for example, as coordinate values on two axes along the short-side and long-side directions of the touch panel 102.
The information processing apparatus 10 is connected to a network N through the mobile communication antenna 112 or the wireless LAN communication antenna 116 and can perform data communication with the processing device 20. The storage unit 120 stores the program 122; the storage unit 120 may also be separate from the information processing apparatus 10 and may be, for example, a recording medium such as an SD card or a CD-RAM.
<Configuration of Processing Device 20>
FIG. 3 is a block diagram illustrating an example of the configuration of the processing device 20 according to the embodiment. As illustrated in FIG. 3, the processing device 20 includes a processing unit 202, a transmission unit 204, a six-axis sensor 206, and a power supply unit 208. Each of the bioelectrodes 31, 33, and 35 is connected to the processing unit 202 by wire, for example via an amplification unit. The units of the processing device 20 need not all be provided in one temple and may instead be distributed across the pair of temples.
The six-axis sensor 206 consists of a three-axis acceleration sensor and a three-axis angular velocity sensor; these sensors may also be provided separately. The six-axis sensor 206 outputs the detected sensor signals to the processing unit 202.
The processing unit 202 is, for example, a processor; it processes the sensor signals obtained from the six-axis sensor 206 as necessary and outputs the result to the transmission unit 204. For example, the processing unit 202 generates the following six signals from the sensor signals of the six-axis sensor 206 and acquires the value of each signal.
・First signal: signal indicating the acceleration in the X-axis direction
・Second signal: signal indicating the acceleration in the Y-axis direction
・Third signal: signal indicating the acceleration in the Z-axis direction
・Fourth signal: signal indicating the pitch angle
・Fifth signal: signal indicating the roll angle
・Sixth signal: signal indicating the yaw angle
For example, the first signal indicates the acceleration in the front-rear direction of the head, the second signal indicates the acceleration in the lateral direction of the head, and the third signal indicates the acceleration in the vertical direction of the head. The fourth signal (pitch angle) indicates, for example, the up-down tilt of the head; the fifth signal (roll angle) indicates, for example, rotation about the head's forward axis; and the sixth signal (yaw angle) indicates the left-right rotation of the head. The fourth to sixth signals may be calculated using known techniques. Alternatively, the processing unit 202 may simply amplify the sensor signals obtained from the six-axis sensor 206.
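As one illustration of such a known technique (not the specific method of this embodiment), static pitch and roll can be estimated from the three acceleration signals alone; the function and parameter names below are assumptions introduced for this sketch.

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll in degrees from one 3-axis accelerometer
    sample (in gravity units), using the common static-tilt formula.
    The processing unit 202 may use any known technique, e.g. also
    fusing the angular velocity signals."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

Yaw cannot be recovered from acceleration alone; in practice it would be integrated from the gyroscope's yaw-rate signal or obtained from a sensor-fusion filter.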
The transmission unit 204 transmits the values of the first to sixth signals processed by the processing unit 202 to the information processing apparatus 10. For example, the transmission unit 204 transmits the sensor signals or other data to the information processing apparatus 10 by wireless communication such as Bluetooth (registered trademark) or wireless LAN, or by wired communication. The power supply unit 208 supplies power to the processing unit 202, the transmission unit 204, the six-axis sensor 206, and so on.
<Configuration of Information Processing Apparatus 10>
Next, the configuration of the information processing apparatus 10 will be described. FIG. 4 is a diagram illustrating an example of the configuration of the information processing apparatus 10 according to the embodiment. The information processing apparatus 10 includes a storage unit 302, a communication unit 304, and a control unit 306.
The storage unit 302 can be realized by, for example, the storage unit 120 shown in FIG. 2. The storage unit 302 stores data related to the trunk training application of the embodiment, for example: data received from the processing device 20, training information about the trainings, threshold information used to determine the user's posture during training, purpose information about the purposes of training, result information of the user's posture determinations, and screen information displayed on the screen.
The communication unit 304 can be realized by, for example, the mobile communication unit 114, the wireless LAN communication unit 118, and the like. The communication unit 304 receives data from, for example, the processing device 20. The communication unit 304 may also transmit data processed in the information processing apparatus 10 to a server (not shown). That is, the communication unit 304 functions as both a transmission unit and a reception unit.
The control unit 306 can be realized by, for example, the main control unit 150. The control unit 306 executes the trunk training application. The trunk training application of the embodiment provides guidance for trunk training and has functions for evaluating the user's posture during training and for automatically generating a training menu. To realize these functions, the control unit 306 includes a reception unit 312, an identification unit 314, an acquisition unit 316, an evaluation unit 318, a generation unit 320, and a display control unit 322. Hereinafter, the functions of the control unit 306 are described separately as a posture determination function and a menu generation function.
≪Posture determination function≫
The reception unit 312 receives, based on a user operation, information indicating a predetermined training related to the trunk (hereinafter also referred to as a training ID). For example, when the user's press on one of a plurality of trainings displayed on the display screen is detected, the reception unit 312 receives the training ID corresponding to that press. The reception unit 312 may also receive a training ID from another application on the same device or on an external device.
The identification unit 314 identifies the threshold for the predetermined training received by the reception unit 312 based on threshold information that is stored in the storage unit 302 and in which a threshold is associated with each training related to the trunk. For example, the identification unit 314 refers to the threshold information and identifies the threshold for the predetermined training indicated by the training ID received by the reception unit 312. The number of identified thresholds is not particularly limited; one or more thresholds may be set according to the content of the training. In the threshold information, the threshold of each signal used is associated with each training. The threshold information will be described later with reference to FIG. 8.
The acquisition unit 316 acquires the signal values detected by the sensor worn by the user who is the evaluation target and performs the predetermined training indicated by the training ID. For example, the acquisition unit 316 acquires the signal values used for posture determination directly, via the network, from the sensor worn by the user (for example, the six-axis sensor 206), or indirectly from signal values once stored in the storage unit 302.
The evaluation unit 318 evaluates the user's posture during the predetermined training based on the result of comparing the signal values acquired by the acquisition unit 316 with the thresholds identified by the identification unit 314. For example, the evaluation unit 318 evaluates the stability of the posture based on whether the absolute value of a signal value is less than the threshold. More specifically, the evaluation unit 318 determines that the posture is stable if the signal value falls within the predetermined range indicated by the threshold, and that the posture is not stable if the signal value falls outside that range. In this way, the evaluation unit 318 can evaluate the stability of the posture.
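The comparison just described can be sketched as follows; the function name and dictionary-based interface are illustrative assumptions, not the claimed implementation.

```python
def is_posture_stable(signal_values, thresholds):
    """Judge the posture stable only if every compared signal value lies
    inside the range (-threshold, +threshold) for its signal.
    The keys of `thresholds` name the signals being compared."""
    return all(abs(signal_values[name]) < limit
               for name, limit in thresholds.items())
```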
The display control unit 322 controls the display screen so that the evaluation result of the evaluation unit 318 is displayed. In this way, the evaluation of the user's posture during trunk training can be fed back to the user.
For posture determination, the identification unit 314 may also use threshold information in which one or more signals to be compared, and the thresholds of those signals, are associated with each training. For example, based on this threshold information, the identification unit 314 identifies both the comparison-target signals and their thresholds for the predetermined training received by the reception unit 312.
In this case, the acquisition unit 316 acquires, from the sensor worn by the user, the signal values of the comparison-target signals identified by the identification unit 314. For example, the acquisition unit 316 may acquire all signals (the first to sixth signals) from the sensor and retain only the identified signals, or it may notify the sensor of the identified signals and acquire only those signals from the sensor. In this way, when various trainings are performed, posture determination can use the signals suited to each training. Furthermore, since no processing is performed on signals unrelated to the training, power consumption can be reduced.
For posture determination, the identification unit 314 may also use threshold information in which a peak threshold for the sensor signals is further associated with each training. For example, based on this threshold information, the identification unit 314 further identifies the peak threshold of the comparison-target signal for the predetermined training.
In this case, the processing of the acquisition unit 316 includes calculating the difference between peaks of the comparison-target signal. FIG. 5 is a block diagram illustrating an example of the functions of the acquisition unit 316. The acquisition unit 316 illustrated in FIG. 5 includes a difference value calculation unit 3162. The difference value calculation unit 3162 calculates the difference between the peaks of a posture signal repeated every predetermined time. For example, when an exercise that maintains the same posture for a predetermined time is repeated a plurality of times, the difference value calculation unit 3162 obtains the peak value within each predetermined-time interval and successively calculates the difference between that peak value and the peak value of the next interval.
Next, the functions of the evaluation unit 318 will be described. FIG. 6 is a block diagram illustrating an example of the functions of the evaluation unit 318. The evaluation unit 318 illustrated in FIG. 6 includes a first determination unit 3181, a first evaluation unit 3183, a second evaluation unit 3185, a score calculation unit 3187, and a second determination unit 3189.
As described above, the first determination unit 3181 determines whether the value of a signal acquired during the predetermined training falls within the predetermined range represented by the threshold. In the embodiment, trainings with relatively small movements are assumed, so a signal value exceeding the predetermined range indicates a large movement.
When the signals to be compared differ from training to training, the first determination unit 3181 performs the comparison using the signal values of the signals corresponding to the training and the thresholds of those signals. The first determination unit 3181 outputs the comparison result to the first evaluation unit 3183.
Based on the comparison result acquired from the first determination unit 3181, the first evaluation unit 3183 evaluates the posture as stable if the absolute value of the signal value is less than the threshold, and as unstable if the absolute value of the signal value exceeds the threshold.
The second evaluation unit 3185 may further evaluate the stability of the posture based on whether the absolute value of the difference calculated by the difference value calculation unit 3162 is less than the peak threshold stored in the storage unit 302. For example, if the absolute value of the difference between the peaks of a posture signal repeated every predetermined time is less than the threshold, the second evaluation unit 3185 determines that the posture is the same across repetitions and regards the posture as stable. If the absolute value of the peak-to-peak difference is equal to or greater than the threshold, the second evaluation unit 3185 determines that the posture differs across repetitions and regards the posture as unstable. This makes it possible to appropriately evaluate the stability of a predetermined posture that is repeated in trunk training. The peak threshold may be set for each signal.
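The peak-to-peak evaluation performed by the difference value calculation unit 3162 and the second evaluation unit 3185 can be sketched as below; the function names and list-based interface are assumptions for illustration.

```python
def peak_differences(window_peaks):
    """Successive differences between the peak value of each
    repetition window and that of the next window."""
    return [nxt - cur for cur, nxt in zip(window_peaks, window_peaks[1:])]

def repetition_stability(window_peaks, peak_threshold):
    """For each pair of consecutive repetitions, True when the
    absolute peak difference stays below the peak threshold."""
    return [abs(d) < peak_threshold for d in peak_differences(window_peaks)]
```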
The score calculation unit 3187 may calculate a score indicating the stability of the posture according to the number of times the absolute value of a signal value or of a difference value becomes equal to or greater than the threshold. For example, the score calculation unit 3187 may calculate the score by subtracting, from a full score of 100 points, the number of times the absolute value becomes equal to or greater than the threshold. This provides the user with an easy-to-grasp evaluation index of the posture.
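The scoring rule can be sketched as follows; clamping at zero is an assumption added here, since the text only states that the exceedance count is subtracted from a full score of 100.

```python
def posture_score(exceed_count):
    """Score out of 100: one point deducted per threshold exceedance,
    clamped at 0 so the score never goes negative (an assumption)."""
    return max(0, 100 - exceed_count)
```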
The second determination unit 3189 determines, using the signal values from the sensor, whether the user is performing the predetermined training. For example, if, during the predetermined time in which a predetermined posture should be maintained in the received training, the signal value stays near zero with little fluctuation, or does not transition to around the expected value, the second determination unit 3189 determines that the user is not training.
In this case, when calculating the score, the score calculation unit 3187 may reflect in the score the determination result of the second determination unit 3189 as to whether the user is performing the predetermined training. For example, when it is determined from the sensor signals that the user is not training, the score calculation unit 3187 lowers the score from its current value by a predetermined amount. The score calculation unit 3187 may increase this predetermined amount the longer the user is determined not to be training. Reflecting such determination results in the score can, for example, motivate the user to keep performing the repeated training.
The evaluation results of the first evaluation unit 3183 and the second evaluation unit 3185, as well as the score calculated by the score calculation unit 3187, are displayed on the display screen by the display control unit 322. This allows more detailed evaluation results to be reported to the user.
≪Menu generation function≫
Returning to FIG. 4, the menu generation function will be described. First, the identification unit 314 identifies one or more trainings for a predetermined attribute based on training information that is stored in the storage unit 302 and in which one or more attributes are associated with each training. For example, when the reception unit 312 receives a predetermined attribute, the identification unit 314 identifies one or more trainings for that attribute. In the training information, attributes such as type, hold time, required time, predetermined body part, and purpose are associated with each training. The training information will be described later with reference to FIG. 9.
The generation unit 320 generates a training menu by combining a predetermined number of trainings from among the one or more trainings identified by the identification unit 314. For example, when extracting trainings, the generation unit 320 may use the required time of each training so that the total stays below a predetermined time, or may use the type of each training so that trainings of only a single type are not extracted. Because the generation unit 320 generates the menu by appropriately combining a plurality of trainings, the user can continue training without getting bored.
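One possible reading of these selection rules is sketched below; the tuple format, the time budget, and the rule of not picking the same type twice in a row are illustrative assumptions.

```python
import random

def generate_menu(candidates, max_total_minutes):
    """candidates: list of (training_id, required_minutes, kind) tuples.
    Randomly combine trainings so that the total required time stays
    within the budget and the same kind is never picked twice in a row."""
    pool = list(candidates)
    random.shuffle(pool)
    menu, total, last_kind = [], 0, None
    for training_id, minutes, kind in pool:
        if total + minutes <= max_total_minutes and kind != last_kind:
            menu.append(training_id)
            total += minutes
            last_kind = kind
    return menu
```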
Next, more detailed functions of the generation unit 320 will be described. FIG. 7 is a diagram illustrating an example of the functions of the generation unit 320. The generation unit 320 illustrated in FIG. 7 includes a first generation unit 3202 and a second generation unit 3204. The first generation unit 3202 generates a menu from a received predetermined purpose, and the second generation unit 3204 generates a menu from a received predetermined body part.
First, the case where a menu is generated from a predetermined purpose will be described. Here, the attributes include the purpose and type of training. The reception unit 312 receives information indicating a predetermined purpose of training (hereinafter also referred to as a purpose ID). The predetermined purpose is, for example, "train the basic trunk", "improve the walking posture", or "make it easier to lose weight". The storage unit 302 stores purpose information in which purpose IDs are associated with purpose names (see, for example, FIG. 10).
In this case, the identification unit 314 identifies a plurality of trainings for the predetermined purpose indicated by the received purpose ID, based on training information in which attributes including the purpose and type of training are associated with each training. The types of training are, for example, "stretch", "strengthening", and "balance-linked".
The first generation unit 3202 extracts trainings of each type from the plurality of trainings identified by the identification unit 314 so that the proportion of each type matches a predetermined ratio, and generates a training menu by combining the extracted trainings.
For example, the first generation unit 3202 extracts trainings so that the ratio of the types "stretch", "strengthening", and "balance-linked" is 3:2:1. More specifically, the first generation unit 3202 randomly extracts three "stretch" trainings, two "strengthening" trainings, and one "balance-linked" training from the identified trainings, and combines them into a menu.
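This 3:2:1 random extraction can be sketched as below; the dictionary interface and the English type names are assumptions introduced for illustration.

```python
import random

DEFAULT_RATIO = {"stretch": 3, "strengthening": 2, "balance-linked": 1}

def menu_by_purpose(trainings_by_kind, ratio=None):
    """trainings_by_kind maps each type to the candidate training IDs
    identified for the selected purpose; draw the stated mix at random."""
    menu = []
    for kind, count in (ratio or DEFAULT_RATIO).items():
        menu.extend(random.sample(trainings_by_kind[kind], count))
    return menu
```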
This makes it possible to automatically generate a training menu that matches the purpose of the user who wants to train the trunk. Furthermore, because the training menu is generated randomly, even when the same purpose is selected, the menu is not the same every time, so the user can continue training without getting bored.
Next, the case where a menu is generated from a predetermined body part will be described. Here, the attributes include a predetermined body part. The reception unit 312 receives information indicating a predetermined part of the user's body (hereinafter also referred to as a part ID). For example, the reception unit 312 receives a part ID from another application or based on a user operation. The predetermined part is, for example, the back, the abdomen, or an ankle. The other application is, for example, an application that supports running or walking. When the other application can detect a part that is a weak point in running or walking, the reception unit 312 receives the part ID from that application. The other application may be installed on an external device or on the same device.
The identification unit 314 identifies a plurality of trainings for the predetermined part indicated by the part ID, based on training information in which attributes including one or more parts of the user's body are associated with each training. For example, the identification unit 314 refers to the training information stored in the storage unit 302 and identifies the trainings associated with the predetermined part (for example, "back").
The second generation unit 3204 generates a training menu by combining a predetermined number of trainings from the plurality of trainings identified by the identification unit 314. For example, the second generation unit 3204 randomly extracts a predetermined number of trainings from the plurality of trainings associated with "back" and generates a menu. This makes it possible to automatically generate a training menu for a predetermined part the user wants to train or for a predetermined part notified by another application. When extracting trainings, the first generation unit 3202 and the second generation unit 3204 may also take into account the required time of each training and other attributes.
The menus generated by the first generation unit 3202 and the second generation unit 3204 are displayed on the display screen by the display control unit 322, allowing the user to see the contents of the automatically generated menu.
 <Data example>
 Next, examples of the various data used by the trunk-training application of the embodiment will be described. FIG. 8 is a diagram illustrating an example of threshold information. In the example illustrated in FIG. 8, a training name, parameters for specifying comparison targets (judgeFactor and judgePeak), and the threshold values of those parameters are associated with each piece of training identification information (for example, a training ID).
 judgeFactor is represented by six digits, each 0 or 1, corresponding in order from the left to the first signal (thr_accX), the second signal (thr_accY), ..., and the sixth signal. A "0" indicates that the signal is not a comparison target, and a "1" indicates that it is. For example, a judgeFactor of "000110" indicates that the fourth and fifth signals are used as comparison targets. judgeFactor is used by the first evaluation unit 3183. The numerical value set in each signal row in FIG. 8 is the threshold: for example, the threshold for the first signal (thr_accX) of ID "4" is "2".
 judgePeak is likewise represented by six digits, each 0 or 1, corresponding in order from the left to the peak of the first signal, the peak of the second signal, ..., and the peak of the sixth signal (thr_yaw_pk). Again, "0" indicates that the peak is not a comparison target and "1" indicates that it is. For example, a judgePeak of "010010" indicates that the peaks of the second and fifth signals are used as comparison targets. judgePeak is used by the second evaluation unit 3185. The numerical value in each peak row in FIG. 8 is the threshold: for example, the threshold for the peak of the sixth signal of ID "3" is "2".
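 As a minimal sketch of how the six-digit masks select comparison targets, the digits can be decoded into signal names as follows. The function name and the fifth and sixth base-signal names (thr_roll, thr_yaw) are assumptions for illustration; the embodiment itself names only thr_accX, thr_accY, thr_accZ, thr_pitch, and the sixth-signal peak thr_yaw_pk.

```python
# Hypothetical sketch of decoding a judgeFactor / judgePeak bitmask.
# Signal names for positions 5 and 6 are assumed, not taken from FIG. 8.
SIGNALS = ["thr_accX", "thr_accY", "thr_accZ", "thr_pitch", "thr_roll", "thr_yaw"]

def comparison_targets(mask):
    """Return the signal names whose digit in the 6-digit mask is '1'.

    Digits are read left to right: first signal, second signal, ..., sixth signal.
    """
    if len(mask) != 6 or any(c not in "01" for c in mask):
        raise ValueError("mask must be six digits of 0/1")
    return [name for name, bit in zip(SIGNALS, mask) if bit == "1"]

# judgeFactor "000110" selects the fourth and fifth signals
print(comparison_targets("000110"))  # ['thr_pitch', 'thr_roll']
```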
 For example, training number 1 is associated with the training name "iliopsoas stretch", a judgeFactor of "000110", and a judgePeak of "000000". In this example, the threshold for the fourth signal (thr_pitch) is "1", and no peak-to-peak threshold is applied.
 Training number 3 is associated with the training name "standing rotation", a judgeFactor of "111000", and a judgePeak of "000001". In this example, the thresholds for the first signal (thr_accX), the second signal (thr_accY), and the third signal (thr_accZ) are each "1", and the peak-to-peak threshold for the sixth signal (thr_yaw_pk) is "2".
 According to the threshold information shown in FIG. 8, the specifying unit 314 can easily specify the signals to be compared and their thresholds for a given training simply by referring to judgeFactor, judgePeak, and the associated threshold values.
 FIG. 9 is a diagram illustrating an example of training information. In the example illustrated in FIG. 9, a training name, a type (effect), a lottery exception flag, a rest time, a number of repetitions, a required time, predetermined parts (for example, back, trunk side, abdomen), and purpose IDs are associated with each piece of training identification information (for example, a training ID).
 The type indicates the kind or effect of the training, for example stretching, strengthening, or balance-linked. The lottery exception flag indicates the numbers of trainings that are not executed together. For example, "cross-legged rotation" with training ID "5" is never included in the same training menu as "standing rotation" with training ID "3", because both are rotation-type stretching trainings of the same kind.
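 The lottery exception flag can be honored during the random draw by skipping any candidate that conflicts with a training already chosen. The following is an illustrative sketch only; the training names other than those in FIG. 9 and the `draw_menu` function are assumptions, not the embodiment's implementation.

```python
import random

# Hypothetical table: each training lists the IDs it must not co-occur with.
TRAININGS = {
    3: {"name": "standing rotation", "exceptions": {5}},
    5: {"name": "cross-legged rotation", "exceptions": {3}},
    7: {"name": "plank", "exceptions": set()},
    9: {"name": "lunge", "exceptions": set()},
}

def draw_menu(candidates, count, rng=random):
    """Randomly pick up to `count` trainings, skipping any whose exception
    set intersects the IDs already chosen."""
    chosen = []
    pool = list(candidates)
    rng.shuffle(pool)
    for tid in pool:
        if len(chosen) == count:
            break
        if TRAININGS[tid]["exceptions"].intersection(chosen):
            continue  # excluded by a training already in the menu
        chosen.append(tid)
    return chosen

menu = draw_menu(TRAININGS.keys(), 3)
assert not (3 in menu and 5 in menu)  # IDs 3 and 5 never co-occur
```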
 The rest time indicates how long the user holds a posture during the training; for example, a rest time of "3" means the posture is held for three seconds. The number of repetitions indicates how many times the predetermined posture is repeated in one training; for example, "3" means the posture is repeated for three sets. The required time indicates the time one training takes.
 The predetermined parts indicate the parts of the body for which the training is effective. In the example illustrated in FIG. 9, part names are used as part IDs, but numbers or the like may be used instead. One training may be effective for a plurality of parts. A "○" is set for a part the training affects, and a "-" for a part it does not.
 The purpose ID is information for identifying the purpose of a training. As described later with reference to FIG. 10, purpose information that associates each purpose ID with a training purpose is stored in the storage unit 302. For example, purpose ID "1" is "strengthen the basic trunk" and purpose ID "2" is "improve walking posture".
 In the example illustrated in FIG. 9, training ID "4" is associated with the training name "standing side bend", the type "stretch", the lottery exception flag "6", the rest time "3", the number of repetitions "3", the required time "170", back "○", trunk side "○", abdomen "-", ..., and the purpose IDs "1, 2".
 According to the training information shown in FIG. 9, the specifying unit 314 can specify trainings either by purpose ID or by a predetermined part.
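 These two lookups amount to simple filters over the training information table. The rows, names, and field layout below are hypothetical stand-ins for the FIG. 9 table, used only to illustrate the idea:

```python
# Hypothetical rows mirroring the FIG. 9 training information.
TRAINING_INFO = [
    {"id": 4, "name": "standing side bend", "parts": {"back", "trunk side"}, "purposes": {1, 2}},
    {"id": 7, "name": "plank", "parts": {"abdomen"}, "purposes": {1}},
    {"id": 9, "name": "hamstring stretch", "parts": {"back of thigh"}, "purposes": {2}},
]

def by_purpose(purpose_id):
    """Trainings associated with a purpose ID (e.g. 1 = strengthen the basic trunk)."""
    return [t["id"] for t in TRAINING_INFO if purpose_id in t["purposes"]]

def by_part(part_id):
    """Trainings marked effective ("○") for the given part."""
    return [t["id"] for t in TRAINING_INFO if part_id in t["parts"]]

assert by_purpose(1) == [4, 7]
assert by_part("back") == [4]
```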
 FIG. 10 is a diagram illustrating an example of purpose information. In the example illustrated in FIG. 10, a purpose name is associated with each purpose ID. For example, purpose ID "1" is associated with the purpose name "strengthen the basic trunk".
 FIG. 11 is a diagram illustrating an example of evaluation result information on the user's posture. In the example illustrated in FIG. 11, the score evaluated for each training, the signal values of the respective signals, and the peak-value thresholds are associated with a given user.
 For example, the user "AAA" has obtained a score of "78" for the training "lunge". As shown in FIG. 11, a score indicating the evaluation of the posture is associated with each training.
 <Specific example of threshold comparison>
 FIG. 12 is a diagram illustrating a specific example of threshold comparison for a given signal. The example in FIG. 12 shows the variation of one signal S1 among the first to sixth signals, and assumes that the training proceeds in the following order:
 T1: start of guidance for the first set (right)
 T2: start of the preparatory posture for the first set (right)
 T3: start of holding the posture for the first set (right)
 T4: end of holding the posture for the first set (right)
 T5: end of the first set (right), and start of guidance for the first set (left)
 T6: start of the preparatory posture for the first set (left)
 T7: start of holding the posture for the first set (left)
 T8: end of holding the posture for the first set (left)
 T9: end of the first set (left), and start of guidance for the second set (right)
 T10: start of the preparatory posture for the second set (right)
 T11: start of holding the posture for the second set (right)
 ...
 At this time, the first determination unit 3181 determines whether the signal S1 falls within a predetermined range between the corresponding thresholds Th1 and Th2. Th1 is the reference value plus the threshold set in the storage unit 302, and Th2 is the reference value minus that threshold. The reference value is, for example, the average of the signal values within a moving time window. In the section T3-T4, the first determination unit 3181 determines that the values V1, V2, V3, and V4 fall outside the predetermined range, in which case the first evaluation unit 3183 evaluates the posture at those values as not stable. Note that separate thresholds Th1 and Th2 may be set in the storage unit 302 for each training period.
 The score calculation unit 3187 subtracts, from a preset value (for example, 100), the number of times the value of the signal S1 left the predetermined range. In this case, at the end of T4, the score calculation unit 3187 calculates the score "96 (= 100 - 4)". The determination of whether the value of the signal S1 falls within the predetermined range is not limited to the posture-holding time and may also be applied, for example, to the posture-preparation time (such as T2-T3).
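 The moving-window stability check and the subtraction-based score can be sketched as follows. This is an illustrative assumption of one possible implementation: the function name, the window parameter, and the choice of using the preceding samples as the baseline window are not specified by the embodiment.

```python
# Hypothetical sketch of the stability check and scoring described above.
def stability_score(samples, window, threshold, start_score=100):
    """Count samples outside [baseline - threshold, baseline + threshold],
    where baseline is the mean of the preceding `window` samples, and
    subtract that count from the starting score."""
    violations = 0
    for i, v in enumerate(samples):
        recent = samples[max(0, i - window):i] or [v]
        baseline = sum(recent) / len(recent)
        if abs(v - baseline) > threshold:
            violations += 1
    return start_score - violations

# A flat signal stays inside the range, so the full score is kept.
print(stability_score([1, 1, 1, 1], window=2, threshold=0.5))  # 100
```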
 For the stability of the posture across repetitions, the difference value calculation unit 3162 calculates the difference between the peak value P1 near T3 and the peak value P2 near T11. The first determination unit 3181 determines whether the peak-to-peak difference (P1 - P2) is less than the threshold. If the difference is less than the threshold, the second evaluation unit 3185 evaluates the repeated postures as similar and stable; if the difference is equal to or greater than the threshold, it evaluates the repeated postures as different and not stable. When the second evaluation unit 3185 evaluates the posture as not stable, the score calculation unit 3187 may subtract a preset value from the current score.
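 The peak-to-peak check reduces to a single comparison plus an optional penalty. The sketch below is an assumption for illustration; in particular, the penalty value 5 is illustrative, since the embodiment says only that a preset value is subtracted.

```python
# Hypothetical sketch of the peak-to-peak stability check described above.
def peaks_stable(p1, p2, threshold):
    """Repetitions are judged stable when the absolute difference between
    corresponding peaks is below the threshold."""
    return abs(p1 - p2) < threshold

def apply_peak_penalty(score, p1, p2, threshold, penalty=5):
    # penalty=5 is an illustrative value; the embodiment only says "preset".
    return score if peaks_stable(p1, p2, threshold) else score - penalty

print(apply_peak_penalty(96, 1.0, 1.5, 2))  # 96: difference 0.5 < 2, stable
print(apply_peak_penalty(96, 1.0, 4.0, 2))  # 91: difference 3.0 >= 2, penalized
```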
 <Screen example>
 Next, screen examples of the trunk-training application of the embodiment will be described. FIG. 13 is a diagram illustrating an example of the setting screen. The screen shown in FIG. 13 is displayed, for example, in the initial stage after the application is started, and lets the user select either the "trunk training by purpose" button B10 or the "training from another application" button B12.
 For example, when the user presses the button B10, the accepting unit 312 notifies the display control unit 322 of the button press, and the display control unit 322 displays a screen that lets the user select a purpose (see, for example, FIG. 14).
 Similarly, when the user presses the button B12, the accepting unit 312 notifies the display control unit 322 of the button press, and the display control unit 322 displays a screen that lets the user select another application (see, for example, FIG. 19).
 FIG. 14 is a diagram illustrating an example of the purpose selection screen. The screen shown in FIG. 14 displays buttons for selecting one purpose from one or more purposes. For example, a "train the basic trunk" button B20, indicating training for strengthening the basic trunk, and an "I want to improve my walking posture" button B22, indicating training for improving walking posture, are displayed.
 In the example shown in FIG. 14, suppose the user presses the "train the basic trunk" button B20. The accepting unit 312 detects the press of the button B20 and notifies the specifying unit 314 of the purpose ID "1" for training the basic trunk. The specifying unit 314 acquires the purpose ID "1", refers to the training information shown in FIG. 9, and specifies the plurality of trainings corresponding to that purpose ID.
 The generation unit 320 then, for example, extracts a predetermined number of trainings from those specified by the specifying unit 314 by random lottery using random numbers, and generates a training menu.
 FIG. 15 is a diagram illustrating an example of the training menu display screen. In the example shown in FIG. 15, the generation unit 320 has extracted four trainings: "stretch" TR10, "position" TR12, "lift" TR14, and "up" TR16. The screen in FIG. 15 also shows that performing all four trainings takes 10 minutes.
 On the screen shown in FIG. 15, each training can be selected as a button: when the user taps a training's display area, that training enters a selected state, and pressing the "start training" button in this state starts it. For example, suppose the accepting unit 312 accepts that the user has pressed the "up" TR16 button and then the "start training" button.
 At this time, the display control unit 322 displays the training guidance corresponding to "up" TR16 on the display screen, and the specifying unit 314 specifies the signals to be compared and their thresholds for "up" TR16 by referring to the threshold information shown in FIG. 8.
 FIG. 16 is a diagram illustrating an example of the training guidance screen for "up" TR16. The screen in FIG. 16 displays guidance for the "iliopsoas stretch", the training corresponding to "up", together with the required time "30 seconds" and the number of sets "3 sets".
 After the screen shown in FIG. 16, the display control unit 322 controls the display of the training screen according to a preset procedure, for example: (1) start of guidance (preparatory posture), (2) countdown into the posture, (3) holding the posture, and (4) countdown back to the preparatory posture. Steps (2) to (4) are repeated for the number of sets.
 During steps (2) to (4), the evaluation unit 318 evaluates the user's posture as described above, using the signals sensed by the six-axis sensor 206 mounted on the eyewear 30. The evaluation unit 318 may also calculate a score as an evaluation index of the posture.
 FIG. 17 is a diagram illustrating an example of the score display screen. In the example shown in FIG. 17, for the user's posture during the "iliopsoas stretch" training, the display control unit 322 displays on the screen the score "76" calculated by the score calculation unit 3187 as the score SC10.
 Next, when all the trainings shown in FIG. 15 have been performed, a training result screen is displayed. FIG. 18 is a diagram illustrating an example of the training result screen. On the screen shown in FIG. 18, a character is displayed and a score is shown for each part of the character. The score for each part is the average of the scores of all the trainings the user performed that are associated with that part. For example, if the user performs trainings A to C and the trainings related to the abdomen are A and B, the score displayed on the abdomen is the average of the scores of trainings A and B. The average score is calculated, for example, by the score calculation unit 3187. Since the scores of all the parts are displayed on a single screen, as shown in FIG. 18, the user can see which parts are not being trained properly.
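 The per-part averaging described above can be sketched as a small aggregation. The function and the part-to-training mapping below are illustrative assumptions, not the embodiment's implementation:

```python
# Hypothetical sketch of the per-part averaging shown on the result screen.
def part_scores(training_scores, part_map):
    """training_scores: {training: score}; part_map: {part: [trainings]}.
    Returns the average score per part over the performed trainings that
    are associated with that part."""
    result = {}
    for part, trainings in part_map.items():
        done = [training_scores[t] for t in trainings if t in training_scores]
        if done:
            result[part] = sum(done) / len(done)
    return result

scores = {"A": 80, "B": 60, "C": 90}
parts = {"abdomen": ["A", "B"], "back": ["C"]}
print(part_scores(scores, parts))  # {'abdomen': 70.0, 'back': 90.0}
```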
 FIG. 19 is a diagram illustrating an example of the application selection screen. The screen shown in FIG. 19 displays buttons for selecting one application from one or more applications; for example, a "RUN" button B30 and a "WALK" button B32, each indicating another application, are displayed.
 In the example shown in FIG. 19, suppose the user presses the "RUN" button B30. The accepting unit 312 detects the press of the button B30 and acquires, from the application corresponding to "RUN", a part ID indicating the part "back of thigh" using a URL scheme or the like.
 FIG. 20 is a diagram illustrating an example of the screen displayed when a predetermined part has been notified from another application. The screen shown in FIG. 20 indicates that "back of thigh" has been notified from the running application; the notified part is indicated on a character, and a "train" button B40 for starting training is displayed.
 Suppose the user presses the button B40. The accepting unit 312 notifies the specifying unit 314 of the part ID indicating the part "back of thigh". The specifying unit 314 acquires the part ID, refers to the training information shown in FIG. 9, and specifies the plurality of trainings corresponding to the part "back of thigh" indicated by the part ID.
 The generation unit 320 then, for example, extracts a predetermined number of trainings from those specified by the specifying unit 314 by random lottery using random numbers, and generates a training menu.
 FIG. 21 is a diagram illustrating an example of the training menu display screen. In the example shown in FIG. 21, the generation unit 320 has extracted six trainings: "lunge stretch" TR20, "hamstring stretch" TR22, "iliopsoas stretch" TR24, "plank position (on knees)" TR26, "rear-thigh squat" TR28, and "one-leg stand with one arm raised" TR30. The screen in FIG. 21 also shows that performing all six trainings takes 11 minutes.
 On the screen shown in FIG. 21, each training can likewise be selected as a button: when the user taps a training's display area, that training enters a selected state, and pressing the "start training" button in this state starts it.
 The generation unit 320 may also generate the training menu so that the types of training appear in a predetermined ratio. For example, when the ratio of stretch : strengthening : balance-linked is 3:2:1, the generation unit 320 extracts three trainings of the type "stretch" ("lunge stretch" TR20, "hamstring stretch" TR22, and "iliopsoas stretch" TR24), two of the type "strengthening" ("plank position (on knees)" TR26 and "rear-thigh squat" TR28), and one of the type "balance-linked" ("one-leg stand with one arm raised" TR30), and generates the menu automatically.
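 Ratio-constrained generation amounts to drawing a fixed count per type. The sketch below is an assumed implementation; the `generate_menu` function and the extra training names in the pool are illustrative, not from the embodiment.

```python
import random

# Hypothetical sketch of ratio-constrained menu generation (3:2:1 example).
def generate_menu(by_type, ratio, rng=random):
    """by_type: {type: [training names]}; ratio: {type: count}.
    Randomly draws `count` distinct trainings of each type."""
    menu = []
    for kind, count in ratio.items():
        menu.extend(rng.sample(by_type[kind], count))
    return menu

pool = {
    "stretch": ["lunge stretch", "hamstring stretch", "iliopsoas stretch", "standing side bend"],
    "strengthening": ["plank position (on knees)", "rear-thigh squat", "crunch"],
    "balance-linked": ["one-leg stand with one arm raised"],
}
menu = generate_menu(pool, {"stretch": 3, "strengthening": 2, "balance-linked": 1})
assert len(menu) == 6
```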
 When a training is selected and executed, the predetermined procedure is displayed on the screen as described above, and the user performs the training accordingly. During the training, the evaluation unit 318 evaluates the user's posture.
 <Operation>
 Next, the operation of the information processing apparatus 10 in the embodiment will be described. FIG. 22 is a flowchart illustrating an example of the overall processing of the application that executes trunk training in the embodiment. The processing in the flowchart of FIG. 22 is performed when the user, wearing the eyewear 30, operates the information processing apparatus 10 and runs the application, for example by touching the application's icon. The connection between the processing device 20 and the information processing apparatus 10 is assumed to have been established in advance.
 In step S102 shown in FIG. 22, the control unit 306 determines, based on a user operation, whether a training purpose or another application has been selected. For example, the control unit 306 determines whether the user has touched the "trunk training by purpose" button B10 or the "training from another application" button B12 shown in FIG. 13. If a purpose is selected, the processing proceeds to step S104; if another application is selected, the processing proceeds to step S108.
 In step S104, the display control unit 322 displays a plurality of purposes on the screen and lets the user select one. The accepting unit 312 accepts the selected purpose: for example, it detects which of the "train the basic trunk" button B20 and the "I want to improve my walking posture" button B22 shown in FIG. 14 was pressed, and acquires information indicating the purpose corresponding to the pressed button.
 In step S106, when the generation unit 320 acquires the information indicating the purpose from the accepting unit 312, it refers to the training information shown in FIG. 9 and generates a menu by combining a predetermined number of trainings from those associated with that purpose. At this time, the training menu screen shown in FIG. 15, for example, is displayed. The menu generation processing according to a purpose is described later with reference to FIG. 23.
 In step S108, the display control unit 322 displays a plurality of other applications on the screen and lets the user select one. The accepting unit 312 accepts the selected application: for example, it detects which of the "RUN" button B30 and the "WALK" button B32 shown in FIG. 19 was pressed, and acquires information indicating the application corresponding to the pressed button.
 In step S110, when the generation unit 320 acquires the information indicating the application from the accepting unit 312, it acquires information indicating a predetermined part from that application using a URL scheme or the like. At this time, the acquired part may be shown to the user, as in FIG. 20. The generation unit 320 then refers to the training information shown in FIG. 9 and generates a menu by combining a predetermined number of trainings from those associated with the part acquired from the other application. At this time, the training menu screen shown in FIG. 21, for example, is displayed. The menu generation processing according to a predetermined part is described later with reference to FIG. 24.
 In step S112, the control unit 306 determines whether training has been started, for example by determining whether the user has pressed the button for starting training displayed on the screen. If training has been started (step S112: YES), the processing proceeds to step S114; if not (step S112: NO), the processing returns to step S112.
 In step S114, the control unit 306 starts the training processing and evaluates the posture during training using the signals sensed by the six-axis sensor 206 mounted on the eyewear 30. This training processing is described later with reference to FIG. 25.
 In step S116, the display control unit 322 controls the display of the training result on the screen. At this time, the training result screen shown in FIG. 18, for example, is displayed.
 According to the above processing, when trunk-training processing is executed, the training menu is generated automatically and the training result is evaluated using the signals from the sensor worn by the user, so that an appropriate evaluation can be performed.
 FIG. 23 is a flowchart illustrating an example of the menu generation processing according to a purpose. In step S202 shown in FIG. 23, the specifying unit 314 refers to the training information shown in FIG. 9 and specifies one or more trainings based on the accepted purpose.
 In step S204, the first generation unit 3202 extracts trainings based on the specified training types so that each type appears in a predetermined ratio. For example, when the training types are A, B, and C, trainings are randomly extracted so that the ratio becomes 1:1:1. The predetermined ratio can be set as appropriate.
 In step S206, the first generation unit 3202 generates a menu by combining the extracted trainings. For example, the first generation unit 3202 may decide the order of the trainings according to predetermined criteria, such as the required time of each training or not allowing the same type of training to occur consecutively.
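 One of the ordering criteria mentioned above, not letting the same type of training occur consecutively, can be sketched as a greedy pass. The function and the sample menu below are illustrative assumptions, not the embodiment's implementation:

```python
# Hypothetical sketch of ordering a menu so the same type does not repeat
# consecutively when that can be avoided.
def order_menu(trainings):
    """trainings: list of (name, type). At each step, greedily pick an item
    whose type differs from the previously placed one when possible."""
    remaining = list(trainings)
    ordered = []
    while remaining:
        pick = next((t for t in remaining if not ordered or t[1] != ordered[-1][1]),
                    remaining[0])
        remaining.remove(pick)
        ordered.append(pick)
    return ordered

menu = [("plank", "strengthening"), ("crunch", "strengthening"),
        ("lunge stretch", "stretch"), ("hamstring stretch", "stretch")]
ordered = order_menu(menu)  # types alternate: strengthening, stretch, ...
```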
 以上の処理によれば、ユーザは、自身が体幹を鍛える目的を選択するだけで、その目的に合ったトレーニングメニューを自動で生成することができる。また、同じ目的が繰り返し選択されても、トレーニングはランダム抽出されるため、毎回同じトレーニングメニューではなく、ユーザに飽きさせずにトレーニングを継続させることができる。 According to the above processing, the user can automatically generate a training menu suitable for the purpose only by selecting the purpose of training the trunk. Even if the same purpose is selected repeatedly, the training is randomly extracted, so that the training can be continued without getting tired of the user instead of the same training menu every time.
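The extraction and ordering of steps S204 and S206 can be sketched as follows. This is a minimal illustration, not the patented implementation: the training names, the catalogue structure, and the 1:1:1 per-type count are all illustrative assumptions standing in for the training information of FIG. 9.

```python
import random

# Hypothetical training catalogue: name -> training type (A/B/C).
TRAININGS = {
    "plank": "A", "side plank": "A",
    "dead bug": "B", "bird dog": "B",
    "hip lift": "C", "v-sit": "C",
}

def generate_menu(per_type=1, types=("A", "B", "C"), seed=None):
    """Randomly pick `per_type` trainings of each type (a 1:1:1 ratio
    when per_type is equal across types), then order them so the same
    type does not appear twice in a row where avoidable."""
    rng = random.Random(seed)
    picked = []
    for t in types:
        pool = [name for name, ty in TRAININGS.items() if ty == t]
        picked.extend((name, t) for name in rng.sample(pool, per_type))
    rng.shuffle(picked)
    # Ordering pass: prefer a next training whose type differs from the last.
    menu = []
    while picked:
        nxt = next((p for p in picked if not menu or p[1] != menu[-1][1]),
                   picked[0])
        picked.remove(nxt)
        menu.append(nxt)
    return [name for name, _ in menu]
```

Because the pick within each type is random, repeated calls with the same purpose yield varying menus, matching the behaviour described above.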
 FIG. 24 is a flowchart showing an example of the process of generating a menu according to a predetermined body part. In step S302 shown in FIG. 24, the reception unit 312 acquires information indicating a predetermined part from another application, using a URL scheme or the like.
 In step S314, the identifying unit 314 refers to the training information shown in FIG. 9 and identifies one or more trainings based on the accepted predetermined part. For example, the identifying unit 314 identifies the trainings associated with the predetermined part as being effective for it.
 In step S204, the second generation unit 3204 randomly extracts a predetermined number of trainings from among the identified trainings and combines them to generate a menu. For example, the second generation unit 3204 may determine the order of the trainings according to predetermined criteria, such as the time each training requires, or that the same training type should not occur consecutively.
 According to the above processing, a training menu suited to the predetermined part notified by another application can be generated automatically. Furthermore, even when the same part is notified repeatedly, the trainings are extracted randomly, so the menu is not identical every time, allowing the user to continue training without becoming bored. Note that, besides being notified by another application, the process shown in FIG. 24 may also be executed in response to the user inputting a part that he or she wants to train.
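The part-based selection described above can be sketched as a filter followed by a random sample. The training-to-part mapping below is a hypothetical stand-in for the training information of FIG. 9; the part names and the default count of two are illustrative assumptions.

```python
import random

# Hypothetical mapping of trainings to the body parts they target.
TRAINING_PARTS = {
    "plank": {"abdomen", "back"},
    "side plank": {"obliques"},
    "hip lift": {"hips", "back"},
    "bird dog": {"back", "hips"},
    "dead bug": {"abdomen"},
}

def menu_for_part(part, count=2, seed=None):
    """Randomly pick `count` trainings from those associated with the
    requested part, as in steps S314 and S204 of FIG. 24."""
    rng = random.Random(seed)
    candidates = [n for n, parts in TRAINING_PARTS.items() if part in parts]
    if not candidates:
        raise ValueError(f"no training targets part: {part}")
    return rng.sample(candidates, min(count, len(candidates)))
```

Whether the part comes from another application via a URL scheme or from direct user input, the same selection routine can be reused.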
 FIG. 25 is a flowchart showing an example of the training process. In step S402 shown in FIG. 25, the display control unit 322 performs control so that a screen prompting the user to take the posture before training (also referred to as the preparation posture) is displayed. At this time, the evaluation unit 318 may also evaluate the preparation posture based on the signals from the sensor.
 In step S404, the display control unit 322 performs control so that a screen prompting the user to maintain the posture during training (also referred to as the stationary posture) is displayed. At this time, the evaluation unit 318 evaluates the stationary posture based on the signals from the sensor.
 In step S406, the score calculation unit 3187 scores the posture during the training using the threshold associated with that training, and calculates a score. In addition, by using only the signals relevant to that training, processing of signals with low relevance can be omitted, which contributes to power saving.
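The per-training signal selection mentioned above can be sketched as a simple lookup table. The axis names and the table contents are illustrative assumptions; the point is only that components not listed for a training are never processed, saving computation and power.

```python
# Hypothetical table mapping each training to the sensor components
# that its evaluation actually uses.
RELEVANT_SIGNALS = {
    "plank": ("gyro_pitch", "gyro_roll"),
    "side plank": ("gyro_roll", "accel_y"),
}

def select_signals(training, sample):
    """Keep only the components of a 6-axis sample that the given
    training's evaluation uses; everything else is dropped unread."""
    keys = RELEVANT_SIGNALS.get(training, ())
    return {k: sample[k] for k in keys if k in sample}
```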
 In step S408, the control unit 306 determines whether all sets have been completed. If all sets have been completed (step S408: YES), the process proceeds to step S410; if not (step S408: NO), the process returns to step S402.
 In step S410, the score calculation unit 3187 calculates the average score by summing the scores of the sets and dividing by the number of sets. The calculated average score is displayed on the screen by the display control unit 322.
 According to the above processing, an appropriate evaluation can be performed for training in which a predetermined posture is maintained for a predetermined time.
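The threshold-based scoring and the per-set averaging can be sketched as below. The 100-point scale and the one-point deduction per violation are assumptions for illustration; the patent leaves the concrete scoring rule open. Stability is judged, as in the claims, by whether the absolute signal value stays under the training's threshold, optionally also checking the peak-to-peak swing against a peak threshold.

```python
def score_set(signal_values, threshold, peak_threshold=None):
    """Score one set: start from 100 and deduct a point each time
    |value| >= threshold; optionally penalise a peak-to-peak swing
    that reaches peak_threshold. Scale and deduction are assumptions."""
    violations = sum(1 for v in signal_values if abs(v) >= threshold)
    if peak_threshold is not None:
        swing = max(signal_values) - min(signal_values)
        if abs(swing) >= peak_threshold:
            violations += 1
    return max(0, 100 - violations)

def average_score(sets, threshold, peak_threshold=None):
    """Average of the per-set scores, as in step S410."""
    scores = [score_set(s, threshold, peak_threshold) for s in sets]
    return sum(scores) / len(scores)
```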
 Each of the processing steps included in the processing flows described with reference to FIGS. 22 to 24 can be executed in an arbitrarily changed order, or in parallel, as long as no contradiction arises in the processing content, and other steps may be added between the processing steps. Further, a step described as a single step for convenience can be executed divided into a plurality of steps, while steps described as a plurality of steps for convenience can be regarded as a single step.
 As described above, according to the embodiments, the posture of a user during training in which a predetermined posture is maintained for a predetermined time can be evaluated appropriately. In addition, according to the embodiments, by changing the signal values used for each training, an evaluation suited to that training can be performed, and by omitting unnecessary processing, power saving can be achieved. Furthermore, according to the embodiments, by automatically generating a training menu based on a predetermined purpose, body part, or the like, trainings suited to the user can be provided.
 In the embodiments, the case where the eyewear 30 is a pair of glasses has been described. However, the eyewear is not limited to this. The eyewear may be any appliance related to the eyes, and may be a face-worn or head-worn device such as glasses, sunglasses, goggles, or a head-mounted display, or the frames thereof.
 In the embodiments, it was explained that the eyewear 30 may be provided with bioelectrodes; eye movement and blinks may be detected based on the electrooculography signals obtainable from these bioelectrodes. In that case, the data obtainable from the 6-axis sensor 206 may be stored in association with the eye movements and blinks, making it possible to analyze blinks and gaze movement during exercise.
 In the embodiments, the description used detection data from the 6-axis sensor 206 mounted on the eyewear 30, but the applications described in the embodiments can also be executed using detection data from the 6-axis sensor 111 mounted on the information processing apparatus 10. That is, the 6-axis sensor need not be worn on the head; it may be worn at any position on the human body, although it is preferably worn on the trunk.
 Moreover, in the embodiments, the user need not wear many sensors; the embodiments can be implemented with at least one 6-axis sensor. Therefore, the posture during trunk training can be evaluated without burdening the user with the trouble of wearing sensors.
 Further, although the embodiments have been described using the example of applying the above application to trunk training, they are also applicable to cases where the effective sensor signals differ for each training. Likewise, the generation of training menus is applicable, beyond trunk training, to trainings that come in a plurality of types or that are effective for different body parts.
 Although the present invention has been described using embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes or improvements can be made to the above embodiments, and it is clear from the claims that embodiments incorporating such changes or improvements can also fall within the technical scope of the present invention.
10 Information processing apparatus
20 Processing apparatus
30 Eyewear
302 Storage unit
304 Communication unit
306 Control unit
312 Reception unit
314 Identifying unit
316 Acquisition unit
318 Evaluation unit
320 Generation unit
322 Display control unit

Claims (11)

  1.  An information processing method executed by a computer having a control unit and a storage unit, the method comprising, by the control unit:
     receiving information indicating a predetermined training relating to the trunk;
     identifying a threshold for the predetermined training based on threshold information stored in the storage unit, in which a threshold is associated with each training relating to the trunk;
     acquiring a signal value detected by a sensor worn by a user to be evaluated; and
     evaluating a posture of the user during the predetermined training based on a result of comparing the signal value with the threshold.
  2.  The information processing method according to claim 1, wherein
     the identifying identifies, for the predetermined training, the signals to be compared and their thresholds, based on the threshold information in which one or more signals to be compared and the thresholds of those signals are associated with each training, and
     the acquiring acquires, from the sensor, the signal values of the signals to be compared.
  3.  The information processing method according to claim 2, wherein
     the evaluating evaluates the stability of the posture based on whether an absolute value of the signal value is less than the threshold.
  4.  The information processing method according to claim 2, wherein
     the identifying further identifies, for the predetermined training, a peak threshold of the signal to be compared, based on the threshold information in which a peak threshold relating to the signal is further associated with each training,
     the acquiring includes calculating a difference value between peaks of the signal to be compared, and
     the evaluating evaluates the stability of the posture further based on whether an absolute value of the difference value is less than the peak threshold.
  5.  The information processing method according to claim 4, wherein
     the evaluating includes calculating a score indicating the stability of the posture according to the number of times the absolute value of the signal value or the absolute value of the difference value is equal to or greater than the corresponding threshold.
  6.  The information processing method according to claim 5, wherein
     the calculating of the score reflects, in the score, a result of determining whether the user is performing the predetermined training.
  7.  The information processing method according to claim 1, wherein the control unit further executes:
     identifying one or more trainings for a predetermined attribute based on training information stored in the storage unit, in which one or more attributes are associated with each training; and
     generating a training menu by combining a predetermined number of trainings from among the identified one or more trainings.
  8.  The information processing method according to claim 7, wherein
     the receiving receives information indicating a predetermined purpose relating to the training,
     the identifying of the trainings identifies a plurality of trainings for the predetermined purpose based on the training information in which a purpose and a type relating to the training are associated as the attributes, and
     the generating includes extracting trainings of each type from the identified plurality of trainings so that the types appear at a predetermined ratio, and combining the extracted trainings of each type to generate the training menu.
  9.  The information processing method according to claim 7, wherein
     the receiving receives information indicating a predetermined part of the user's body,
     the identifying of the trainings identifies a plurality of trainings for the predetermined part based on the training information in which one or more parts of the user's body are associated as the attributes, and
     the generating generates the training menu by combining a predetermined number of trainings from among the identified plurality of trainings.
  10.  An information processing apparatus comprising:
     a reception unit that receives information indicating a predetermined training relating to the trunk;
     a storage unit that stores threshold information in which a threshold is associated with each training relating to the trunk;
     an identifying unit that identifies a threshold for the predetermined training based on the threshold information;
     an acquisition unit that acquires a signal value detected by a sensor worn by a user to be evaluated; and
     an evaluation unit that evaluates a posture of the user during the predetermined training based on a result of comparing the signal value with the threshold.
  11.  A program for causing a computer to execute:
     receiving information indicating a predetermined training relating to the trunk;
     identifying a threshold for the predetermined training based on threshold information in which a threshold is associated with each training relating to the trunk;
     acquiring a signal value detected by a sensor worn by a user to be evaluated; and
     evaluating a posture of the user during the predetermined training based on a result of comparing the signal value with the threshold.




PCT/JP2016/052397 2016-01-27 2016-01-27 Information processing method, information processing device, and program WO2017130339A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2016/052397 WO2017130339A1 (en) 2016-01-27 2016-01-27 Information processing method, information processing device, and program
JP2017563464A JP6689889B2 (en) 2016-01-27 2016-01-27 Information processing method, information processing apparatus, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/052397 WO2017130339A1 (en) 2016-01-27 2016-01-27 Information processing method, information processing device, and program

Publications (1)

Publication Number Publication Date
WO2017130339A1 true WO2017130339A1 (en) 2017-08-03

Family

ID=59397684

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/052397 WO2017130339A1 (en) 2016-01-27 2016-01-27 Information processing method, information processing device, and program

Country Status (2)

Country Link
JP (1) JP6689889B2 (en)
WO (1) WO2017130339A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009201681A (en) * 2008-02-27 2009-09-10 Xing Inc Exercise supporting apparatus, exercise supporting method, and computer program
JP2010005033A (en) * 2008-06-25 2010-01-14 Panasonic Electric Works Co Ltd Walking motion analyzer
WO2014091583A1 (en) * 2012-12-12 2014-06-19 富士通株式会社 Acceleration sensor output processing program, processing method, and processing device, and gait assessment program
US20140174174A1 (en) * 2012-12-19 2014-06-26 Alert Core, Inc. System, apparatus, and method for promoting usage of core muscles and other applications
JP2015002964A (en) * 2013-05-20 2015-01-08 セイコーインスツル株式会社 Exercise form analyzer and exercise form analysis method
JP2015160049A (en) * 2014-02-28 2015-09-07 株式会社東芝 Ultrasonic diagnostic device and ultrasonic diagnostic device user's posture evaluation notifying method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010175754A (en) * 2009-01-28 2010-08-12 Yamaha Corp Attitude evaluating device, attitude evaluating system and program
US9283429B2 (en) * 2010-11-05 2016-03-15 Nike, Inc. Method and system for automated personal training
US9452341B2 (en) * 2012-02-29 2016-09-27 Mizuno Corporation Running form diagnosis system and method for scoring running form
US10318708B2 (en) * 2013-03-14 2019-06-11 Nike, Inc. System and method for monitoring athletic activity from multiple body locations


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6811349B1 (en) * 2020-03-31 2021-01-13 株式会社三菱ケミカルホールディングス Information processing equipment, methods, programs
JP2021159313A (en) * 2020-03-31 2021-10-11 株式会社三菱ケミカルホールディングス Information processing apparatus, method and program
JP7150387B1 (en) 2022-01-13 2022-10-11 三菱ケミカルグループ株式会社 Programs, methods and electronics
JP2023102856A (en) * 2022-01-13 2023-07-26 三菱ケミカルグループ株式会社 Program, method and electronic apparatus

Also Published As

Publication number Publication date
JP6689889B2 (en) 2020-04-28
JPWO2017130339A1 (en) 2018-11-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16887926

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2017563464

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16887926

Country of ref document: EP

Kind code of ref document: A1