US20140073481A1 - Exercise support apparatus, exercise support method and exercise support program - Google Patents
- Publication number
- US20140073481A1 (application number US 14/021,885)
- Authority
- US
- United States
- Prior art keywords
- user
- section
- display
- virtual person
- exercise support
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0084—Exercising apparatus with means for competitions, e.g. virtual races
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/744—Displaying an avatar, e.g. an animated cartoon character
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
- G09B19/0038—Sports
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the present invention relates to an exercise support apparatus, an exercise support method and an exercise support program. Specifically, the present invention relates to an exercise support apparatus, an exercise support method and an exercise support program that can be applied to exercises such as walking or running.
- a device that provides information serving as a pacemaker to a user has been known.
- the information provided by this device is, for example, merely numerical value information such as a pitch (a footstep count) or a running speed.
- in displaying the numerical value information as a pacemaker, all such a device does is change the display format or the display method.
- the present invention can advantageously provide an exercise support apparatus, an exercise support method and an exercise support program that contribute to achievement of a high exercise effect and an excellent record in an exercise such as moving, by appropriately guiding a user like an actual pacemaker.
- an exercise support apparatus comprising: a sensor section which outputs motion data corresponding to a motion status of a user performing an exercise by moving; a motion information obtaining section which obtains motion information of the user based on the motion data; an image generating section which generates a moving image of a virtual person in a moving state, and sets a way of movements of feet of the virtual person to a way of movements corresponding to the obtained motion information of the user; and a display section which displays the moving image on a part of a display area arranged in a viewing field of the user.
- an exercise support method for an exercise support apparatus including a display section having a display area that is arranged in a viewing field of a user, comprising: a step of obtaining motion information of the user based on motion data corresponding to a motion status of the user performing an exercise by moving; a step of generating a moving image of a virtual person in a moving state; a step of setting a way of movements of feet of the virtual person to a way of movements corresponding to the obtained motion information of the user; and a step of displaying the moving image on a part of the display area of the display section.
- a non-transitory computer-readable storage medium having stored thereon an exercise support program that is executable by a computer in an exercise support apparatus including a display section having a display area that is arranged in a viewing field of a user, the program being executable by the computer to perform functions comprising: processing for obtaining motion information of the user based on motion data corresponding to a motion status of the user performing an exercise by moving; processing for generating a moving image of a virtual person in a moving state; processing for setting a way of movements of feet of the virtual person to a way of movements corresponding to the obtained motion information of the user; and processing for displaying the moving image on a part of the display area of the display section.
- FIG. 1A , FIG. 1B , and FIG. 1C are schematic structural views of an exercise support apparatus according to a first embodiment of the present invention
- FIG. 2 is a block diagram showing an example of structure of display glasses applied to the exercise support apparatus according to the first embodiment
- FIG. 3A , FIG. 3B , and FIG. 3C are schematic diagrams each depicting an example of a method of displaying a virtual person applied in an exercise support method according to the first embodiment
- FIG. 4 is a flowchart depicting a first example of a normal mode applied to the exercise support method according to the first embodiment
- FIG. 5 is a flowchart depicting a second example of a normal mode applied to the exercise support method according to the first embodiment
- FIG. 6 is a flowchart of an example depicting a long-distance running mode applied to the exercise support method according to the first embodiment
- FIG. 7 is a flowchart of an example of a pace set mode applied to the exercise support method according to the first embodiment
- FIG. 8 is a flowchart of an example of a build-up mode applied to the exercise support method according to the first embodiment
- FIG. 9A , FIG. 9B , and FIG. 9C are schematic structural views of an exercise support apparatus according to a second embodiment of the present invention.
- FIG. 10A and FIG. 10B are block diagrams showing an example of structure of a device applied to the exercise support apparatus according to the second embodiment.
- FIG. 11A and FIG. 11B are block diagrams showing an example of structure of a device applied to an exercise support apparatus according to a third embodiment.
- the present invention is not limited thereto, and can be applied to any other exercise such as walking.
- FIG. 1A , FIG. 1B , and FIG. 1C are schematic structural views of an exercise support apparatus according to a first embodiment of the present invention.
- FIG. 2 is a block diagram of an example of structure of display glasses applied to the exercise support apparatus according to the first embodiment.
- the exercise support apparatus has, for example, display glasses (a head-mount display) 100 mounted on the head part of a user US who is a runner, as schematically depicted in FIG. 1A .
- the display glasses 100 have an outer appearance of, for example, eyeglasses-type or a goggles-type, as depicted in FIG. 1B and FIG. 1C .
- the display glasses 100 mainly includes a main body 101 having a transparent-type display section 110 which is positioned in an area immediately in front of the eyes of the user US and in which the lenses of eyeglasses or goggles are positioned, and temples for mounting the display glasses 100 on the head part; and a display control section 102 which performs transparent display or projection display of a desired image and various exercise information on a part of the display section 110 of the main body 101 to visually provide the image or information to the user US.
- the display glasses 100 may have a structure in which the display control section 102 is integrally provided inside the main body 101 , as depicted in FIG. 1B , or a structure in which the display control section 102 including an image projecting apparatus 103 is additionally assembled on commercially-available sports glasses, sunglasses or eyeglasses, as depicted in FIG. 1C .
- the display glasses 100 mainly include, for example, the display section 110 , a motion sensor section 120 , an operation section 130 , a central computation circuit (hereinafter referred to as a “CPU”) 140 , a memory 150 , and an operation power supply 160 , as depicted in FIG. 2 .
- the display control section 102 is provided with components of the display glasses 100 other than the display section 110 , that is, the motion sensor section 120 , the operation section 130 , the CPU 140 , the memory 150 , and the operation power supply 160 .
- the display section 110 can have a transparent-type display panel in place of the lens of the eyeglasses or goggles.
- a transparent-type liquid-crystal display panel or an organic EL display panel capable of color or monochrome display can be applied.
- the display section 110 is structured to perform transparent display directly, or to perform projection display by the image projecting apparatus 103 onto a transparent glass, transparent resin, or the like of a lens of the sports glasses immediately in front of the eyes of the user US.
- a desired image and exercise information are displayed in the viewing field of the user US in a manner to be superimposed on the surrounding view.
- a moving image or a still image of a virtual person serving as a pacemaker dedicated for the user US is displayed, which is generated based on the exercise support method described further below.
- on the display section 110 , in addition to the image of the virtual person, for example, numerical information and character information regarding the exercise performed by the user US (for example, pitch (footstep count), running speed, run distance, and calorie consumption amount) are displayed as exercise information.
- the image and exercise information may be displayed simultaneously on the display section 110 , or either the image or one or more items of exercise information may be displayed by operating the operation section 130 , which will be described further below.
- the motion sensor section 120 has an acceleration sensor 121 , a gyro sensor (angular velocity sensor) 122 , and a GPS (Global Positioning System) reception circuit 123 , for example, as depicted in FIG. 2 .
- the acceleration sensor 121 detects an acceleration corresponding to the change ratio of the motion speed of the user US during running, and outputs acceleration data corresponding to the acceleration. Then, based on this acceleration data outputted from the acceleration sensor 121 , relative changes of the pitch (footstep count per second) and the running speed (pace) of the user US are obtained.
- the gyro sensor (angular velocity sensor) 122 detects an angular velocity corresponding to a change in the moving direction of the user US during an exercise and outputs angular velocity data corresponding to the angular velocity. Then, based on this angular velocity data outputted from the gyro sensor 122 and a change tendency of the acceleration data and a waveform peak frequency outputted from the acceleration sensor 121 described above, absolute values of the pitch and the running speed at the time of running are obtained.
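As one illustration of obtaining the pitch from the peaks of the acceleration waveform, the footstep count can be estimated by counting threshold crossings of the vertical acceleration. This is a minimal sketch; the sampling rate and threshold value are assumptions for illustration, not values from this disclosure:

```python
def estimate_pitch(accel, sample_rate_hz, threshold=1.5):
    """Estimate pitch (footsteps per second) by counting peaks in the
    vertical acceleration waveform. The threshold (in g) is an assumed value."""
    steps = 0
    above = False
    for a in accel:
        if not above and a > threshold:
            steps += 1          # rising edge over the threshold = one footstep
            above = True
        elif above and a < threshold:
            above = False
    duration_s = len(accel) / sample_rate_hz
    return steps / duration_s

# Example: a synthetic waveform with two footsteps per second, sampled at 10 Hz
wave = [0.5, 2.0, 0.5, 0.5, 0.5, 0.5, 2.0, 0.5, 0.5, 0.5] * 3
print(estimate_pitch(wave, 10))  # 2.0 steps per second
```

In practice the landing/takeoff timing would be taken from the positions of these crossings rather than only their count, but the peak-counting idea is the same.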
- the GPS reception circuit 123 is a position sensor which receives electric waves from a plurality of GPS satellites to detect a (geographical) position composed of latitude and longitude and outputs position data corresponding to the position. Based on the position data outputted from the GPS reception circuit 123 , the moving distance (that is, run distance) of the user US can be obtained.
- the acceleration data outputted from the acceleration sensor 121 , the angular velocity data outputted from the gyro sensor 122 , and the position data outputted from the GPS reception circuit 123 are collectively referred to as sensor data (motion data).
- the motion sensor section 120 has, as a sensor for obtaining the pitch of the user US, at least the acceleration sensor 121 and the gyro sensor 122 depicted in FIG. 2 , or, in another structure, a pressure sensor or the like.
- the motion sensor section 120 has at least one of the sensor group constituted by the acceleration sensor 121 and the gyro sensor 122 depicted in FIG. 2 and the GPS reception circuit 123 , as a sensor for obtaining the running speed of the user US.
- the motion sensor section 120 is required to include the acceleration sensor 121 and the gyro sensor 122 described above, and may not be structured to include the GPS reception circuit 123 .
- the moving distance data and the moving speed data obtained based on the position data outputted from the GPS reception circuit 123 may be used together or complementarily with the run distance and the running speed obtained based on the acceleration data and the angular velocity data outputted from the acceleration sensor 121 and the gyro sensor 122 described above so as to increase the accuracy of the run distance and the running speed of the user US.
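The complementary use of the two speed estimates can be sketched as a weighted blend with a fallback when no GPS fix is available. The fixed weight below is an illustrative assumption, since the disclosure does not specify a fusion method:

```python
def fuse_speed(sensor_speed, gps_speed, gps_weight=0.7):
    """Blend the running speed derived from acceleration/angular-velocity
    data with the moving speed derived from GPS position data.
    gps_weight is an assumed fixed weighting, not a value from the disclosure."""
    if gps_speed is None:          # no GPS fix: fall back to the inertial estimate
        return sensor_speed
    return gps_weight * gps_speed + (1.0 - gps_weight) * sensor_speed

print(fuse_speed(3.0, 3.4))   # 0.7*3.4 + 0.3*3.0 ≈ 3.28 m/s
print(fuse_speed(3.0, None))  # 3.0 m/s when GPS is unavailable
```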
- pitch, running speed, and run distance are then associated with each other for each running time, and are each stored in a predetermined storage area of the memory 150 described below.
- the operation section 130 has at least a power supply switch, and controls supply (power supply ON) and shutoff (power supply OFF) of driving power from the operation power supply 160 described below to each component inside the display glasses 100 .
- the operation section 130 is used for setting display of the exercise information on the display section 110 described above, selection of a motion (training) mode in the exercise support method described below, selection of an image design (for example, body-build, gender, or costume) of the virtual person in the motion mode, inputs of numerical value conditions, a display position of the virtual person on the display section 110 (for example, whether to display the virtual person in a left viewing field or a right viewing field), a pause of the motion of the virtual person, etc.
- the memory 150 has a non-volatile memory, and stores the sensor data outputted from the motion sensor section 120 described above, such as the acceleration data, the angular velocity data, and the position data, and the motion information including the pitch, the running speed, the run distance, etc., at the time of running obtained based on these sensor data, in association with each other for each running time.
- in the non-volatile memory part of the memory 150 , various data and information generated or referred to by the exercise support method described below are stored.
- the memory 150 may include a Read Only Memory (ROM) having stored therein control programs (software) for achieving predetermined functions of the display section 110 , the motion sensor section 120 , the CPU 140 , and the memory 150 .
- the non-volatile memory part forming the memory 150 may have a removable storage medium such as a memory card, and may be structured to be removable from the display glasses 100 .
- the CPU 140 has a clock function and performs processing by following a predetermined program to control the operations of the display section 110 , the motion sensor section 120 , and the memory 150 and achieve predetermined functions.
- the control program may be stored in the memory 150 described above or may be incorporated in advance in the CPU 140 .
- the CPU 140 mainly includes a sensor data obtaining section 141 , a motion information obtaining section 142 , an image generating section 143 , and a display driving section 144 , as depicted in FIG. 2 .
- the sensor data obtaining section 141 obtains acceleration data of the user US during running from the acceleration sensor 121 of the motion sensor section 120 described above.
- the motion information obtaining section 142 detects the pitch of the user US, and landing and takeoff (landing/takeoff) timing of the feet of the user US based on the acceleration data obtained by the sensor data obtaining section 141 .
- the image generating section 143 generates a moving image of the virtual person based on the pitch and the landing/takeoff timing obtained by the motion information obtaining section 142 .
- the image generating section 143 sets a replay speed, a replay method, a display size, etc., of the moving image of the virtual person such that they correspond to the obtained pitch and landing/takeoff timing.
- the display driving section 144 causes the image of the virtual person generated by the image generating section 143 to be displayed on a partial area of the display section 110 of the display glasses 100 .
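Setting the replay speed of the moving image to correspond to the obtained pitch can be illustrated by scaling the animation frame rate by the ratio of the user's pitch to a base pitch. The base pitch and base frame rate here are hypothetical values introduced only for this sketch:

```python
BASE_PITCH = 3.0   # hypothetical pitch (footsteps/s) the stride animation was authored at
BASE_FPS = 30.0    # hypothetical base frame rate of the stride animation

def replay_speed(user_pitch):
    """Scale the virtual person's animation frame rate so that one animated
    stride cycle matches one stride cycle of the user."""
    return BASE_FPS * (user_pitch / BASE_PITCH)

print(replay_speed(3.0))  # 30.0 fps: user matches the authored pitch
print(replay_speed(2.4))  # ~24 fps: slower pitch, slower foot movements
```

Matching the landing/takeoff timing would additionally shift the animation phase so that the virtual person's foot lands when the user's does; only the rate scaling is shown here.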
- the operation power supply 160 supplies driving electric power to each component of the display glasses 100 .
- a primary battery such as a commercially-available coin-shaped battery or button-shaped battery or a secondary battery such as a lithium-ion battery or a nickel-metal-hydride battery can be applied.
- alternatively, a power supply using energy harvesting technology, which generates electricity from energy such as vibrations, light, heat, or electromagnetic waves, can be applied.
- FIG. 3A , FIG. 3B , and FIG. 3C are schematic diagrams each depicting an example of the method of displaying the virtual person applied in the exercise support method according to the present embodiment.
- the actual view VIEW surrounding the user US as depicted in FIG. 3A is first recognized through the transparent-type display panel or transparent glasses of the display section 110 .
- the motion mode set by the user US and a moving image or a still image of a virtual person VR serving as a pacemaker dedicated for the user US, which is generated based on the current exercise status of the user US, are displayed in a predetermined display format in a predetermined partial area of the display section 110 .
- the exercise support apparatus is constituted by the display glasses 100 that is a single device, and the display section 110 has a transparent display area such as a transparent-type display panel or a transparent glass.
- the surrounding view VIEW can be viewed by the user US through the display section 110 with a simple head mounting method. Also, the virtual person VR is displayed on a part of the display area.
- the view VIEW transmitted through the display section 110 and the image of the virtual person VR displayed on the display section 110 are viewed in a manner to be superposed with each other, as depicted in FIG. 3C .
- the user US recognizes as if the virtual person VR serving as the pacemaker dedicated for the user US is running in front of the user US in the front view VIEW.
- a display state is used as a reference in which the virtual person VR displayed on the display section 110 is running with the same pitch as that of the user US who is a runner so as to keep a predetermined positional relation (clearance) with the user US. If the running speed of the user US has reduced with reference to this state, the display state is changed such that the virtual person VR moves (proceeds) toward the front and is positioned away from the user US.
- the user US is caused to recognize that the virtual person VR is moving away from the user US, becomes aware that he or she needs to increase the pace to catch up with the virtual person VR, and is thereby prompted to increase the pace.
- the user US can be drawn (guided) by the virtual person VR to run fast, and thereby supported (assisted) to improve his or her physical capability.
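The guidance behavior described above can be sketched as updating the virtual person's apparent gap ahead of the user from the difference between a reference speed and the user's current running speed. The reference clearance and speeds below are illustrative assumptions:

```python
def update_gap(gap_m, reference_speed, user_speed, dt_s):
    """If the user falls below the reference speed, the virtual person pulls
    ahead (gap grows); if the user speeds up, the gap closes back toward the
    reference clearance. Speeds in m/s, gap in metres."""
    return max(0.0, gap_m + (reference_speed - user_speed) * dt_s)

gap = 2.0                               # assumed reference clearance in metres
gap = update_gap(gap, 3.0, 2.5, 1.0)    # user slowed: gap grows to 2.5 m
gap = update_gap(gap, 3.0, 3.5, 1.0)    # user sped up: gap shrinks back to 2.0 m
print(gap)
```

On the display, a larger gap would be rendered as a smaller, higher-placed virtual person; the mapping from gap to display size and position is not shown here.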
- in the following, in which state the virtual person VR displayed on the display section 110 is changed, how it is changed, and how it is returned to its original state are described by specifically presenting various motion modes (training modes) set in the display glasses 100 .
- the image of the virtual person VR which is displayed on the display section 110 of the display glasses 100 , is not particularly limited to that shown in FIG. 3B to FIG. 3C .
- the image design can be changed or another image design can be selected by the user US operating the operation section 130 or by one of the motion modes in the exercise support method described below being selected and set (that is, by a program).
- FIG. 3B the case is depicted where the image of the virtual person is displayed on the right side or the left side (one side) of the viewing field of the display section 110 of the display glasses 100 .
- a configuration may be adopted in which, by operating this display position with the operation section 130 , the user US can select whether the image of the virtual person is displayed on the right side of the viewing field or the left side of the viewing field, and can move and adjust the display position in any of the leftward, rightward, upward and downward directions in the display section 110 .
- the display glasses applied to the exercise support apparatus have a plurality of motion modes (training modes), and one of the motion modes is selected and set by the user US operating the operation section 130 .
- for each motion mode, a replay speed, a replay method, and a display format such as the display size of the image of the virtual person VR displayed on the display section 110 described above are individually set.
- a series of operations according to the exercise support method is achieved by the CPU 140 of the display glasses 100 described above by following a predetermined control program.
- a normal mode and a long-distance running mode can be selected, set, and performed, which will be described further below.
- when the motion sensor section 120 has at least the acceleration sensor 121 and the gyro sensor 122 , all motion modes described below can be selected, set, and performed.
- FIG. 4 is a flowchart of a first example of a normal mode applied to the exercise support method according to the present embodiment.
- the normal mode applied to the present embodiment is achieved by the user US operating the operation section 130 of the display glasses 100 mounted on the head part to select and set the normal mode as a motion mode, whereby the CPU 140 calls a program module of the control program regarding the normal mode to perform processing.
- the CPU 140 first starts an operation of controlling the motion sensor section 120 to detect at least the acceleration of the user US during an exercise (during running) by the acceleration sensor 121 and output acceleration data.
- the CPU 140 causes the sensor data obtaining section 141 to obtain the acceleration data from the acceleration sensor 121 and causes the motion information obtaining section 142 to obtain motion information such as the pitch and the landing/takeoff timing of the feet of the user US based on the acceleration data, as depicted in FIG. 4 (Step S 111 ).
- the CPU 140 causes the image generating section 143 to match the landing/takeoff timing of the feet of the virtual person with the current landing/takeoff timing of the feet of the user US based on the obtained pitch and landing/takeoff timing of the feet and also match the pitch of the virtual person VR with the current pitch of the user US.
- a moving image (a pitch-synchronized image) of the virtual person VR in synchronization with the motion status of the user US is generated (Step S 112 ).
- the CPU 140 causes the display driving section 144 to cause the generated moving image of the virtual person VR to be displayed in a predetermined area of the display section 110 (Step S 113 ).
- the CPU 140 then performs this series of processing at predetermined time intervals or repeatedly at all times.
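Steps S111 to S113 can be sketched as a single processing pass, with hypothetical stand-in callables taking the roles of the sensor data obtaining, motion information obtaining, image generating, and display driving sections:

```python
def normal_mode_step(get_accel_data, get_motion_info, render_runner, display):
    """One pass of the normal mode: obtain acceleration data and motion
    information (S111), generate a pitch-synchronized image of the virtual
    person (S112), and display it (S113)."""
    accel = get_accel_data()                 # S111: acceleration data
    pitch, timing = get_motion_info(accel)   # S111: pitch, landing/takeoff timing
    frame = render_runner(pitch, timing)     # S112: pitch-synchronized image
    display(frame)                           # S113: show in the display area
    return frame

# Demo with stand-in sections reporting a fixed pitch of 3.0 steps/s
shown = []
frame = normal_mode_step(
    get_accel_data=lambda: [0.5, 2.0, 0.5],
    get_motion_info=lambda accel: (3.0, 0.5),
    render_runner=lambda pitch, timing: f"runner(pitch={pitch})",
    display=shown.append,
)
print(frame)  # runner(pitch=3.0)
```

The CPU would invoke this pass at predetermined time intervals or repeatedly at all times, as the text describes.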
- FIG. 5 is a flowchart depicting a second example of the normal mode applied to the exercise support method according to the present embodiment.
- the pitch and the landing/takeoff timing of the feet of the user US may temporally vary depending on, for example, the status of the surrounding runners or the running course.
- the moving image of the virtual person VR is set based on average values of pitches and landing/takeoff timings of the feet of the user US in a predetermined short period of time.
- the CPU 140 first starts an operation of causing the motion sensor section 120 (the acceleration sensor 121 ) to detect the acceleration of the user US during an exercise and output acceleration data.
- the CPU 140 causes the sensor data obtaining section 141 and the motion information obtaining section 142 to obtain motion information such as the pitch of the user US and the landing/takeoff timing of the feet of the user US based on the acceleration data obtained from the acceleration sensor 121 , as depicted in FIG. 5 (Step S 121 ).
- the CPU 140 causes the obtainment of the pitch and the landing/takeoff timing of the feet described above to continue for t seconds (for example, ten seconds), and thereby retains the motion information (Step S 122 ).
- the CPU 140 causes the motion information obtaining section 142 to calculate an average value of the pitches (an average pitch) and an average value of the landing/takeoff timings of the feet (an average timing) based on the obtained pitches and landing/takeoff timings of the feet of the user US obtained continuously for ten seconds (Step S 123 ).
- a time for obtaining the pitch and the landing/takeoff timing of the feet of the user US is set at ten seconds.
- any sampling time, such as five seconds or thirty seconds, may be set.
- a relatively short time, such as five seconds or ten seconds, is preferably set.
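The averaging of Steps S 122 and S 123 over a short sampling window could be sketched as follows (an illustrative Python sketch; the class name and the sliding-window handling are assumptions of this sketch, not the disclosed implementation):

```python
from collections import deque

class PitchAverager:
    """Retains pitch samples from the last `window_s` seconds and
    reports their average -- a sketch of the average-pitch calculation
    performed by the motion information obtaining section 142."""
    def __init__(self, window_s=10.0):
        self.window_s = window_s
        self.samples = deque()  # entries of (timestamp, pitch)

    def add(self, t, pitch):
        self.samples.append((t, pitch))
        # discard samples older than the sampling window
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def average(self):
        if not self.samples:
            return 0.0
        return sum(p for _, p in self.samples) / len(self.samples)
```

The same structure would apply to averaging the landing/takeoff timings; using the averaged value rather than the instantaneous one keeps the virtual person VR from jittering when the user's pitch varies momentarily.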
- the CPU 140 causes the image generating section 143 to match the landing/takeoff timing of the feet of the virtual person VR with the average timing and also match the pitch of the virtual person VR with the average pitch.
- a moving image (a pitch-synchronized image) of the virtual person VR in synchronization with an average motion status of the user US during the sampling time is generated (Step S 124 ).
- the CPU 140 causes the display driving section 144 to cause the generated moving image of the virtual person VR to be displayed in a predetermined area of the display section 110 (Step S 125 ).
- the CPU 140 then performs the series of processing at predetermined time intervals or repeatedly at all times.
- the user US who is a runner can recognize a moving image of the pacemaker dedicated for the user US (virtual person VR) which is running in front of the user US with the current (real-time) pitch or the immediately preceding (for example, an average for previous ten seconds) pitch.
- the user US continues running so as to follow the back (image) of the virtual person VR running ahead, and thereby can be drawn (the running can be guided) by the virtual person VR to contribute to an improvement of the physical capability of the user US (that is, to achieve a high exercise effect and an excellent record).
- angular velocity data or position data outputted from the gyro sensor 122 and the GPS reception circuit 123 of the motion sensor section 120 may be applied, and accordingly reflected onto the image and the exercise information displayed on the display section 110 .
- FIG. 6 is a flowchart of an example depicting a long-distance running mode applied to the exercise support method according to the present embodiment.
- the long-distance running mode is a motion mode in which the virtual person VR prompts the user to run at an approximately constant speed by suppressing large fluctuations in running speed.
- the long-distance running mode applied to the present embodiment is achieved by the user US operating the operation section 130 of the display glasses 100 to select and set the long-distance running mode as a motion mode, whereby the CPU 140 calls a program module of the control program regarding the long-distance running mode to perform processing.
- the CPU 140 first starts an operation of controlling the motion sensor section 120 to detect the acceleration of the user US during an exercise (during running) by the acceleration sensor 121 and output acceleration data.
- the CPU 140 causes the sensor data obtaining section 141 to obtain the acceleration data from the acceleration sensor 121 , and then causes the motion information obtaining section 142 to obtain motion information such as the pitch and the landing/takeoff timing of the feet of the user US based on the acceleration data and a change of the running speed (pace), as depicted in FIG. 6 (Step S 211 ).
- next, the CPU 140 judges whether the obtained change of the running speed is within a range set in advance (Step S 212 ).
- when judged at Step S 212 that the change of the running speed is within the set range, the CPU 140 causes the image generating section 143 to generate a moving image (a pitch-synchronized image) of the virtual person VR in synchronization with the motion status of the user US by matching the landing/takeoff timing of the feet of the virtual person VR with the current landing/takeoff timing of the feet of the user US and matching the pitch of the virtual person VR with the current pitch of the user US (Step S 213 ).
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated so as to be synchronized with the motion status of the user US to be displayed in a predetermined area of the display section 110 (Step S 214 ).
- on the other hand, when judged at Step S 212 that the change of the running speed is not within the set range, the CPU 140 judges, based on the change of the running speed, whether the running speed of the user US is decreasing (Step S 215 ).
- when judged at Step S 215 that the running speed is decreasing, the CPU 140 causes the image generating section 143 to generate the moving image of the virtual person VR with its display size reduced more than the display size at the previous Step S 214 , according to the degree of the change (decrease) of the running speed (Step S 216 ). That is, as the degree of the decrease of the running speed becomes larger, the display size becomes relatively smaller.
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated by reducing the display size according to the decrease of the running speed of the user US to be displayed in a predetermined area of the display section 110 (Step S 214 ).
- when judged at Step S 215 that the running speed of the user US is not decreasing, the CPU 140 judges that the running speed of the user US is increasing. Then, the CPU 140 causes the image generating section 143 to generate a moving image of the virtual person VR with its display size enlarged more than the display size at the previous Step S 214 , according to the degree of the change (increase) of the running speed (Step S 217 ). That is, as the degree of the increase of the running speed becomes larger, the display size becomes relatively larger.
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated by enlarging the display size according to the increase of the running speed of the user US to be displayed in a predetermined area of the display section 110 (Step S 214 ).
- the CPU 140 then performs the series of processing at predetermined time intervals or repeatedly at all times.
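The long-distance-mode sizing rule of Steps S 212 to S 217 could be sketched as a single function (illustrative only; the tolerance value, the linear scale factor, and the lower size bound are assumptions of this sketch, not values disclosed in the specification):

```python
def update_display_size(current_size, speed_change, tolerance=0.2):
    """Long-distance mode sizing: keep the display size of the virtual
    person when the running-speed change stays within the tolerated
    range, shrink the pacer as the user slows down, and enlarge it as
    the user speeds up."""
    if abs(speed_change) <= tolerance:
        return current_size               # in-range: pacer stays put
    scale = 1.0 + speed_change            # larger change, larger effect
    return max(0.1, current_size * scale)  # clamp so the pacer stays visible
```

Because the size shrinks when the runner slows and grows when the runner accelerates, the runner perceives the pacer pulling away or drawing near, which is exactly the cue the mode relies on.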
- when the running speed (pace) of the user US is relatively decreasing, the virtual person VR is displayed smaller, and therefore the user US recognizes that the virtual person VR has moved away ahead.
- the user US notices that he or she is not keeping pace with the running of the virtual person VR and is in a delay state.
- the user US is prompted to catch up with the virtual person VR running ahead by increasing the pitch or extending his or her stride (footstep width) to increase the pace.
- conversely, when the running speed of the user US is relatively increasing, the virtual person VR is displayed larger, and therefore the user US recognizes that he or she is approaching the virtual person VR.
- the user US notices that he or she is starting to keep pace with the virtual person VR if lagging behind the virtual person VR.
- the user US notices that he or she is running faster than the virtual person VR, and is in an over pace state.
- the user US can be drawn (the running can be guided) by the virtual person VR, so as to suppress large fluctuations of the running speed, which contributes to an improvement of the physical capability of the user US.
- FIG. 7 is a flowchart of an example depicting a pace-set mode applied to the exercise support method according to the present embodiment.
- the pace-set mode is a motion mode in which the virtual person VR prompts the user to bring the running speed closer to a preset target value.
- the pace-set mode applied to the present embodiment is achieved by the user US operating the operation section 130 of the display glasses 100 to select and set the pace-set mode as a motion mode, whereby the CPU 140 calls a program module of the control program regarding the pace-set mode to perform processing.
- the user US first operates the operation section 130 of the display glasses 100 to input and set a target value of the running speed (pace), as depicted in FIG. 7 (Step S 311 ).
- the CPU 140 causes the image generating section 143 and the display driving section 144 to cause a moving image of the virtual person VR registered in advance as an initial image to be displayed in a predetermined area of the display section 110 (Step S 312 ).
- the initial image is generated as a moving image whose display size has been reduced so as to achieve a state where the virtual person VR is running in the front of the viewing field of the user US in an area relatively away from the user US.
- the CPU 140 starts an operation of controlling the motion sensor section 120 to detect at least the acceleration and the angular velocity of the user US during an exercise (during running) by the acceleration sensor 121 and the gyro sensor 122 and output acceleration data and angular velocity data.
- the CPU 140 causes the sensor data obtaining section 141 to obtain the acceleration data from the acceleration sensor 121 and the angular velocity data from the gyro sensor 122
- the CPU 140 then causes the motion information obtaining section 142 to obtain motion information such as the pitch and the landing/takeoff timing of the feet of the user US based on the acceleration data and the angular velocity data, and an absolute value of the running speed (pace) (Step S 313 ).
- the CPU 140 calculates a difference between the obtained running speed and the target value (target pace) of the running speed set in advance (the obtained value ⁣−⁣ the target value) (Step S 314 ).
- the CPU 140 judges based on the calculated difference whether the obtained running speed is increasing (whether an absolute value of the difference is reducing) (Step S 315 ).
- when judged at Step S 315 that the running speed is increasing (the absolute value of the difference is reducing), the CPU 140 judges whether the running speed has reached the target value (Step S 316 ).
- when judged at Step S 316 that the running speed has reached the target value, the CPU 140 causes the image generating section 143 to enlarge the display size to a standard size set in advance so as to achieve a state where the virtual person VR is running in an area immediately in front of the user US in the viewing field, match the landing/takeoff timing of the feet of the virtual person VR with the landing/takeoff timing of the feet of the user US and the pitch of the virtual person VR with the pitch of the user US based on the pitch and the landing/takeoff timing of the feet of the user US, and generate a moving image (a standard image) of the virtual person VR in synchronization with the motion status of the user US (Step S 317 ).
- the CPU 140 causes the display driving section 144 to enlarge the display size and to cause the moving image of the virtual person VR generated in synchronization with the motion status of the user US to be displayed in a predetermined area of the display section 110 (Step S 318 ).
- when judged at Step S 315 that the running speed is not increasing (the absolute value of the difference is increasing), the CPU 140 judges that the running speed of the user US is decreasing.
- the CPU 140 then causes the image generating section 143 to generate a moving image of the virtual person VR whose display size has been reduced in, for example, inverse proportion to the size of the initial image, according to the magnitude of the absolute value of the difference (Step S 319 ).
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated by being reduced according to the decrease of the running speed to be displayed in a predetermined area of the display section 110 (Step S 318 ).
- when judged at Step S 316 that the running speed has not reached the target value, the CPU 140 causes the image generating section 143 to generate a moving image of the virtual person VR with its display size enlarged in, for example, inverse proportion to the size of the initial image, according to the magnitude of the absolute value of the difference (Step S 320 ).
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated by being enlarged according to the increase of the running speed to be displayed in a predetermined area of the display section 110 (Step S 318 ).
- the CPU 140 then performs this series of processing at predetermined time intervals or repeatedly at all times.
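The pace-set sizing of Steps S 314 to S 320 can be summarized as an interpolation between a small initial size and the standard size as the runner's speed approaches the target. The sketch below is illustrative only: the linear interpolation rule and the default size constants are assumptions of this sketch, not values taken from the specification:

```python
def pace_set_size(current_speed, target_speed, initial_size=0.3,
                  standard_size=1.0):
    """Pace-set mode sizing: the virtual person starts small (far
    ahead) and grows toward the standard size (immediately ahead)
    as the runner's speed approaches the preset target."""
    diff = abs(current_speed - target_speed)
    if diff == 0.0:
        return standard_size  # target reached: standard image (Step S317)
    # shrink toward the initial size as the gap to the target widens
    closeness = max(0.0, 1.0 - diff / target_speed)
    return initial_size + (standard_size - initial_size) * closeness
```

Called once per processing cycle, this maps "how far my pace is from the target" directly onto "how far away the pacer appears", which is the feedback loop the mode describes.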
- the display size of the virtual person VR displayed in the front of the viewing field of the user US at the time of the start of the running is set to be small, whereby the user US recognizes that the virtual person VR is running in front of the user US in an area relatively away from the user US. Then, as his or her running speed (pace) becomes closer to the target value set in advance, the display size of the virtual person VR is set larger. Consequently, the user US gradually catches up with the virtual person VR.
- the user US can be drawn (the running is guided) by the virtual person VR at the running speed of the target value set in advance, which contributes to an improvement of the physical capability of the user US.
- a processing operation is described in which the user US sets a running speed (pace) that serves as a target value in advance and achieves running that is approximate to this target value.
- the present invention is not limited thereto.
- control may be performed by which, in addition to the running speed (pace), an arbitrary pitch is set as a target value and this pitch is increased and decreased to correspond to the display size (that is, a distance from the user US) of the virtual person VR.
- FIG. 8 is a flowchart of an example depicting a build-up mode applied to the exercise support method according to the present embodiment.
- the build-up mode is a motion mode in which the virtual person VR prompts the user to perform build-up running at a running speed set in advance.
- the build-up running is a way of running in which the running speed is gradually increased for each running distance set in advance, which has been known as a practice method capable of improving endurance and increasing the speed.
- the build-up mode applied to the present embodiment is achieved by the user US operating the operation section 130 of the display glasses 100 to select and set the build-up mode as a motion mode, whereby the CPU 140 calls a program module of the control program regarding the build-up mode to perform processing.
- the user US first operates the operation section 130 of the display glasses 100 to input and set a target value of the running speed (pace), as depicted in FIG. 8 (Step S 411 ).
- the target value of the running speed to be inputted and set is set so that the running speed is gradually increased for each set distance such as an arbitrary running distance, section distance, etc.
- the CPU 140 causes the image generating section 143 and the display driving section 144 to cause a moving image of the virtual person VR registered in advance as an initial image to be displayed in a predetermined area of the display section 110 (Step S 412 ).
- the initial image is generated as a moving image whose display size has been reduced to achieve a state where the virtual person VR is running in front of the user US in an area relatively away from the user US.
- the CPU 140 starts an operation of controlling the motion sensor section 120 to detect at least the acceleration and the angular velocity of the user US during an exercise (during running) by the acceleration sensor 121 and the gyro sensor 122 and output acceleration data and angular velocity data.
- the CPU 140 causes the sensor data obtaining section 141 to obtain the acceleration data from the acceleration sensor 121 and the angular velocity data from the gyro sensor 122 .
- the CPU 140 then causes the motion information obtaining section 142 to obtain motion information such as the pitch and the landing/takeoff timing of the feet of the user US, an absolute value of the running speed (pace), and a run distance based on the acceleration data, the angular velocity data, and the running time (Step S 413 and Step S 414 ).
- next, the CPU 140 judges whether the obtained run distance has reached a set distance for which a target value of the running speed was set in advance at Step S 411 (Step S 415 ).
- the CPU 140 calculates a difference between the obtained running speed and the target value (target pace) of the running speed set to the set distance (the obtained value ⁇ the target value) (Step S 416 ).
- the CPU 140 judges based on the calculated difference whether the obtained running speed is increasing (an absolute value of the difference is reducing) (Step S 417 ).
- when judged at Step S 417 that the running speed is increasing (the absolute value of the difference is reducing), the CPU 140 judges whether the running speed has reached the target value (Step S 418 ).
- when judged at Step S 418 that the running speed has reached the target value, the CPU 140 causes the image generating section 143 to enlarge the display size to achieve a state where the virtual person VR is running in an area immediately in front of the user US in the viewing field, match the landing/takeoff timing of the feet of the virtual person VR with the landing/takeoff timing of the feet of the user US and the pitch of the virtual person VR with the pitch of the user US based on the pitch and the landing/takeoff timing of the feet of the user US, and generate a moving image of the virtual person VR in synchronization with the motion status of the user US (Step S 419 ).
- the CPU 140 causes the display driving section 144 to enlarge the display size and to cause the moving image of the virtual person VR generated in synchronization with the motion status of the user US to be displayed in a predetermined area of the display section 110 (Step S 420 ).
- the CPU 140 causes the image generating section 143 to increase the pitch of the virtual person VR and generate a moving image whose display size has been reduced such that the virtual person VR is moving away from the user US toward the front of the viewing field (Step S 421 ).
- the CPU 140 causes the display driving section 144 to reduce the display size and to cause the moving image of the virtual person VR generated by increasing the pitch to be displayed in a predetermined area of the display section 110 (Step S 420 ).
- when judged at Step S 417 that the running speed is not increasing (the absolute value of the difference is increasing), the CPU 140 judges that the running speed of the user US is decreasing, and causes the image generating section 143 to generate a moving image of the virtual person VR with its display size reduced in, for example, inverse proportion, according to the magnitude of the absolute value of the difference (Step S 422 ).
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated by reducing the display size according to the decrease of the running speed to be displayed in a predetermined area of the display section 110 (Step S 420 ).
- when judged at Step S 418 that the running speed has not reached the target value, the CPU 140 causes the image generating section 143 to generate a moving image of the virtual person VR with its display size enlarged in, for example, inverse proportion, according to the magnitude of the absolute value of the difference (Step S 423 ).
- the CPU 140 causes the display driving section 144 to cause the moving image of the virtual person VR generated by enlarging the display size according to the increase of the running speed to be displayed in a predetermined area of the display section 110 (Step S 420 ).
- the CPU 140 then performs this series of processing at predetermined time intervals or repeatedly at all times.
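The per-distance target values that drive the build-up mode (set at Step S 411 and compared against at Steps S 415 and S 416 ) can be sketched as a simple schedule function. This is illustrative only; the segment length, base pace, and increment are hypothetical parameters, not values from the specification:

```python
def build_up_target(distance_run, segment_km, base_pace, step):
    """Build-up schedule: the target running speed increases by
    `step` for each completed segment of `segment_km`, so the pacer
    gradually demands a faster pace as the run progresses."""
    segment = int(distance_run // segment_km)  # which segment are we in
    return base_pace + step * segment
```

At each processing cycle the CPU would look up the target for the current run distance and then apply the same difference-based sizing as in the pace-set mode.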
- the display size of the virtual person VR displayed in the front of the viewing field of the user US is set to be small for each arbitrary set distance, whereby the user US recognizes that the virtual person VR is running in front of the user US in an area relatively away from the user US. Then, as his or her running speed (pace) gradually becomes closer to the target value set in advance so as to gradually increase for each set distance, the display size of the virtual person VR is set larger. Consequently, the user US gradually catches up with the virtual person VR.
- by adjusting the running speed (pace) so as to catch up with the virtual person VR running ahead or to keep the distance from the virtual person VR constant, the user US can be drawn (the running is guided) by the virtual person VR at the running speed of the target value set in advance for each set distance, which contributes to an improvement of the physical capability of the user US.
- An interval mode applied to the exercise support method according to the present embodiment is achieved by a processing operation approximately similar to that of the build-up mode described above.
- the interval mode is a motion mode in which the virtual person VR prompts the user to perform interval running at a running speed set in advance.
- the interval running is a way of running in which a period during which the running speed is increased for running fast and a period during which the running speed is decreased for running slowly are alternately repeated for each running distance set in advance, which has been known as a practice method capable of improving endurance.
- the interval mode applied to the present embodiment is achieved by the user US operating the operation section 130 of the display glasses 100 to select and set the interval mode as a motion mode, whereby the CPU 140 calls a program module of the control program regarding the interval mode to perform processing.
- the user US first operates the operation section 130 of the display glasses 100 to input and set a target value of the running speed (pace) (Step S 411 ).
- the target value of the running speed to be inputted and set is set so that a period during which the running speed is increased and a period during which the running speed is decreased are alternately provided for each arbitrary set distance.
- the CPU 140 causes a moving image of the virtual person VR as an initial image to be displayed on the display section 110 such that the virtual person VR is running in front of the user US in an area relatively away from the user US (Step S 412 ).
- the CPU 140 performs this series of processing at Step S 413 to Step S 420 at predetermined time intervals or repeatedly at all times.
- the user US recognizes that the virtual person VR is running in front of the user US in an area relatively away from the user US for each arbitrary set distance, as in the case of the build-up mode described above. Then, as his or her running speed (pace) becomes closer to the target value set in advance so as to increase or decrease for each set distance, the user US gradually catches up with the virtual person VR.
- the user US can be drawn (the running is guided) by the virtual person VR at the running speed of the target value set in advance for each set distance, which contributes to an improvement of the physical capability of the user US.
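The alternating target values of the interval mode (a fast period and a slow period for each set distance, as set at Step S 411 ) can be sketched in the same style as the build-up schedule. The function below is illustrative; the convention of starting with a fast segment and the parameter names are assumptions of this sketch:

```python
def interval_target(distance_run, segment_km, fast_pace, slow_pace):
    """Interval schedule: fast and slow target speeds alternate for
    each set distance, starting with a fast segment."""
    segment = int(distance_run // segment_km)
    return fast_pace if segment % 2 == 0 else slow_pace
```

Apart from this schedule, the interval mode reuses the build-up processing of Steps S 413 to S 420 unchanged, which is why the two modes are described together.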
- the processing operation is described in which the user US sets the running speed (pace) that serves as a target value in advance for each arbitrary set distance and achieves the running that is approximate to this target value.
- the present invention is not limited thereto.
- control may be performed by which, in addition to the running speed (pace), an arbitrary pitch is set as a target value and this pitch is increased and decreased to correspond to the display size (that is, a distance from the user US) of the virtual person VR.
- in the embodiments described above, the pitch of the virtual person VR is matched with the pitch of the user US, who is a runner, and the user US is prompted to extend his or her stride (footstep width) to increase the pace.
- the present invention is not limited to this scheme.
- a configuration may be adopted in which, when it is judged that the pitch or the pace of the user US is decreasing, or at arbitrary timing, a moving image whose time period from the time of takeoff of the feet of the virtual person VR to the time of landing thereof is slightly shorter than the pitch of the user US, or in other words, a moving image whose pitch is slightly fast is generated and displayed, whereby the pitch of the user US is prompted to increase.
- the pitch to be set to the virtual person may be a pitch that is set so as to increase the pitch of the user US uniformly by a predetermined footstep count, a pitch that is set so as to increase according to the pitch of the user US by, for example, a footstep count at a predetermined ratio, or a pitch that is set variably based on another arithmetic expression, conditional expression, or the like.
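The alternative pitch-setting rules just described (a uniform footstep-count increase, or an increase at a predetermined ratio) could be sketched as follows; the specific constants are hypothetical examples, not values disclosed in the specification:

```python
def pacer_pitch(user_pitch, mode="uniform", delta=4, ratio=1.03):
    """Candidate rules for a slightly faster pacer pitch:
    'uniform' adds a fixed footstep count per minute,
    'ratio' scales the user's pitch by a predetermined ratio."""
    if mode == "uniform":
        return user_pitch + delta   # fixed steps/min above the user
    if mode == "ratio":
        return user_pitch * ratio   # proportional increase
    raise ValueError("unknown mode")
```

Either rule yields a virtual person whose feet land slightly sooner than the user's, gently prompting the user to quicken his or her own pitch; a further rule based on another arithmetic or conditional expression could be substituted in the same place.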
- a configuration may be adopted in which, when the user US temporarily suspends a running motion to take a rest, drink water, or the like during an exercise (during running) and the obtained running speed is decreased (the pace is decreased), the image of the virtual person VR is temporarily stopped by, for example, the user US operating the operation section 130 , so that the display operation in which the virtual person VR increases the running speed (increases the pace) and moves away from the user US based on the processing operation described above is not performed.
- motion information (pitch, pace, or the like) of the user who is an actual runner is fed back to the generation of an image of the virtual person for display in the viewing field of the user, whereby a function of drawing the user (guiding the running) and a function of sufficiently drawing out the user's capability (physical capability) can be achieved, like an actual pacemaker.
- with an actual pacemaker, the practice of guiding and drawing the running of the user while adjusting the running speed (pace) does not always work well, depending on the physical condition of the pacemaker. However, with the present invention, it can always be favorably conducted, which contributes to an improvement of the physical capability of the user.
- the display section, the motion sensor section, the CPU, etc. are integrally incorporated in the display glasses as a single device.
- the second embodiment has a structure where at least the display function and the sensor function are separately provided to different devices.
- FIG. 9A , FIG. 9B , and FIG. 9C are schematic structural views of the second embodiment of the exercise support apparatus according to the present invention.
- FIG. 10A and FIG. 10B are block diagrams showing an example of the structure of a device applied to the exercise support apparatus according to the present embodiment.
- components similar to those of the above-described first embodiment are provided with the same reference numerals and therefore description thereof is simplified.
- the exercise support apparatus mainly includes, in addition to the display glasses 100 having a display function, at least one sensor device among a chest sensor 200 , a wrist analyzer 300 , a foot sensor 400 , etc. having a sensor function, as depicted in FIG. 9A .
- the display glasses 100 applied to the present embodiment have an eyeglasses-type or goggles-type outer appearance, and are mounted on the head part of the user US.
- the display glasses 100 mainly includes the display section 110 , the operation section 130 , the CPU 140 , the memory 150 , the operation power supply 160 , and a communication function section 170 , as depicted in FIG. 10A .
- the display glasses 100 has a structure where the motion sensor section 120 is omitted and the communication function section 170 is added in the structure described in the above-described first embodiment (refer to FIG. 2 ).
- the display section 110 , the operation section 130 , the CPU 140 , the memory 150 , and the operation power supply 160 have structures and functions approximately similar to those of the first embodiment, and therefore are not described herein.
- the communication function section 170 applied to the display glasses 100 transmits data to the sensor devices, such as the chest sensor 200 , the wrist analyzer 300 , and the foot sensor 400 , which will be described further below, by various wireless communication schemes or by a wired communication scheme via a communication cable.
- for example, Bluetooth (registered trademark) can be applied as the wireless communication scheme.
- in particular, by using Bluetooth (registered trademark) low energy (LE), laid out in this communication standard as a standard of a low power consumption type, data transmission can be favorably performed even with small electric power generated by the above-described energy harvest technology or the like.
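As a rough illustration of why a low-power link suffices here, a single motion sample can be packed into only a few bytes before transmission. The framing below is entirely hypothetical (it is not a profile or characteristic defined by the Bluetooth specification or by this disclosure); it merely shows a compact little-endian layout such a sensor device might use:

```python
import struct

def pack_motion_sample(pitch_spm, speed_mps, step_flag):
    """Pack one motion sample into a compact 7-byte frame:
    pitch (uint16, steps/min), speed (float32, m/s),
    landing/takeoff flag (uint8). Layout is hypothetical."""
    return struct.pack("<HfB", pitch_spm, speed_mps, step_flag)

def unpack_motion_sample(frame):
    """Inverse of pack_motion_sample, as the display glasses
    side would decode the received frame."""
    return struct.unpack("<HfB", frame)
```

Frames this small comfortably fit the payload of a single low-energy packet, so periodic transmission in step with the sensor's detection timing costs very little power.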
- the chest sensor 200 applied to the present embodiment has an outer appearance of a chest sensor, as depicted in FIG. 9B , and mainly includes a device main body 201 having a sensor function and a belt section 202 to be wound around the chest part of the user US to mount the device main body 201 on the chest part.
- the wrist analyzer 300 applied to the present embodiment has an outer appearance of a wrist band or a wristwatch, as depicted in FIG. 9C , and mainly includes a device main body 301 having a sensor function and a belt section 302 to be wound around the wrist of the user US to mount the device main body 301 on the wrist.
- the structure is depicted where the device main body 301 includes a display section.
- a display section is not necessarily required to be provided.
- on the display section, exercise information such as a pitch, a running speed, a running distance, and a calorie consumption amount is displayed as appropriate, as depicted in FIG. 9C .
- the foot sensor 400 applied to the present embodiment has a sensor function, and is mounted on an ankle, a shoelace, a shoe sole, or the like of the user US.
- the sensor device applied to the present embodiment mainly includes a motion sensor section 220 , an operation section 230 , a CPU 240 , a memory 250 , an operation power supply 260 , and a communication function section 270 , as depicted in FIG. 10B .
- the motion sensor section 220 has at least one of an acceleration sensor 221 , a gyro sensor 222 , and a pressure sensor.
- the motion sensor section 220 , the operation section 230 , the CPU 240 , the memory 250 , and the operation power supply 260 have functions approximately similar to those of the motion sensor section 120 , the operation section 130 , the CPU 140 , the memory 150 , and the operation power supply 160 described in the above-described first embodiment, and therefore are not described herein.
- the communication function section 270 applied to the sensor device transmits various data to the above-described display glasses 100 by a predetermined communication scheme.
- data transmission between the display glasses 100 and the sensor device may be performed at predetermined time intervals in synchronization with the timing of detection by each sensor of the motion sensor section 220 or the timing of performing the processing operation in the above-described exercise support method, or may be performed continuously.
- the display glasses 100 having the display function and the sensor device having the sensor function are structured to be separated to different devices and both transmit data by a predetermined communication scheme such as a wireless one.
- the sensor device having the sensor function can be mounted on any part without the limitation of the mount position of the display glasses 100 having the display function. Therefore, the structure of the display glasses can be simplified to reduce the weight thereof, whereby a factor that hinders the exercise (running) of the user can be eliminated.
- the entire series of processing operations may be performed by the CPU 140 provided to the display glasses 100 or may be performed by the CPU 240 provided to the sensor device.
- the series of processing operations may be split to be performed by the CPU 140 and the CPU 240 .
- the functions of the CPU 140 described in the above-described first embodiment may be divided as appropriate between the CPU 140 of the display glasses 100 and the CPU 240 of the sensor device and performed thereby.
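How such a division of labor might look is sketched below, assuming a crude zero-crossing step counter on the sensor side and a simple renderer on the glasses side. Both functions are illustrative stand-ins, not the patent's actual algorithms.

```python
def detect_pitch(accel_samples):
    """Sensor-side work (CPU 240): count upward zero-crossings of the
    demeaned vertical acceleration as a crude step detector."""
    return sum(1 for a, b in zip(accel_samples, accel_samples[1:])
               if a < 0.0 <= b)

def render_virtual_runner(pitch):
    """Glasses-side work (CPU 140): derive display parameters for the
    virtual person from the received pitch value."""
    return {"pitch": pitch, "frame_rate": max(1, pitch) * 2}

# Only the compact pitch value crosses the wireless link, not raw samples.
samples = [-1.0, 0.5, -0.3, 0.8, -0.2, 0.6]   # demeaned vertical acceleration
pitch = detect_pitch(samples)                  # would run on CPU 240
frame = render_virtual_runner(pitch)           # would run on CPU 140
```

Splitting at this boundary keeps the transmitted data small, which is one practical motivation for dividing the functions between the two CPUs.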
- the chest sensor 200 mounted on the chest part of the user, the wrist analyzer 300 mounted on the wrist, and the foot sensor 400 mounted on an ankle, a shoelace, a shoe sole, or the like have been described as sensor devices having a sensor function.
- the present invention is not limited thereto, and a sensor device that is mounted on the waist part, the upper arm part, the neck part, or the like may be adopted.
- in the above-described second embodiment, the apparatus has a structure in which the display function and the sensor function are separately provided to different devices.
- in the present embodiment, the apparatus has a structure in which a display function, a sensor function, and an arithmetic processing function are separately provided to different devices.
- FIG. 11A and FIG. 11B are block diagrams showing an example of the structure of a device applied to the exercise support apparatus according to the third embodiment.
- the exercise support apparatus mainly includes, in addition to the display glasses 100 having the display function, a sensor device having a sensor function such as the chest sensor 200 or the foot sensor 400 , and an information processing device having an arithmetic processing function such as the wrist analyzer 300 , as depicted in FIG. 9A .
- the display glasses 100 , the chest sensor 200 , and the foot sensor 400 have structures and functions approximately similar to those of the above-described second embodiment, as depicted in FIG. 10A and FIG. 11A , and therefore are not described herein.
- the information processing device (such as the wrist analyzer 300 ) applied to the present embodiment mainly includes a display section 310 , an operation section 330 , a CPU 340 , a memory 350 , an operation power supply 360 , and a communication function section 370 , as depicted in FIG. 11B .
- exercise information such as a pitch, a running speed, a running distance, and a calorie consumption amount is displayed as appropriate on the display section 310, as depicted in FIG. 9C.
- the operation section 330 , the CPU 340 , the memory 350 , the operation power supply 360 , and the communication function section 370 have structures and functions approximately similar to those of the operation sections 130 and 230 , the CPUs 140 and 240 , the memories 150 and 250 , the operating power supplies 160 and 260 , and the communication function sections 170 and 270 described in the above-described second embodiment, respectively, and therefore are not described herein.
- the entire series of processing operations of the exercise support method described in the above-described first embodiment is performed by the CPU 340 provided to the wrist analyzer 300 .
- the chest sensor 200 and the foot sensor 400 serving as sensor devices detect a motion of the user US during an exercise and perform only an operation of outputting corresponding sensor data.
- the wrist analyzer 300 performs the series of processing of the exercise support method described above to generate an image of the virtual person VR whose pitch, display size, and the like have been set, and transmits the image to the display glasses 100 .
- the display glasses 100 performs only an operation of displaying the generated image (moving image) of the virtual person VR on the display section 110.
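The three-device division of roles just described can be sketched as follows. The function names and the pitch formula are illustrative assumptions, not the patent's implementation.

```python
def sensor_device(step_times):
    """Chest sensor 200 / foot sensor 400: emit only raw footfall timestamps."""
    return list(step_times)

def wrist_analyzer(raw_steps):
    """Wrist analyzer 300 (CPU 340): perform all processing, here deriving a
    pitch in steps per minute and a display size for the virtual person VR."""
    duration = raw_steps[-1] - raw_steps[0]
    pitch = (len(raw_steps) - 1) * 60.0 / duration if duration > 0 else 0.0
    return {"pitch": round(pitch), "display_size": 1.0}

def display_glasses(image):
    """Display glasses 100: only display the received virtual-person image."""
    return f"virtual person VR shown at pitch {image['pitch']}"

raw = sensor_device([0.0, 0.4, 0.8, 1.2, 1.6])  # a footfall every 0.4 s
image = wrist_analyzer(raw)                      # 4 steps over 1.6 s -> 150/min
shown = display_glasses(image)
```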
- the series of processing operations of the exercise support method is performed only by the CPU 340 provided to the wrist analyzer 300 .
- the present invention is not limited thereto.
- the series of processing operations may be split to be performed by the CPU 140 provided to the display glasses 100 and the CPU 340 .
- the functions of the CPU 140 described in the above-described first embodiment may be divided as appropriate between the CPU 140 of the display glasses 100 and the CPU 340 of the wrist analyzer 300 and performed thereby.
- the display glasses 100 having the display function, the sensor device having the sensor function (such as the chest sensor 200 and the foot sensor 400 ), and the information processing device having the arithmetic processing function (such as the wrist analyzer 300 ) are structured to be separated from each other into different devices and transmit data to and from each other by a predetermined communication scheme such as a wireless scheme.
- each device can be specialized, whereby the sensor device having the sensor function and the information processing device having the arithmetic processing function can be mounted on any part without the limitation of the mount position of the display glasses 100 having the display function.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Surgery (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Physiology (AREA)
- Optics & Photonics (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Entrepreneurship & Innovation (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Theoretical Computer Science (AREA)
- Physical Education & Sports Medicine (AREA)
- Controls And Circuits For Display Device (AREA)
- Rehabilitation Tools (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012199372A JP5885129B2 (ja) | 2012-09-11 | 2012-09-11 | Exercise support apparatus, exercise support method, and exercise support program |
| JP2012-199372 | 2012-09-11 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20140073481A1 true US20140073481A1 (en) | 2014-03-13 |
Family
ID=49328315
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/021,885 Abandoned US20140073481A1 (en) | 2012-09-11 | 2013-09-09 | Exercise support apparatus, exercise support method and exercise support program |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20140073481A1 (en) |
| EP (1) | EP2706395A3 (en) |
| JP (1) | JP5885129B2 (en) |
| CN (1) | CN103657029B (en) |
Families Citing this family (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016131782A (ja) * | 2015-01-21 | 2016-07-25 | セイコーエプソン株式会社 | Head-mounted display device, detection device, control method for head-mounted display device, and computer program |
| CN105301771B (zh) * | 2014-06-06 | 2020-06-09 | 精工爱普生株式会社 | Head-mounted display device, detection device, control method, and computer program |
| CN104407697A (zh) * | 2014-11-17 | 2015-03-11 | 联想(北京)有限公司 | Information processing method and wearable device |
| CN106139559B (zh) * | 2015-03-23 | 2019-01-15 | 小米科技有限责任公司 | Motion data collection method, measurement device, and exercise device |
| JP6421689B2 (ja) | 2015-04-27 | 2018-11-14 | オムロンヘルスケア株式会社 | Exercise information measurement device, exercise support method, and exercise support program |
| CN105107184B (zh) * | 2015-08-26 | 2017-10-31 | 上海斐讯数据通信技术有限公司 | Exercise training method, system, smart glasses, and exercise trainer |
| JP2017068594A (ja) * | 2015-09-30 | 2017-04-06 | ソニー株式会社 | Information processing device, information processing method, and program |
| WO2017107182A1 (zh) * | 2015-12-25 | 2017-06-29 | 深圳市柔宇科技有限公司 | Head-mounted display device |
| JP6566209B2 (ja) * | 2016-02-26 | 2019-08-28 | 株式会社セガゲームス | Program and eyewear |
| CN105903166B (zh) | 2016-04-18 | 2019-05-24 | 北京小鸟看看科技有限公司 | 3D online sports competition method and system |
| CN105944332B (zh) * | 2016-05-10 | 2018-12-14 | 杭州韵健科技有限公司 | Networkable virtual-reality intelligent fitness system |
| JP2019133207A (ja) * | 2016-06-06 | 2019-08-08 | シャープ株式会社 | Video generation device, video generation method, and video generation program |
| CN106621264B (zh) * | 2016-10-12 | 2019-05-24 | 快创科技(大连)有限公司 | VR swordsmanship-competition production and experience system with online visual programming and editing |
| CN106422263B (zh) * | 2016-10-12 | 2019-03-08 | 快创科技(大连)有限公司 | VR fencing-training production and experience system with visual programming and editing |
| CN107050773A (zh) * | 2017-04-18 | 2017-08-18 | 河南牧业经济学院 | Acquisition and measurement system for running speed on an athletics track |
| KR102108962B1 (ko) * | 2017-08-08 | 2020-05-12 | 한국과학기술연구원 | Interaction device and method using stepping in place for omnidirectional virtual-reality navigation |
| CN108404382A (zh) * | 2018-02-24 | 2018-08-17 | 上海康斐信息技术有限公司 | Algorithm and system for adaptive training-plan generation |
| CN108499083A (zh) * | 2018-03-29 | 2018-09-07 | 四川斐讯信息技术有限公司 | Virtual competition method and system for a smart wearable device |
| FR3121612B1 (fr) * | 2021-04-13 | 2023-11-24 | Sylvain Quendez | Pace control device projecting a luminous target |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030078138A1 (en) * | 2001-10-19 | 2003-04-24 | Konami Corporation | Exercise assistance controlling method and exercise assisting apparatus |
| US20060040793A1 (en) * | 2001-08-21 | 2006-02-23 | Martens Mark H | Exercise system with graphical feedback and method of gauging fitness progress |
| US20060262120A1 (en) * | 2005-05-19 | 2006-11-23 | Outland Research, Llc | Ambulatory based human-computer interface |
| US20090023554A1 (en) * | 2007-07-16 | 2009-01-22 | Youngtack Shim | Exercise systems in virtual environment |
| US20090156363A1 (en) * | 2007-12-13 | 2009-06-18 | Technogym S.P.A. | Exercise machine with adaptive interface |
| US20100022351A1 (en) * | 2007-02-14 | 2010-01-28 | Koninklijke Philips Electronics N.V. | Feedback device for guiding and supervising physical exercises |
| US20110009241A1 (en) * | 2009-04-10 | 2011-01-13 | Sovoz, Inc. | Virtual locomotion controller apparatus and methods |
| US20110050707A1 (en) * | 2009-08-28 | 2011-03-03 | Samsung Electronics Co., Ltd. | Method and apparatus for providing content |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000033184A (ja) * | 1998-05-14 | 2000-02-02 | Masanobu Kujirada | Whole-body-motion input game and event apparatus |
| JP2003134510A (ja) * | 2001-08-16 | 2003-05-09 | Space Tag Inc | Image information distribution system |
| JP2005224318A (ja) * | 2004-02-10 | 2005-08-25 | Rikogaku Shinkokai | Pacemaker |
| JP2008099834A (ja) | 2006-10-18 | 2008-05-01 | Sony Corp | Display device and display method |
| ATE512624T1 (de) * | 2007-11-14 | 2011-07-15 | Zebris Medical Gmbh | Arrangement for gait analysis |
| US20110131005A1 (en) * | 2007-12-18 | 2011-06-02 | Hiromu Ueshima | Mobile recording apparatus, body movement measuring apparatus, information processing apparatus, movement pattern determining apparatus, activity amount calculating apparatus, recording method, body movement measuring method, information processing method, movement pattern determining method, activity amount calculating met |
| US7972245B2 (en) * | 2009-02-27 | 2011-07-05 | T-Mobile Usa, Inc. | Presenting information to users during an activity, such as information from a previous or concurrent outdoor, physical activity |
| JP2011067277A (ja) * | 2009-09-24 | 2011-04-07 | Brother Industries Ltd | Head-mounted display |
| JP5406880B2 (ja) * | 2011-04-28 | 2014-02-05 | シャープ株式会社 | Exercise instruction device |
- 2012
  - 2012-09-11 JP JP2012199372A patent/JP5885129B2/ja active Active
- 2013
  - 2013-09-06 EP EP13183361.8A patent/EP2706395A3/en not_active Withdrawn
  - 2013-09-09 US US14/021,885 patent/US20140073481A1/en not_active Abandoned
  - 2013-09-11 CN CN201310412038.5A patent/CN103657029B/zh active Active
Cited By (47)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10593092B2 (en) * | 1990-12-07 | 2020-03-17 | Dennis J Solomon | Integrated 3D-D2 visual effects display |
| US20150243068A1 (en) * | 1990-12-07 | 2015-08-27 | Dennis J. Solomon | Integrated 3D-D2 visual effects display |
| US20160030823A1 (en) * | 2014-07-31 | 2016-02-04 | Seiko Epson Corporation | On-running landing position evaluation method, on-running landing position evaluation apparatus, detection method, detection apparatus, running motion evaluation method, and running motion evaluation apparatus |
| US10504381B2 (en) * | 2014-07-31 | 2019-12-10 | Seiko Epson Corporation | On-running landing position evaluation method, on-running landing position evaluation apparatus, detection method, detection apparatus, running motion evaluation method, and running motion evaluation apparatus |
| US20160089574A1 (en) * | 2014-09-29 | 2016-03-31 | Equinox Holdings, Inc. | Exercise class apparatus and method |
| US11148032B2 (en) * | 2014-09-29 | 2021-10-19 | Equinox Holding, Inc. | Exercise class apparatus and method |
| US10342462B2 (en) * | 2014-10-26 | 2019-07-09 | David Martin | Application of gait characteristics for mobile |
| CN105759953A (zh) * | 2015-01-05 | 2016-07-13 | 索尼公司 | Information processing device, information processing method, and program |
| US20170352226A1 (en) * | 2015-01-05 | 2017-12-07 | Sony Corporation | Information processing device, information processing method, and program |
| EP3243557A4 (en) * | 2015-01-05 | 2018-05-30 | Sony Corporation | Information processing device, information processing method, and program |
| US20160346612A1 (en) * | 2015-05-29 | 2016-12-01 | Nike, Inc. | Enhancing Exercise Through Augmented Reality |
| US10289902B2 (en) | 2015-06-18 | 2019-05-14 | Casio Computer Co., Ltd. | Data analysis device, data analysis method and storage medium |
| US20200261763A1 (en) * | 2016-01-12 | 2020-08-20 | Samsung Electronics Co., Ltd. | Display device and control method therefor |
| US11020628B2 (en) * | 2016-01-12 | 2021-06-01 | Samsung Electronics Co., Ltd. | Display device and control method therefor |
| US20210252333A1 (en) * | 2016-01-12 | 2021-08-19 | Samsung Electronics Co., Ltd. | Display device and control method therefor |
| US20170312578A1 (en) * | 2016-05-02 | 2017-11-02 | Bao Tran | Smart device |
| US20230079256A1 (en) * | 2016-05-02 | 2023-03-16 | Bao Tran | Smart device |
| US10022614B1 (en) * | 2016-05-02 | 2018-07-17 | Bao Tran | Smart device |
| US10034066B2 (en) * | 2016-05-02 | 2018-07-24 | Bao Tran | Smart device |
| US10046229B2 (en) * | 2016-05-02 | 2018-08-14 | Bao Tran | Smart device |
| US10052519B2 (en) * | 2016-05-02 | 2018-08-21 | Bao Tran | Smart device |
| US10195513B2 (en) * | 2016-05-02 | 2019-02-05 | Bao Tran | Smart device |
| US10252145B2 (en) * | 2016-05-02 | 2019-04-09 | Bao Tran | Smart device |
| US11818634B2 (en) * | 2016-05-02 | 2023-11-14 | Bao Tran | Smart device |
| US11496870B2 (en) * | 2016-05-02 | 2022-11-08 | Bao Tran | Smart device |
| US20190200184A1 (en) * | 2016-05-02 | 2019-06-27 | Bao Tran | Smart device |
| US20180001184A1 (en) * | 2016-05-02 | 2018-01-04 | Bao Tran | Smart device |
| US9713756B1 (en) * | 2016-05-02 | 2017-07-25 | Bao Tran | Smart sport device |
| US20170318360A1 (en) * | 2016-05-02 | 2017-11-02 | Bao Tran | Smart device |
| US20170312614A1 (en) * | 2016-05-02 | 2017-11-02 | Bao Tran | Smart device |
| US9975033B2 (en) * | 2016-05-02 | 2018-05-22 | Bao Tran | Smart sport device |
| US9717958B1 (en) * | 2016-05-02 | 2017-08-01 | Bao Tran | Smart sport device |
| US9717949B1 (en) * | 2016-05-02 | 2017-08-01 | Bao Tran | Smart sport device |
| US10466475B2 (en) * | 2016-07-26 | 2019-11-05 | Bion Inc. | Head mounted virtual reality object synchronized physical training system |
| US10642346B2 (en) * | 2016-10-31 | 2020-05-05 | Fujitsu Limited | Action control method and device |
| US20180120928A1 (en) * | 2016-10-31 | 2018-05-03 | Fujitsu Limited | Action control method and device |
| CN106781805A (zh) * | 2017-01-19 | 2017-05-31 | 合肥金诺数码科技股份有限公司 | A farmland irrigation and rice-transplanting experience system |
| US11484765B2 (en) * | 2017-03-22 | 2022-11-01 | Honda Motor Co., Ltd. | Walking support system, walking support method, and program |
| GB2567231A (en) * | 2017-10-09 | 2019-04-10 | Aboense Ltd | A sports apparatus for providing information |
| US11884155B2 (en) * | 2019-04-25 | 2024-01-30 | Motional Ad Llc | Graphical user interface for display of autonomous vehicle behaviors |
| US20240231430A1 (en) * | 2021-02-08 | 2024-07-11 | Sightful Computers Ltd | Altering display of virtual content based on mobility status change |
| US12360558B2 (en) * | 2021-02-08 | 2025-07-15 | Sightful Computers Ltd | Altering display of virtual content based on mobility status change |
| US12474816B2 (en) | 2022-09-30 | 2025-11-18 | Sightful Computers Ltd | Presenting extended reality content in different physical environments |
| FR3149698A1 (fr) * | 2023-06-09 | 2024-12-13 | Sylvain Quendez | Pace control device displaying a light point in augmented reality |
| US20250028495A1 (en) * | 2023-07-20 | 2025-01-23 | Guangdong Coros Sports Technology Co., Ltd | Method and device for linkage transmission, and computer-readable storage medium |
| US12417062B2 (en) * | 2023-07-20 | 2025-09-16 | Guangdong Coros Sports Technology Co., Ltd | Method and device for linkage transmission, and computer-readable storage medium |
| FR3151679A1 (fr) * | 2023-07-28 | 2025-01-31 | Sylvain Quendez | Pace control device displaying a target in augmented reality |
Also Published As
| Publication number | Publication date |
|---|---|
| CN103657029B (zh) | 2017-04-12 |
| EP2706395A2 (en) | 2014-03-12 |
| EP2706395A3 (en) | 2017-11-01 |
| JP2014054303A (ja) | 2014-03-27 |
| CN103657029A (zh) | 2014-03-26 |
| JP5885129B2 (ja) | 2016-03-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20140073481A1 (en) | Exercise support apparatus, exercise support method and exercise support program | |
| US10740599B2 (en) | Notification device, exercise analysis system, notification method, notification program, exercise support method, and exercise support device | |
| US12118684B2 (en) | Fitness system for simulating a virtual fitness partner and methods for use therewith | |
| US10085692B2 (en) | Exercise support device, exercise support method, and computer-readable storage medium having exercise support program stored therein | |
| CN106999757B (zh) | Method and system for training a user to run correctly | |
| US20180043212A1 (en) | System, method, and non-transitory computer readable medium for recommending a route based on a user's physical condition | |
| JP6332830B2 (ja) | Exercise support system, exercise support method, and exercise support program | |
| US11839466B2 (en) | Biofeedback for altering gait | |
| US20180047194A1 (en) | Information output system, information output method, and information output program | |
| JP6307673B1 (ja) | Walking and running behavior presentation system using a two-limb distance measuring instrument | |
| US10271769B2 (en) | Performance information notification device and performance information notification method | |
| US12248887B2 (en) | Method and apparatus for predicting a race time | |
| JP2018025517A (ja) | Information output system, information output method, and information output program | |
| JP7449463B1 (ja) | Walking assistance wearable device, control method, and program | |
| CN119733224A (zh) | Information display method and device | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: CASIO COMPUTER CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AIBARA, TAKEHIRO;REEL/FRAME:031168/0151 Effective date: 20130830 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |