WO2016133158A1 - Footwear, sound output system, and output control method - Google Patents
- Publication number
- WO2016133158A1 (application PCT/JP2016/054692)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- footwear
- unit
- output
- data
- output control
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B1/00—Footwear characterised by the material
- A43B1/0027—Footwear characterised by the material made at least partially from a material having special colours
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B3/00—Footwear characterised by the shape or the use
- A43B3/34—Footwear characterised by the shape or the use with electrical or electronic arrangements
- A43B3/50—Footwear characterised by the shape or the use with electrical or electronic arrangements with sound or music sources
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B13/00—Soles; Sole-and-heel integral units
- A43B13/14—Soles; Sole-and-heel integral units characterised by the constructive form
- A43B13/22—Soles made slip-preventing or wear-resisting, e.g. by impregnation or spreading a wear-resisting layer
- A43B13/24—Soles made slip-preventing or wear-resisting, e.g. by impregnation or spreading a wear-resisting layer by use of insertions
- A43B13/26—Soles made slip-preventing or wear-resisting, e.g. by impregnation or spreading a wear-resisting layer by use of insertions projecting beyond the sole surface
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B23/00—Uppers; Boot legs; Stiffeners; Other single parts of footwear
- A43B23/24—Ornamental buckles; Other ornaments for shoes without fastening function
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B3/00—Footwear characterised by the shape or the use
- A43B3/0036—Footwear characterised by the shape or the use characterised by a special shape or design
- A43B3/0078—Footwear characterised by the shape or the use characterised by a special shape or design provided with logos, letters, signatures or the like decoration
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B3/00—Footwear characterised by the shape or the use
- A43B3/24—Collapsible or convertible
- A43B3/242—Collapsible or convertible characterised by the upper
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B3/00—Footwear characterised by the shape or the use
- A43B3/34—Footwear characterised by the shape or the use with electrical or electronic arrangements
- A43B3/36—Footwear characterised by the shape or the use with electrical or electronic arrangements with light sources
-
- A—HUMAN NECESSITIES
- A43—FOOTWEAR
- A43B—CHARACTERISTIC FEATURES OF FOOTWEAR; PARTS OF FOOTWEAR
- A43B5/00—Footwear for sporting purposes
- A43B5/12—Dancing shoes
Definitions
- the present invention relates to footwear and an output control method.
- Footwear is known that has a discoloration portion, measures performance parameters, and colors the discoloration portion according to the measured performance parameters (see, for example, Patent Document 1).
- In such footwear, however, the discoloration portion changes color only based on performance parameters measured in the footwear itself; the color does not change according to signals received from the outside. That is, conventional footwear does not change color adaptively based on a plurality of parameters.
- an object of the present invention is to provide footwear and an output control method capable of adaptively performing output control based on sound and movement.
- The footwear includes a sensor unit that detects movement of the footwear, a transmission unit that transmits sensor data detected by the sensor unit to an external device, a receiving unit that receives from the external device an output control signal based on sound data and the sensor data, and an output unit that performs output based on the output control signal.
- output control can be performed adaptively based on sound and movement.
- (B) is a perspective view of the sole part, showing a state where the light emitting unit and the sensor unit 106 are placed. Other figures show an example of the functions of the main control unit of the information processing apparatus according to the embodiment, and a conceptual diagram of the sound data.
- FIG. 1 is a diagram illustrating an example of a configuration of an output control system 10 according to the embodiment.
- the output control system 10 includes a footwear 100 and an information processing apparatus 200 connected via a network N.
- the information processing apparatus 200 may be any apparatus, such as a PC (Personal Computer) or a mobile terminal, as long as it can process a signal acquired via the network N.
- a server may be connected to the network N.
- In the output control system 10 shown in FIG. 1, the output of an output unit provided in the footwear 100 is controlled by an output control signal based on, for example, sensor data sensed by a sensor provided in the footwear 100 and sound data stored in the information processing apparatus 200.
- the output control system 10 performs light emission control of the LED in conjunction with the movement of the footwear 100 and music.
- By performing LED light emission control according to the movement of the foot and the music, movement, sound, and light can be shown to the audience as if the three were linked as one.
- FIG. 2 is a diagram illustrating an example of a schematic configuration of hardware of the footwear 100 in the embodiment.
- the footwear 100 shown in FIG. 2 includes at least a control unit 102, a communication unit 104, a sensor unit 106, an output unit 108, a power supply unit 110, and a storage unit 112.
- Communication unit 104 includes a transmission unit 142 and a reception unit 144.
- the upper portion and sole portion of the footwear 100 are omitted from illustration.
- the control unit 102 is, for example, a CPU (Central Processing Unit), and executes a program developed on a memory to cause the footwear 100 to realize various functions.
- the control unit 102 performs various calculations based on the sensor data sensed by the sensor unit 106 and the output control signal received by the reception unit 144. For example, when acquiring the output control signal, the control unit 102 controls the output of the output unit 108 according to the output control signal. Details of the control unit 102 will be described with reference to FIG.
- the communication unit 104 transmits and receives data via the communication network N.
- the transmission unit 142 transmits the sensor data detected by the sensor unit 106 to the information processing apparatus 200.
- the receiving unit 144 receives an output control signal based on sensor data and sound data from one information processing apparatus 200.
- the communication unit 104 may set, before data transmission and reception, which information processing apparatus 200 communicates with which footwear 100. Note that the communication does not have to be one-to-one; for example, data may be transmitted from one information processing apparatus 200 to a plurality of footwear 100.
- the communication network N is configured by a wireless network or a wired network.
- Examples of the communication network N include mobile phone networks, PHS (Personal Handy-phone System) networks, wireless LAN (Local Area Network), 3G (3rd Generation), LTE (Long Term Evolution), 4G (4th Generation), WiMAX (registered trademark), infrared communication, Bluetooth (registered trademark), wired LAN, telephone lines, power line networks, IEEE 1394, ZigBee (registered trademark), and other networks.
- the sensor unit 106 includes an acceleration sensor and an angular velocity (gyro) sensor, and may further include a geomagnetic sensor.
- the sensor unit 106 includes a 9-axis sensor in which a 3-axis acceleration sensor, a 3-axis angular velocity sensor, and a 3-axis geomagnetic sensor are integrated.
- Sensor unit 106 detects the movement of footwear 100. For example, when the footwear 100 is worn by the user, the movement of the foot is detected. Sensor data detected by the sensor unit 106 is transmitted to the external information processing apparatus 200 via the transmission unit 142.
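- The format of the sensor data handed to the transmission unit 142 is not specified in the text; a minimal sketch of how one 9-axis sample (3-axis acceleration, angular velocity, and geomagnetism) might be bundled for transmission, assuming a JSON payload and hypothetical field names:

```python
import json
import time

def make_sensor_packet(accel, gyro, mag):
    """Bundle one 9-axis sample into a timestamped packet for the
    external device. All field names are illustrative assumptions."""
    assert len(accel) == len(gyro) == len(mag) == 3
    return {
        "t": time.time(),      # sample timestamp
        "accel": list(accel),  # 3-axis acceleration
        "gyro": list(gyro),    # 3-axis angular velocity
        "mag": list(mag),      # 3-axis geomagnetism
    }

packet = make_sensor_packet((0.0, 0.0, 9.8), (0.1, 0.0, 0.0), (30.0, 0.0, -20.0))
payload = json.dumps(packet)  # what the transmission unit 142 would send
```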
- the output unit 108 performs output by control from the control unit 102 based on the output control signal.
- the output unit 108 includes, for example, a light emitting unit, and the light emission is controlled by the control unit 102 to emit light.
- the light emitting unit is, for example, an LED.
- a plurality of LEDs may be provided, and, for example, RGB 8-bit full color may be individually controlled for each LED. Further, a plurality of LEDs may be provided in a straight line along the side surface of the sole portion, or in a straight line on the collar portion.
- the output unit 108 may include a curved display such as an organic EL (Electro Luminescence) display, a speaker, and the like, and output based on the output control signal may be realized by an image or sound. Further, the output unit 108 may include a vibration element or the like and realize output based on the output control signal by vibration.
- the power supply unit 110 is a battery, for example, and supplies power to each part inside the footwear 100.
- the storage unit 112 stores, for example, programs and various data.
- the program is executed by the control unit 102.
- the various data includes, for example, image information, output function information by the output unit, calibration information of the sensor unit 106, and the like.
- the footwear 100 may have the above-described components provided on the sole portion; alternatively, only the output unit 108 may be provided on the upper portion, or the output unit 108 may be provided on both the sole portion and the upper portion.
- FIG. 3 is a diagram illustrating an example of a hardware configuration of the information processing apparatus 200 according to the embodiment.
- The information processing apparatus 200 shown in FIG. 3 includes a touch panel 14, a speaker 16, a microphone 18, a hard button 20, a hard key 22, a mobile communication antenna 30, a mobile communication unit 32, a wireless LAN communication antenna 34, a wireless LAN communication unit 36, and the like.
- the touch panel 14 has functions of both a display device and an input device, and includes a display (display screen) 14A that bears the display function and a touch sensor 14B that bears the input function.
- the display 14A is configured by a general display device such as a liquid crystal display or an organic EL display, for example.
- the touch sensor 14B is configured to include an element for detecting a contact operation disposed on the upper surface of the display 14A, and a transparent operation surface stacked thereon.
- As the contact detection method of the touch sensor 14B, any of known methods such as a capacitance type, a resistive film type (pressure sensitive type), and an electromagnetic induction type can be adopted.
- the touch panel 14 displays an image generated by executing the program 50 stored in the storage unit 38 by the main control unit 40.
- the touch panel 14 as an input device detects an operation of a contact object (including a user's finger or a touch pen; a finger will be described as a representative example) in contact with the operation surface, receives the operation input, and gives information on the contact position to the main control unit 40.
- the movement of the finger is detected as coordinate information indicating the position or region of the contact point; the coordinate information is expressed, for example, as coordinate values on two axes in the short-side and long-side directions of the touch panel 14.
- the information processing apparatus 200 is connected to the network (Internet) N through the mobile communication antenna 30 and the wireless LAN communication antenna 34, and can perform data communication with the footwear 100 and the server.
- the program 50 may be installed in the information processing apparatus 200, or the output control function may be provided online from a server. By executing the program 50, an application for controlling the output of the footwear 100 operates.
- FIG. 4 is a diagram illustrating an example of functions of the control unit 102 of the footwear 100 in the embodiment.
- the control unit 102 illustrated in FIG. 4 has at least the functions of the acquisition unit 202, the determination unit 204, the output control unit 206, the conversion unit 208, and the evaluation unit 210 by executing a predetermined program.
- the acquisition unit 202 acquires the detected sensor data from the sensor unit 106.
- the sensor data is a signal indicating the movement of the footwear 100.
- the acquired sensor data is output to the transmission unit 142 and the determination unit 204.
- the acquisition unit 202 acquires the output control signal received by the reception unit 144.
- the output control signal is a control signal corresponding to the output content of the output unit 108, and is, for example, at least one of a light emission control signal, a display control signal, a sound control signal, and a vibration control signal.
- the acquired output control signal is output to the output control unit 206.
- the determination unit 204 determines whether the footwear 100 is moving in a predetermined direction based on the sensor data. For example, since the posture and the moving direction of the footwear 100 can be known from the sensor data, the determination unit 204 can determine movement of the footwear 100 in a direction substantially perpendicular to the straight line along which the LEDs are provided.
- the predetermined direction may be appropriately determined according to the output content of the output unit 108.
- the output control unit 206 controls the output of the output unit 108 based on the output control signal. For example, when the output unit 108 is a plurality of LEDs, the output control unit 206 controls the light emission position, color, intensity, and the like.
- the conversion unit 208 converts a predetermined image into data indicating the positions and colors of the LEDs corresponding to the predetermined image, and generates a light emission control signal (output control signal).
- the conversion unit 208 outputs the light emission control signal to the output control unit 206. Note that the conversion unit 208 may be implemented as a function of the output control unit 206.
- the output control unit 206 may perform light emission control on the plurality of light emitting units based on the light emission control signal generated by the conversion unit 208 so that an afterimage of light representing a predetermined image appears in a predetermined direction. Thereby, the expression of the output by the footwear 100 can be increased.
- the evaluation unit 210 evaluates the movement of the footwear 100 based on the sensor data. For example, the evaluation unit 210 retains data in which movement as a model is sensed as time-series model data. This model data may be received from the information processing apparatus 200 or the server, or sensor data obtained by sensing a model movement may be held as data learned by machine learning or the like.
- the evaluation unit 210 compares the model data with the sensor data; if the two are similar, the evaluation result is good, and if they are not, the evaluation result is bad. For example, the evaluation unit 210 determines that the data are similar if the accumulated difference value between them is equal to or less than a predetermined value, and not similar if it is greater. Note that the evaluation result may be graded in a plurality of stages according to the magnitude of the accumulated difference value.
- the output control unit 206 may control the output of the output unit 108 based on the evaluation result of the evaluation unit 210. For example, the output control unit 206 makes the output unit 108 emit red light for a good evaluation result and green light for a bad one. This can be applied, for example, to showing the evaluation while a dancer practices foot steps.
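- The accumulated-difference comparison performed by the evaluation unit 210 can be sketched as follows; the threshold value and the two-level grading are illustrative assumptions, and the red/good, green/bad color mapping is the example from the text:

```python
def evaluate_step(model, sensed, good_threshold=5.0):
    """Compare time-series model data with sensed data. The accumulated
    absolute difference decides the evaluation; the threshold is a
    placeholder value."""
    diff = sum(abs(m - s) for m, s in zip(model, sensed))
    return ("good", diff) if diff <= good_threshold else ("bad", diff)

def evaluation_color(result):
    # Example mapping from the text: a good result lights red, a bad one green.
    return {"good": "red", "bad": "green"}[result]

grade, diff = evaluate_step([1.0, 2.0, 3.0], [1.1, 2.1, 2.9])
```

A multi-stage grading could replace the single threshold with a list of thresholds, as the text suggests.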
- FIG. 5 is a diagram illustrating an example of functions of the main control unit 40 of the information processing apparatus 200 in the embodiment.
- the main control unit 40 illustrated in FIG. 5 has at least the functions of the acquisition unit 302, the analysis unit 304, the conversion unit 306, and the learning unit 308 by executing the program 50.
- the acquisition unit 302 acquires sensor data received by the wireless LAN communication unit 36 or the like.
- the sensor data is sensor data detected by the sensor unit 106 provided in the footwear 100.
- the acquired sensor data is output to the conversion unit 306.
- the analysis unit 304 analyzes sound using a general acoustic analysis technique.
- the analysis unit 304 analyzes, for example, music hitting sound, sound pressure, pitch, chord configuration, and the like.
- the analysis result data is output to the conversion unit 306.
- the conversion unit 306 converts the sensor data and the analysis result data (also referred to as sound data) into an output control signal that controls the output unit 108 of the footwear 100.
- Based on the analysis result data, the conversion unit 306 generates an output control signal so that, for example, the light is a first color when the pitch is equal to or higher than a first pitch, a second color when the pitch is lower than the first pitch, and a third color when the sensor data indicates a predetermined movement.
- the conversion unit 306 can change the contribution rate of each of the sound data and the sensor data to the output control signal according to a prior setting. For example, to increase the influence of the acoustic analysis, the contribution rate of the sound data can be set to 80% and that of the sensor data to 20%. The contribution rates may be set in advance by the user.
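- The contribution-rate blending described above might look like the following sketch, assuming a simple linear weighting of one sound-derived value and one sensor-derived value (the 80%/20% split is the example from the text; the linear form is an assumption):

```python
def mix_control_value(sound_value, sensor_value, sound_ratio=0.8):
    """Blend a sound-derived value and a sensor-derived value into one
    control value using a user-set contribution ratio. The linear
    weighting is a hypothetical interpretation of the contribution rate."""
    if not 0.0 <= sound_ratio <= 1.0:
        raise ValueError("contribution ratio must be between 0 and 1")
    return sound_ratio * sound_value + (1.0 - sound_ratio) * sensor_value

level = mix_control_value(sound_value=100.0, sensor_value=50.0)  # 0.8*100 + 0.2*50
```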
- the conversion unit 306 can select which parameters of the sound data (e.g., pitch, hitting sound, sound pressure, chord configuration) and which parameters of the sensor data (e.g., movement type, moving direction, moving speed) are subject to light emission control.
- the conversion unit 306 enables selection of light emission parameters (for example, light emission color, light emission intensity, light emission position, etc.).
- the conversion unit 306 associates the selected parameter for light emission control with the light emission parameter.
- the conversion unit 306 can thus generate an output control signal so that the light is the first color when the sound is equal to or higher than the first pitch, the second color when the sound is lower than the first pitch, and the third color when the footwear 100 performs a predetermined movement.
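- The color selection rule above can be written out as a small function. That the predetermined movement takes priority over the pitch comparison is an assumption, since the text does not state the precedence, and the 440 Hz first pitch is a placeholder:

```python
def choose_color(pitch_hz, predetermined_movement, first_pitch=440.0):
    """Select the emission color from the analyzed pitch and the sensed
    movement. Movement-overrides-pitch precedence is an assumption."""
    if predetermined_movement:
        return "third_color"
    if pitch_hz >= first_pitch:
        return "first_color"
    return "second_color"
```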
- the learning unit 308 accumulates sensor data acquired from the footwear 100, extracts feature values from the sensor data, and performs machine learning of the extracted feature values.
- a known technique may be used for the feature amount extraction process and the machine learning process.
- the learning unit 308 acquires model data serving as a model for a dance step by machine learning of sensor data of the footwear 100.
- the model data can also be acquired by downloading from a server or the like.
- FIG. 6 is a diagram illustrating an example of the footwear 100 in the embodiment.
- FIG. 6A is a side view showing an example of the footwear 100 in the embodiment.
- FIG. 6B is a rear view showing an example of the footwear 100 in the embodiment.
- the footwear 100 includes an upper portion 100A and a sole portion 100B, and a plurality of LEDs 100C are provided on the sole portion 100B. The LEDs 100C are provided along the side surface of the sole portion 100B in the X direction. LEDs 100C may also be provided in the Z direction on the collar portion of the upper portion 100A.
- the arrangement position of the LEDs 100C is merely an example and is not restricted to the example shown in FIG. 6.
- <Light emission control (part 1)> Light emission control (part 1) when a dancer wears and dances in the footwear 100 shown in FIG. 6 will be described. When the dancer dances to music, sensor data indicating the movement of the footwear 100 is transmitted to the information processing apparatus 200. The information processing apparatus 200 generates a light emission control signal based on the result of acoustic analysis of the music sound source and the acquired sensor data.
- the information processing apparatus 200 generates a basic control signal from the acoustic analysis result, and additionally inserts a light emission control signal when it determines that the sensor data indicates a movement that triggers light emission control. Thereby, light emission control can be performed adaptively based on sound and movement.
- the LEDs 100C of the footwear 100 emit light in accordance with the music, the emission color can be changed depending on the pitch, and the LEDs 100C can be made to emit a predetermined color by a tap operation. Therefore, control is performed so that sound, movement, and light are linked as one.
- FIG. 7 is a diagram for explaining the predetermined image.
- FIG. 7A is a diagram illustrating an example of a predetermined image. As shown in FIG. 7A, the predetermined image is “H”.
- FIG. 7B is a diagram illustrating an example of dividing a predetermined image.
- the predetermined image “H” is divided so that it appears as an afterimage of light.
- In this example, it is divided into five parts (400A to 400E) in the vertical direction (the Z direction shown in FIG. 6).
- the divided images 400A to 400E of the predetermined image are caused to emit light sequentially.
- To make the predetermined image stand out, the contribution rate of the sound data may be reduced (for example, to 0% to 10%). Thereby, the contribution rate can be changed adaptively according to the motion detected from the sensor data.
- control for causing the predetermined image to appear may be performed on the footwear 100 side or on the information processing apparatus 200 side.
- an example performed on the footwear 100 side will be described.
- FIG. 8 is a conceptual diagram for explaining that a predetermined image appears.
- FIG. 8 illustrates that the predetermined image “H” appears due to an afterimage of light when jumping upward in the Z direction.
- the determination unit 204 detects an upward jump based on the sensor data. For example, if the sensor data indicates that the movement distance in the upward direction is equal to or greater than a threshold value within a predetermined time while the footwear maintains a roughly horizontal posture, the determination unit 204 determines that an upward jump has occurred.
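- The jump determination might be sketched as below. The displacement threshold, tilt limit, time window, and the (time, displacement, tilt) sample format are all hypothetical; the text only requires an upward movement distance above a threshold within a predetermined time while roughly horizontal:

```python
def is_upward_jump(samples, min_rise=0.15, max_tilt_deg=20.0, window_s=0.5):
    """samples: list of (t, upward_displacement_m, tilt_deg) derived from
    the sensor data. True when the upward displacement within the window
    exceeds the threshold while the footwear stays roughly horizontal."""
    if not samples:
        return False
    t0 = samples[0][0]
    rise = 0.0
    for t, z, tilt in samples:
        if t - t0 > window_s:
            break
        if tilt > max_tilt_deg:   # posture no longer roughly horizontal
            return False
        rise = max(rise, z)
    return rise >= min_rise
```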
- At time t1, the conversion unit 208 generates a light emission control signal so that the LEDs at the positions corresponding to the divided image 400A emit light in the colors of the image, and outputs the light emission control signal to the output control unit 206.
- When the output control unit 206 receives a light emission control signal from the conversion unit 208, it controls light emission with priority over the output control signal acquired by the acquisition unit 202.
- At time t2, the conversion unit 208 generates a light emission control signal so that the LEDs at the positions corresponding to the divided image 400B emit light in the colors of the image, and outputs it to the output control unit 206, which performs light emission control so that the divided image 400B appears.
- Similarly, at times t3, t4, and t5, the conversion unit 208 generates light emission control signals for the divided images 400C, 400D, and 400E, respectively, and outputs them to the output control unit 206, which performs light emission control so that each divided image appears in turn.
- the predetermined image is not limited to characters, and may be a logo or a picture.
- each interval from time t1 to time t5 may be determined in advance, or may be determined according to the moving speed, which is known from the sensor data.
- the size of the divided image may be determined by the arrangement of the LEDs in the sole portion. For example, in the case where LEDs are provided by being stacked in the Z direction, the length of the divided image in the Z direction may be increased.
- a technique called POV (Persistence of Vision) may also be used.
- POV is a technique for displaying an image or video by blinking LEDs at high speed in accordance with the movement of a device. For example, when the user wearing the footwear 100 repeats jumps, the predetermined image can be controlled to appear along the vertical movement path of the LEDs of the footwear 100.
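- Spacing the slice times evenly between t1 and t5 is one possible way to schedule the divided images; this sketch assumes a uniform interval, which the text leaves open (it may also follow the moving speed):

```python
def pov_schedule(num_slices, t_start, t_end):
    """Evenly schedule the divided images (e.g. 400A..400E) between the
    first and last flash times so that flashing each slice in turn leaves
    the whole image as an afterimage."""
    if num_slices < 2:
        raise ValueError("need at least two slices")
    step = (t_end - t_start) / (num_slices - 1)
    return [t_start + i * step for i in range(num_slices)]

times = pov_schedule(5, 0.0, 4.0)  # five slices from t1=0.0 to t5=4.0
```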
- evaluation results based on differences between model data representing example dance steps and the sensor data may also be expressed by differences in LED color, light emission position, or the like.
- FIG. 9 is a flowchart illustrating an example of light emission control processing (part 1) in the embodiment.
- In step S102, the communication unit 104 initializes the communication settings. Initialization includes setting which information processing apparatus 200 the communication unit 104 communicates with.
- In step S104, the control unit 102 controls the output unit 108 to output (emit light), and the user confirms that the output unit 108 outputs (emits light).
- In step S106, the sensor unit 106 determines whether or not the sensor data has been updated. If the sensor data has been updated (step S106—YES), the process proceeds to step S108; if not (step S106—NO), the process proceeds to step S112.
- In step S108, the acquisition unit 202 of the control unit 102 acquires the sensor data from the sensor unit 106.
- In step S110, the transmission unit 142 transmits the sensor data to the information processing apparatus 200.
- In step S112, the reception unit 144 determines whether an output control signal has been received from the information processing apparatus 200. If an output control signal has been received (step S112—YES), the process proceeds to step S114; if not (step S112—NO), the process proceeds to step S116.
- In step S114, the output control unit 206 controls the light emission of the output unit 108 according to the output control signal.
- This output control signal has been generated based on the sound data and the sensor data.
- In step S116, the control unit 102 determines whether or not reception of the output control signal has ended. If reception has ended (step S116—YES), the process ends; if not (step S116—NO), the process returns to step S106.
- The end of reception is determined, for example, when no output control signal is received for a certain period of time or when reception is turned off by a switch or the like.
- the footwear 100 can adaptively perform output control based on sound and movement.
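- The footwear-side loop of FIG. 9 (steps S106 to S116) can be sketched as follows, with stand-in objects for the sensor, the communication units, and the LEDs; the polling interfaces are hypothetical:

```python
def run_light_control(sensor, transmitter, receiver, output, idle_limit=3):
    """Sketch of the loop in FIG. 9. sensor.poll() / receiver.poll()
    return new data or None; the loop ends after idle_limit polls with
    no received signal (the 'reception completed' condition)."""
    idle = 0
    while idle < idle_limit:
        data = sensor.poll()        # S106/S108: sensor data updated?
        if data is not None:
            transmitter.send(data)  # S110: send to the external apparatus
        signal = receiver.poll()    # S112: output control signal received?
        if signal is not None:
            output.apply(signal)    # S114: control the light emission
            idle = 0
        else:
            idle += 1               # S116: count toward "reception ended"

# Minimal stand-ins for the sensor, communication units, and LEDs.
class Queue:
    def __init__(self, items): self.items = list(items)
    def poll(self): return self.items.pop(0) if self.items else None

class Recorder:
    def __init__(self): self.log = []
    def send(self, x): self.log.append(x)
    apply = send

sensor, receiver = Queue([{"accel": [0, 0, 9.8]}]), Queue(["emit_red"])
tx, led = Recorder(), Recorder()
run_light_control(sensor, tx, receiver, led)
```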
- FIG. 10 is a flowchart illustrating an example of the light emission control process (part 2) in the embodiment. Steps S202 to S204 shown in FIG. 10 are the same as steps S102 to S104 shown in FIG. 9.
- In step S206, the receiving unit 144 determines whether image data has been received. If image data has been received (step S206—YES), the process proceeds to step S208; if not (step S206—NO), the process returns to step S206. In this processing example, the footwear 100 acquires the image data first.
- In step S208, the storage unit 112 stores the received image data.
- In step S210, the sensor unit 106 determines whether or not the sensor data has been updated. If the sensor data has been updated (step S210—YES), the process proceeds to step S212; if not (step S210—NO), the process returns to step S210.
- In step S212, the acquisition unit 202 of the control unit 102 acquires the sensor data from the sensor unit 106.
- In step S214, the control unit 102 analyzes the sensor data and updates the posture information and the movement information.
- In step S216, the determination unit 204 determines whether or not the footwear 100 has moved a predetermined distance or more in a predetermined direction. If the condition is satisfied (step S216—YES), the process proceeds to step S218; if not (step S216—NO), the process proceeds to step S222.
- In step S218, the conversion unit 208 converts the image data into display data in a form corresponding to the movement direction and the posture information, and generates an output control signal.
- In step S220, the output control unit 206 performs light emission control based on the output control signal generated by the conversion unit 208.
- The output control unit 206 performs light emission control until the predetermined image appears in space (from t1 to t5 in FIG. 8).
- In step S222, the control unit 102 determines whether the sensing by the sensor unit 106 has ended. The end of sensing is determined when the sensor signal is not updated for a certain time or when sensing is turned off by a switch or the like.
- In this way, the predetermined image can be displayed in space using an afterimage of light.
- This process can be realized by light emission control based on sensor data alone, but it can also be realized by detecting a predetermined movement during the light emission control (part 1) shown in FIG. 9.
- the model data is, for example, data sensed in a dance step.
- FIG. 11 is a flowchart showing an example of uploading process of model data in the embodiment.
- In step S302, the main control unit 40 of the information processing apparatus 200 determines whether or not the step learning button has been pressed. If the step learning button has been pressed (step S302—YES), the process proceeds to step S304; if not (step S302—NO), the process returns to step S302.
- the learning button is a UI (User Interface) button displayed on the screen.
- step S304 the main control unit 40 turns on the learning mode trigger.
- step S306 the main control unit 40 acquires sensor data received from the footwear 100 and accumulates it in the storage unit 38 as motion data.
- step S308 the main control unit 40 determines whether or not a learning end button has been pressed. If the learning end button is pressed (step S308—YES), the process proceeds to step S310. If the learning end button is not pressed (step S308—NO), the process returns to step S306.
- the learning end button is a UI button displayed on the screen.
- step S310 the main control unit 40 turns off the learning mode trigger.
- step S312 the main control unit 40 analyzes the feature amount of the accumulated motion data.
- a known technique may be used for the feature amount analysis.
- step S314 the main control unit 40 determines whether or not the upload button has been pressed. If the upload button is pressed (step S314-YES), the process proceeds to step S316. If the upload button is not pressed (step S314-NO), the process returns to step S314.
- the upload button is a UI button displayed on the screen.
- step S316 the main control unit 40 performs control so that motion data or data such as a feature amount is transmitted to the server.
- the model data to be compared is uploaded to the server.
- the server stores a plurality of model data and enables the model data to be downloaded to the information processing apparatus 200 and the footwear 100.
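The learning-mode flow of steps S302 to S316 can be outlined with the following hypothetical recorder. The class, its method names, and the mean/variance feature amounts are assumptions for illustration only; the patent leaves the feature-amount analysis to known techniques.

```python
class StepRecorder:
    """Hypothetical model-data recorder mirroring steps S302-S316."""

    def __init__(self):
        self.learning = False    # learning mode trigger (S304 / S310)
        self.motion_data = []    # accumulated sensor data (S306)

    def start(self):             # "step learning" button pressed (S302)
        self.learning = True
        self.motion_data = []

    def record(self, sample):    # called for each received sensor sample
        if self.learning:
            self.motion_data.append(sample)

    def stop(self):              # "learning end" button pressed (S308)
        self.learning = False

    def features(self):          # S312: feature analysis (assumed: mean, variance)
        n = len(self.motion_data)
        mean = sum(self.motion_data) / n
        var = sum((x - mean) ** 2 for x in self.motion_data) / n
        return {"mean": mean, "var": var, "length": n}

rec = StepRecorder()
rec.start()
for s in [0.0, 1.0, 2.0, 1.0]:
    rec.record(s)
rec.stop()
model = rec.features()           # S316 would upload this to the server
```

The upload itself (step S316) is omitted; only the accumulation and summarization are sketched.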
- FIG. 12 is a flowchart illustrating an example of the light emission control process (part 3) in the embodiment.
- the case where the information processing apparatus 200 evaluates a step will be described as an example.
- step S402 shown in FIG. 12 the user who wants to practice the step operates the information processing apparatus 200 to access the server, selects the step he/she wants to learn, and downloads the motion data (or feature amounts) serving as a model to the information processing apparatus 200.
- the downloaded data is referred to as learning data.
- step S404 the user puts on the footwear 100 and performs the step selected in step S402.
- step S406 the sensor unit 106 of the footwear 100 transmits sensor data indicating the movement of the step to the information processing apparatus 200.
- the information processing apparatus 200 accumulates the received sensor data in the storage unit 38 as motion data. Data acquired during practice is referred to as user data.
- step S408 the main control unit 40 detects a difference between the learning data and the user data.
- step S410 the main control unit 40 determines whether or not a difference value that is a difference is within a threshold value. If the difference value is within the threshold value (step S410—YES), the process proceeds to step S412. If the difference value is greater than the threshold value (step S410—NO), the process proceeds to step S414.
- step S412 the main control unit 40 outputs an output control signal indicating success to the footwear 100.
- the footwear 100 can perform an output indicating success.
- the output control unit 206 causes the LED to emit light in the first color, displays a circle on the display, or causes the vibrator to vibrate in a predetermined manner.
- step S414 the main control unit 40 outputs an output control signal indicating failure to the footwear 100.
- the footwear 100 can perform an output indicating failure.
- the output control unit 206 causes the LED to emit light in a second color, displays a cross on the display, or causes the vibrator to vibrate in a different predetermined manner.
- the information processing apparatus 200 can compare and display the learning data and the user data. Thereby, the user can grasp which movement is good and which movement is bad, and can practice the step effectively.
- the above-described evaluation process can be executed by the control unit 102 of the footwear 100 that has downloaded the learning data. Thereby, once learning data is downloaded to the footwear 100, the step can be practiced even offline.
- the user can practice the predetermined movement while wearing the footwear 100 and can know an appropriate evaluation result of the practiced movement.
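The evaluation of steps S408 to S414 can be sketched as below. The mean-absolute-difference metric and the sample sequences are assumptions for illustration; the patent only specifies that a difference value is compared against a threshold.

```python
def evaluate_step(learning_data, user_data, threshold):
    """Hypothetical version of steps S408-S414: compare downloaded learning
    data with practiced user data and decide success or failure.
    The difference metric (mean absolute difference) is an assumption."""
    diff = sum(abs(a - b) for a, b in zip(learning_data, user_data)) / len(learning_data)
    return "success" if diff <= threshold else "failure"

# Invented example sequences (e.g. momentum samples over one step).
model = [0.0, 1.0, 0.5, 0.2]
good_try = [0.1, 0.9, 0.5, 0.3]
bad_try = [1.0, 0.0, 1.5, 1.2]
```

On "success" the output control signal would drive the first-color LED output described above; on "failure", the failure output.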
- Each processing step included in the processing flow described with reference to FIGS. 9 to 12 can be executed in any order or in parallel as long as the processing contents do not contradict each other. Other steps may be added in between. Further, a step described as one step for convenience can be executed by being divided into a plurality of steps, while a step described as being divided into a plurality of steps for convenience can be grasped as one step.
- the main control unit 40 of the information processing apparatus 200 generates or selects image data based on a series of motion data and sound data of the user's footwear 100, and updates the content displayed on the LEDs provided as the output unit 108 of the footwear 100 in real time.
- the LED functions as a display having a certain width in the vertical and horizontal directions. For example, when the motion data indicates a predetermined motion, a first image of a size displayable on the display is shown; when the sound data indicates a predetermined sound, a second image of a size displayable on the display is shown.
- the output unit 108 may be a display of an external computer; an image may be displayed on the display, sound may be reproduced using an external speaker, or tactile output may be performed using a vibration module.
- a device such as a piezoelectric element may be provided in the insole of the footwear 100.
- the footwear 100 can detect the treading and can control the output of the output unit 108 according to the treading.
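The treading detection mentioned here can be sketched as a simple rising-edge detector on the insole sensor reading. The function name, the threshold value, and the scalar pressure model are assumptions for illustration.

```python
def detect_tread(pressure_samples, threshold):
    """Hypothetical treading detector: a piezoelectric/load reading crossing
    an assumed threshold counts as one tread, which would then trigger the
    output control of the output unit 108."""
    treads = 0
    above = False
    for p in pressure_samples:
        if p >= threshold and not above:
            treads += 1          # rising edge = one tread
            above = True
        elif p < threshold:
            above = False
    return treads
```

Each detected tread could, for example, trigger one drum-sound playback as described below.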
- the sensor unit 106 may be a 10-axis sensor in which an altimeter is added to a 9-axis sensor.
- the sensor unit 106 may include a load sensor. Thereby, the output control of the output part 108 according to an altitude or a load can be performed.
- the vibration element may be provided in the insole of the footwear 100 or the upper part of the shoe.
- a predetermined message can be transmitted to the user by vibration.
- the output control system 10 can also control a plurality of devices simultaneously. For example, simultaneous control of a plurality of footwear 100 is possible by using wireless communication. Thereby, the light emission pattern (output control signal) can be transmitted from one information processing apparatus 200, and the light emission colors of all the footwear 100 in the venue can be synchronized.
- the acoustic analysis may be performed not only by the information processing apparatus 200 but also by the control unit 102 of the footwear 100. Accordingly, the footwear 100 can automatically generate a light emission pattern (output control signal) in accordance with surrounding music.
- the output control system 10 can generate music.
- the motion data of the footwear 100 can be analyzed by the information processing apparatus 200 or the internal control unit 102, and sound or music matching the moving direction, moving speed, etc. can be generated in real time.
- the output unit 108 can reproduce specific sound sample data.
- the output control system 10 can control a drum sound to be reproduced in response to a stepping motion.
- the output control system 10 can share, through an external device (information processing device 200), performance data in which any sound is associated with any movement of the footwear 100 on a server on the Internet. Thereby, another user can download that user's data and perform with his/her own footwear 100.
- the output control system 10 can share LED animations, images drawn by afterimage, and video data on a server on the Internet through an external device (information processing device 200). As a result, other users can download that user's data and display it on their own footwear 100.
- the output control system 10 can also analyze the movement detected by the footwear 100. Further, by using a 9-axis sensor or the like as the sensor unit 106, the posture, moving speed, and moving distance of the footwear 100 can be properly sensed, and the analysis results of these motions can be displayed on a display in real time.
- the footwear 100 of the output control system 10 can be used as a controller.
- by registering foot gestures of the user wearing the footwear 100 in the footwear 100 or the like in advance, the footwear can be used as a wireless controller for another computer. Specifically, it is conceivable to operate room lighting by rotating the right foot.
- the output control system 10 can estimate the physical characteristics of the user by analyzing the sensor data detected by the sensor unit 106 of the footwear 100. As a result, it is possible to implement an application that gives advice such as exercise based on the physical characteristics of the user or a method for improving the form.
- the output control system 10 may include a GPS (Global Positioning System) module in the footwear 100.
- the current location is detected, and when the user enters a specific place, the footwear emits light to indicate this; in combination with a geomagnetic sensor, the current direction can be detected and a route can be guided by light emission or vibration.
- the footwear 100 can transmit a musical rhythm to a user by providing a vibration element inside and vibrating the vibration element at a constant rhythm. Or the footwear 100 can transmit a specific message like a Morse signal by vibration of a vibration element.
- it can be used for video output and effects such as moving the CG of the shoe displayed on the display according to the sensor data detected by the sensor unit 106 provided in the footwear 100.
- the output control system 10 can be used as an effector for the music being played.
- a specific amount of exercise can be used as an effect amount, so that the amount of exercise over a certain time is synchronized with the volume.
- for example, a dancer wearing the footwear 100 may rotate a foot, and control may be performed so that the music volume increases as the rotation speed increases.
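The effector mapping just described can be sketched as a simple scaling function. The linear mapping and the assumed maximum rotation rate are illustrative choices, not specified by the patent.

```python
def volume_from_motion(rotations_per_sec, max_rps=5.0):
    """Hypothetical effector mapping (assumed linear): foot rotation rate
    scaled into a 0.0-1.0 music volume, clipped at an assumed maximum."""
    return min(rotations_per_sec / max_rps, 1.0)
```

A real effector would feed this value into the playback chain each time new sensor data arrives.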
- the present invention can be applied to a wearable device (for example, a wristwatch or glasses) that is worn at a position where a user's movement is desired to be detected.
- the sensor unit 106 may be mounted not at the inside of the footwear 100 or the wearable device but at a position where motion is desired to be detected as an external sensor.
- the program of the present invention can be installed or loaded on a computer through various recording media such as an optical disk (e.g., CD-ROM), a magnetic disk, or a semiconductor memory, or via a communication network.
- unit does not simply mean a physical configuration, but also includes a case where the functions of the configuration are realized by software.
- functions of one configuration may be realized by two or more physical configurations, or functions of two or more configurations may be realized by one physical configuration.
- the “system” includes a system configured to provide a specific function to the user, which is configured by an information processing apparatus or the like. For example, it is configured by a server device, a cloud computing type, an ASP (Application Service Provider), a client server model, and the like, but is not limited thereto.
- FIG. 13A is an external view showing the configuration of the footwear 100.
- the footwear 100 includes an upper portion 1301 on its upper surface side that covers and fixes the instep of the wearing user, and a sole portion 1302 on its bottom surface side that has a function of absorbing impact.
- the upper part 1301 is provided with a tongue part 1303 for protecting the instep of the user.
- the tongue unit 1303 is provided with a module 1304 including the control unit 102, the communication unit 104, and the power supply unit 110.
- by opening the tongue unit 1303, the module 1304 inserted in a pocket provided in the tongue unit 1303 can be exposed.
- the module 1304 has a terminal (for example, a USB terminal) for receiving power supply.
- by connecting this terminal to a power source, power can be supplied and stored in the power supply unit 110.
- the communication unit 104 may suppress power consumption due to communication, for example, by performing communication according to the Bluetooth Low Energy standard.
- the sole part 1302 includes an output part 108 and a sensor part 106.
- the sensor unit 106 is provided inside the sole unit 1302 at a position corresponding to the arch of the user's foot.
- the sensor unit 106 is connected to the module 1304 through the inside of the footwear 100, operates with power supplied from the power supply unit 110 inside the module 1304, and transmits sensor data to the module 1304. Thereby, the sensor data sensed by the sensor unit 106 is transmitted to the external information processing apparatus 200 by the communication unit 104.
- FIG. 14A is a plan view of the sole portion 1302, and FIG. 14B is a cross-sectional view of the sole portion 1302 of FIG. 14A taken along the line AA ′.
- the sole part 1302 includes a groove part 1401 for placing the output part 108 thereon.
- the groove portion 1401 is provided inside the sole portion 1302 and on the outer peripheral portion of the sole portion 1302 along the outer edge thereof.
- the groove part 1401 is recessed for placing the output part 108, and an LED tape is provided in the groove part 1401 as the output part 108.
- the sensor unit 106 is provided at a position where the groove part 1401 is not provided and at a position facing the user's arch inside the sole part 1302.
- the location is a position referred to as a so-called shank portion in the structure of the footwear 100.
- shock absorbing ribs 1402 to 1405 are provided at positions where the groove part 1401 and the sensor part 106 are not provided.
- the ribs 1402 and 1403 are provided on the toe side of the user of the sole portion 1302 and on the outer peripheral side of the groove portion 1401. Accordingly, it is possible to absorb an impact applied to the front end portion of the footwear 100 with respect to the footwear 100, reduce the possibility that the output portion 108 provided in the groove portion 1401 breaks down, and reduce the burden on the user's foot.
- the ribs 1404 and 1405, located in the center of the footwear 100, can likewise absorb impacts on the footwear, reduce the possibility that the output unit 108 provided in the groove 1401 breaks down, and reduce the burden on the user's foot.
- FIG. 14C is a cross-sectional view of the sole portion 1302 and shows a state where an LED tape as the output portion 108 is placed.
- the output unit 108 is placed with the light emitting surface facing the bottom surface side of the footwear 100. That is, the bottom surface of the footwear 100 emits light.
- the inventors have found that when the LED tape is installed along the side surface of the sole portion 1302 so that the side surface emits light, the bending of the LED tape, particularly at the toe portion, increases and its breakage rate rises.
- when the sole portion 1302 is made of a transparent or translucent resin having high shock absorption, footwear 100 that transmits the light emitted from the LED tape and emits it from the bottom surface can be provided.
- FIG. 15 is a perspective view of the sole portion 1302 provided to make the structure of the sole portion 1302 easier to understand.
- FIG. 15A is a perspective view showing a state in which the sensor unit 106 and the output unit 108 are not placed on the sole part 1302, and FIG. 15B is a perspective view showing a state in which the output unit 108 and the sensor unit 106 are placed on the sole part 1302.
- the output portion 108, which is an LED tape, is placed in the groove portion 1401 and provided on the outer peripheral portion of the bottom surface of the sole portion 1302.
- the sensor unit 106 is provided in a recess 1501 provided in the sole unit 1302.
- the recess 1501 is configured so as to substantially match the outer diameter of the sensor unit 106, thereby preventing rattling as much as possible when the sensor unit 106 is placed in the recess 1501.
- thereby, the sensor unit 106 can purely detect the movement of the footwear 100.
- if the sensor unit 106 were provided in the module 1304 of the tongue unit 1303 of the footwear 100, sensing accuracy might be insufficient. Therefore, the sensor unit 106 is provided on the sole unit 1302 so that more stable sensing can be performed.
- FIG. 16 is a diagram illustrating an example of functions of the main control unit 40 of the information processing apparatus 200 according to the third embodiment.
- the configuration itself of the information processing apparatus 200 is as shown in FIG. 3 of the first embodiment.
- the main control unit 40 illustrated in FIG. 16 has at least the functions of the acquisition unit 302, the operation analysis unit 1601, the voice generation unit 1602, and the voice output unit 1603 by executing a predetermined program.
- the acquisition unit 302 acquires the audio file table 1700 and the output audio table 1710 stored in the storage unit 38 and transmits them to the audio generation unit 1602.
- the audio file table 1700 and the output audio table 1710 will be described.
- the acquisition unit 302 acquires the audio file and the actual sound source data stored in the storage unit 38. Further, the acquisition unit 302 acquires user setting information related to audio output control from the storage unit 38.
- the user setting information related to the sound output control is information indicating a setting related to a sound control method to be output according to the operation of the footwear 100, and is set in advance in the information processing apparatus 200 by using the touch panel 14 from the user. Has been.
- the setting is stored in the storage unit 38.
- FIG. 17A is a data conceptual diagram showing a data configuration example of the audio file table 1700 stored in the storage unit 38.
- the audio file table 1700 is information in which gesture data 1701 and an audio file 1702 are associated with each other.
- the gesture data 1701 is information indicating a movement pattern that defines the movement of the footwear 100, and is information indicating a change in momentum and acceleration with time. More specifically, it is information indicating changes in the momentum and acceleration over time in the X-axis direction, the Y-axis direction, and the Z-axis direction.
- the audio file 1702 is information, associated with the gesture data 1701, that specifies the audio file to be output when the sensor data pattern analyzed by the motion analysis unit 1601 matches that gesture data.
- the voice output table 1710 is information in which exercise data 1711 and voice parameters 1712 are associated with each other.
- the exercise data 1711 is information indicating momentum and acceleration; it does not define a specific movement pattern, but indicates momentum and acceleration in the X-axis, Y-axis, and Z-axis directions.
- the audio parameter 1712 is information associated with the exercise data 1711 that indicates the audio to be output when the information indicated by the exercise data 1711 is obtained from the sensor data; it is parameter information that defines a change to be applied to the audio being played (for example, changing the pitch or the playback speed).
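The two tables can be pictured with the following hypothetical in-memory structures. The gesture names, file names, momentum ranges, and parameter values are invented for illustration; the patent only specifies the associations (gesture data 1701 → audio file 1702, exercise data 1711 → audio parameter 1712).

```python
# Hypothetical in-memory versions of the two tables.
audio_file_table_1700 = {
    # gesture data 1701 (a movement-pattern id) -> audio file 1702
    "kick":  "kick.wav",
    "slide": "slide.wav",
}

output_audio_table_1710 = [
    # exercise data 1711 (a momentum range) -> audio parameter 1712
    {"min_momentum": 0.0, "max_momentum": 1.0, "param": {"pitch_shift": 0}},
    {"min_momentum": 1.0, "max_momentum": 5.0, "param": {"pitch_shift": 2}},
]

def lookup_param(momentum):
    """Pick the audio parameter 1712 whose momentum range matches."""
    for row in output_audio_table_1710:
        if row["min_momentum"] <= momentum < row["max_momentum"]:
            return row["param"]
    return None
```

In the actual device the patterns would be time series rather than scalar ranges; the lookup idea is the same.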
- the motion analysis unit 1601 analyzes the motion of the footwear 100 based on the sensor data acquired by the acquisition unit 302. Based on the sensor data, the motion analysis unit 1601 analyzes the movement information of the footwear 100 indicated by the sensor data. Specifically, the change over time of the momentum and acceleration of the footwear 100 is specified based on the sensor data. Then, the motion analysis unit 1601 transmits the analyzed motion information to the sound generation unit 1602.
- the sound generation unit 1602 generates the output sound with reference to the audio file table 1700 and the output audio table 1710 transmitted from the acquisition unit 302, in accordance with the motion information transmitted from the motion analysis unit 1601 and the user setting information related to sound output control acquired by the acquisition unit 302.
- the sound generation unit 1602 transmits the generated sound to the sound output unit 1603. Details of the voice generation method will be described later.
- the audio output unit 1603 causes the audio transmitted from the audio generation unit 1602 to be output from the speaker 16 of the information processing apparatus 200.
- the above is the description of the main control unit 40 according to the third embodiment.
- FIG. 18 is a flowchart showing the operation of the information processing apparatus 200 according to the third embodiment.
- the touch panel 14 of the information processing device 200 receives user setting information related to audio output control from the user.
- the main control unit 40 records the user setting information in the storage unit 38.
- step S1802 the acquisition unit 302 acquires sensor data from the sensor unit 106 of the footwear 100.
- the sensor data is sensing data having a predetermined time length (for example, 1 second).
- step S1803 the acquisition unit 302 acquires user setting information related to the audio output control set in step S1801 from the storage unit 38, and the main control unit 40 determines the audio output control method.
- step S1803 (1) When the user setting information indicates momentum analysis (step S1803 (1)), the process proceeds to step S1804.
- when the user setting information indicates gesture analysis (step S1803 (2)), the process proceeds to step S1807. If it indicates that both the momentum analysis and the gesture analysis are executed (step S1803 (3)), the process proceeds to step S1811.
- step S1804 the motion analysis unit 1601 calculates the amount of exercise from the sensor data.
- the motion analysis unit 1601 transmits the calculated amount of exercise to the voice generation unit 1602.
- step S1805 the acquisition unit 302 reads the audio output table 1710 from the storage unit 38.
- the voice generation unit 1602 identifies the momentum data 1711 having the highest correlation with the transmitted momentum, and identifies the corresponding voice parameter 1712. Then, the voice generation unit 1602 generates the voice to be output based on the specified voice parameter 1712 (the voice specified by the voice parameter 1712, or a voice obtained by applying the change indicated by the voice parameter 1712 to the voice played so far).
- the sound generation unit 1602 transmits the generated sound to the sound output unit 1603.
- step S1806 the audio output unit 1603 causes the speaker 16 to output the audio transmitted from the audio generation unit 1602, and proceeds to the process of step S1817.
- step S1807 the motion analysis unit 1601 analyzes the gesture from the sensor data.
- step S1808 the acquisition unit 302 reads the audio file table 1700 from the storage unit 38.
- the motion analysis unit 1601 calculates a correlation value between the temporal change of momentum and acceleration indicated by the sensor data and the temporal change of momentum and acceleration indicated by each item of gesture data 1701 in the audio file table 1700, and specifies the gesture pattern yielding the highest correlation value.
- the motion analysis unit 1601 transmits the identified gesture pattern to the voice generation unit 1602.
- step S1809 the voice generation unit 1602 specifies the voice file corresponding to the transmitted gesture pattern using the audio file table 1700. Then, the specified audio file is transmitted to the audio output unit 1603.
- step S1810 the audio output unit 1603 outputs the transmitted audio file from the speaker 16, and proceeds to the process of step S1817.
- step S1811 the motion analysis unit 1601 first analyzes the gesture from the sensor data.
- step S1812 the acquisition unit 302 reads the audio file table 1700 from the storage unit 38.
- the motion analysis unit 1601 calculates a correlation value between the temporal change of momentum and acceleration indicated by the sensor data and the temporal change of momentum and acceleration indicated by each item of gesture data 1701 in the audio file table 1700, and specifies the gesture pattern yielding the highest correlation value.
- the motion analysis unit 1601 transmits the identified gesture pattern to the voice generation unit 1602.
- step S1813 the voice generation unit 1602 specifies the voice file corresponding to the transmitted gesture pattern using the audio file table 1700.
- step S1814 the motion analysis unit 1601 calculates the amount of exercise from the sensor data.
- the motion analysis unit 1601 transmits the calculated amount of exercise to the voice generation unit 1602.
- step S1815 the acquisition unit 302 reads the audio output table 1710 from the storage unit 38.
- the voice generation unit 1602 identifies the momentum data 1711 having the highest correlation with the transmitted momentum, and identifies the corresponding voice parameter 1712.
- step S1816 the sound generation unit 1602 generates sound based on the specified sound file and the specified sound parameter.
- specifically, the voice generation unit 1602 applies the change indicated by the specified voice parameter to the specified voice file to generate synthesized voice.
- the voice generation unit 1602 transmits the generated synthesized voice to the voice output unit 1603.
- the audio output unit 1603 outputs the transmitted synthesized voice from the speaker 16 and proceeds to step S1817.
- step S1817 the main control unit 40 determines whether an input for ending the audio output control has been accepted from the user via the touch panel 14. If accepted (step S1817—YES), the process ends. If not accepted (step S1817—NO), the process returns to step S1802.
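The three-way control of FIG. 18 (momentum analysis, gesture analysis, or both) can be sketched as the dispatcher below. Everything here is a toy stand-in: the similarity function, the table shapes, and the parameter rule are assumptions, not the patent's actual correlation analysis or tables 1700/1710.

```python
# Hypothetical sketch of the audio output control of FIG. 18 (steps S1803-S1816).

def correlate(a, b):
    """Assumed similarity measure: negative mean absolute difference
    (higher = more similar); the patent leaves the correlation method open."""
    return -sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def best_gesture(sensor, gesture_table):
    """Steps S1807-S1809: pick the gesture pattern with highest correlation."""
    return max(gesture_table, key=lambda g: correlate(sensor, gesture_table[g]))

def output_audio(mode, sensor, gesture_table, momentum_to_param):
    # mode mirrors the user setting read in step S1803
    if mode == "momentum":                       # S1804-S1806
        return {"param": momentum_to_param(sum(abs(x) for x in sensor))}
    if mode == "gesture":                        # S1807-S1810
        return {"file": best_gesture(sensor, gesture_table)}
    if mode == "both":                           # S1811-S1816
        return {"file": best_gesture(sensor, gesture_table),
                "param": momentum_to_param(sum(abs(x) for x in sensor))}

# Invented example tables and parameter rule.
gestures = {"kick.wav": [1.0, 0.0, 0.0], "slide.wav": [0.3, 0.3, 0.3]}
param = lambda m: {"pitch_shift": 1 if m > 0.5 else 0}
```

In the "both" branch the specified file and parameter would be combined into synthesized voice (step S1816); here they are simply returned together.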
- the above is the description of the audio output control according to the third embodiment.
- FIG. 19 is an interface screen for performing light emission control of the footwear 100 by a user designation by the information processing apparatus 200 according to the fourth embodiment.
- the interface screen 1901 includes an outer shape 1902L of the footwear 100 for the left foot, an outer shape 1902R of the footwear 100 for the right foot, an LED lighting area 1904L in the footwear 100 for the left foot, and a color palette 1903 for selecting the LED lighting color. Buttons indicating the colors to be emitted are arranged in the color palette 1903, and touching a button causes light emission in the corresponding color.
- “RAINBOW” means lighting in rainbow colors
- “MULTI” means lighting in multiple colors
- “OTHERS” is a button for selecting other colors.
- with the time bar 1905, the user designates, in time series, the times at which the light emission control should change and the light emission pattern (light emission location, color, and method) at each time, and this designation is stored in the storage unit 38.
- the footwear 100 can be made to emit light with a designated light emission pattern by touching the light emission button 1906. Using such an interface, the user can designate any light emission, and the convenience of the footwear 100 can be improved.
- the interface can be realized by the main control unit 40 executing a GUI program that can execute the above processing.
- in the third embodiment, the gesture pattern having the highest correlation value is specified; however, if the correlation value does not exceed a predetermined threshold, the voice generation unit 1602 of the main control unit 40 may determine that no gesture pattern corresponding to the detected motion information is registered. In that case, no audio may be output.
- in the third embodiment, the audio output control is executed by the information processing apparatus 200; however, the footwear 100 may include a processor and a speaker that execute the audio output control.
- the voice file corresponding to the gesture pattern in the third embodiment may be specified by the user.
- Each function of the main control unit 40 and the control unit 102 shown in the above embodiment may be realized by a dedicated circuit that realizes the same function.
- one dedicated circuit may be configured to execute the functions of a plurality of the functional units of the main control unit 40 and the control unit 102, or the function of one functional unit may be realized by a plurality of circuits.
Abstract
Description
Hereinafter, footwear and an output control method according to embodiments of the present invention will be described with reference to the drawings.
FIG. 1 is a diagram showing an example of the configuration of the output control system 10 in the embodiment. In the example shown in FIG. 1, the footwear 100 and the information processing apparatus 200 of the output control system 10 are connected via a network N. A plurality of footwear 100 may be connected to the network N. The information processing apparatus 200 may be any apparatus capable of processing signals acquired via the network N, such as a PC (Personal Computer) or a mobile terminal. A server may also be connected to the network N.
Next, an outline of the hardware in each apparatus of the output control system 10 will be described. FIG. 2 is a diagram showing an example of the schematic hardware configuration of the footwear 100 in the embodiment. The footwear 100 shown in FIG. 2 includes at least a control unit 102, a communication unit 104, a sensor unit 106, an output unit 108, a power supply unit 110, and a storage unit 112. The communication unit 104 includes a transmission unit 142 and a reception unit 144. The upper portion and sole portion of the footwear 100 are omitted from the figure.
Next, the functional configurations of the footwear 100 and the information processing apparatus 200 will be described. First, the functional configuration of the footwear 100 will be described.
Next, an example in which a plurality of LEDs are provided as the output unit 108 on the sole portion of the footwear 100 described above will be described. FIG. 6 is a diagram showing an example of the footwear 100 in the example. FIG. 6(A) is a side view showing an example of the footwear 100 in the example. FIG. 6(B) is a rear view showing an example of the footwear 100 in the example.
Light emission control (part 1) for the footwear 100 shown in FIG. 6, for example when a dancer wears it and dances, will be described. When the dancer dances to music, sensor data indicating the movement of the footwear 100 is transmitted to the information processing apparatus 200. The information processing apparatus 200 generates a light emission control signal based on the result of acoustic analysis of the music sound source and the acquired sensor data.
Next, light emission control (part 2), in which a predetermined image appears by light, will be described. As described above, light emission is controlled in accordance with sound; here, light emission is controlled so that a predetermined image appears in accordance with the movement of the footwear 100.
Next, the operation of the output control system 10 will be described. In the following, each process in the two types of light emission control described above and the process in the light emission control for evaluating the movement of the footwear 100 are taken as examples.
FIG. 9 is a flowchart showing an example of the light emission control process (part 1) in the example. In step S102 shown in FIG. 9, the communication unit 104 initializes communication settings. The initialization includes setting which apparatus 200 the communication unit 104 communicates with.
FIG. 10 is a flowchart showing an example of the light emission control process (part 2) in the example. Steps S202 to S204 shown in FIG. 10 are the same as steps S102 to S104 shown in FIG. 9, and a description thereof is omitted.
Before describing the light emission control for evaluating the movement of the footwear 100, the process of uploading model data serving as an evaluation reference to the server will be described. The model data is, for example, data sensed in a dance step.
Although a plurality of embodiments of the technology disclosed in the present application have been described above, the disclosed technology is not limited to the above.
In the second embodiment, the detailed structure of the footwear 100 not touched upon in the first embodiment is described, together with output control not touched upon in the first embodiment.
In the third embodiment, audio output control that outputs sound according to the movement of the footwear 100 is described. The first embodiment shows an example of executing light emission control suited to surrounding sound; the third embodiment describes a method of outputting sound suited to the movement of the user wearing the footwear 100, that is, the movement of the footwear 100.
The footwear 100 according to the present invention has been described according to the above embodiments, but the configurations included in the spirit of the present invention are not limited thereto. Various other reference examples will be described.
100 Footwear
200 Information processing apparatus
102 Control unit
104 Communication unit
106 Sensor unit
108 Output unit
110 Power supply unit
112 Storage unit
Claims (15)
- Footwear comprising: a sensor unit that detects movement of the footwear; a transmission unit that transmits sensor data detected by the sensor unit to an external device; a reception unit that receives, from the external device, an output control signal based on sound data and the sensor data; and an output unit that performs output based on the output control signal.
- The footwear according to claim 1, wherein the output unit has a light emitting unit, the output control signal is a light control signal that controls the color and intensity of light emission based on a first parameter related to the sound data and a second parameter related to the sensor data, and the light emitting unit emits light based on the light control signal.
- The footwear according to claim 2, wherein a plurality of the light emitting units are provided linearly on the footwear, the footwear further comprising a control unit that, when movement of the footwear in a direction substantially perpendicular to the linear direction is determined based on the sensor data, controls light emission of the plurality of light emitting units so that an afterimage of light representing a predetermined image appears in the substantially perpendicular direction.
- The footwear according to claim 3, wherein the control unit divides the predetermined image into a plurality of parts in the substantially perpendicular direction and controls the light emitting units corresponding to the divided images to emit light in order in the substantially perpendicular direction.
- The footwear according to any one of claims 1 to 4, wherein contribution ratios of the sound data and the sensor data to the output control signal are variable.
- The footwear according to claim 2, wherein the footwear comprises a sole portion that is the bottom of the footwear and an upper portion other than the sole portion, and the light emitting unit is arranged inside the sole portion along its outer periphery.
- The footwear according to claim 6, wherein the sole portion is provided with a groove portion in which the light emitting unit is arranged, and the light emitting surface of the light emitting unit faces the bottom surface side of the footwear.
- The footwear according to claim 7, wherein the light emitting unit is an LED tape.
- The footwear according to claim 7 or 8, wherein the sole portion is provided with a shock absorbing portion on the toe side of the footwear at the outer edge of the groove portion.
- The footwear according to any one of claims 6 to 9, wherein the sensor unit is provided inside the sole portion at a shank portion, and the transmission unit and the reception unit are provided at a tongue portion of the upper portion.
- The footwear according to any one of claims 1 to 9, further comprising a storage unit that stores sound information in which movement patterns of the footwear are associated with audio data to be output when each pattern is detected, wherein the output unit includes a speaker that outputs sound and outputs the audio data corresponding to the movement pattern detected by the sensor unit.
- A sound output system comprising footwear and an external device that outputs at least sound, wherein the footwear comprises: a sensor unit that detects movement of the footwear; and a transmission unit that transmits sensor data detected by the sensor unit to the external device; and the external device comprises: a second reception unit that receives the sensor data; a storage unit that stores sound information in which movement patterns of the footwear are associated with audio data to be output when each pattern is detected; a determination unit that determines which of the movement patterns the sensor data corresponds to; and a sound output unit that outputs the audio data associated with the movement pattern determined as corresponding by the determination unit.
- The sound output system according to claim 12, wherein the external device further comprises: a generation unit that generates an output control signal based on the sensor data; and a second transmission unit that transmits the generated output control signal to the footwear; and the footwear further comprises: a first reception unit that receives, from the external device, the output control signal based on sound data and the sensor data; and an output unit that performs output based on the output control signal.
- The sound output system according to claim 12 or 13, wherein the output unit includes a plurality of light emitting units, the external device further comprises an input unit that receives, from a user, designation of which of the plurality of light emitting units is to emit light, designation of the emission color, and designation of the emission pattern, and the generation unit generates an output control signal that controls the light emitting units according to the input received by the input unit.
- An output control method in footwear, wherein a processor provided in the footwear executes: acquiring sensor data in which movement of the footwear is detected by a sensor unit provided in the footwear; acquiring, from an external device to which the acquired sensor data has been transmitted, an output control signal based on sound data and the sensor data; and controlling output by an output unit based on the output control signal.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/106,828 US10856602B2 (en) | 2015-02-18 | 2016-02-18 | Footwear, sound output system, and sound output method |
JP2016510864A JP6043891B1 (ja) | 2015-02-18 | 2016-02-18 | 履物、音声出力システム及び出力制御方法 |
CN201680000481.3A CN106061307A (zh) | 2015-02-18 | 2016-02-18 | 鞋、声音输出系统以及输出控制方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015029573 | 2015-02-18 | ||
JP2015-029573 | 2015-02-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016133158A1 true WO2016133158A1 (ja) | 2016-08-25 |
Family
ID=56689032
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/054692 WO2016133158A1 (ja) | 2015-02-18 | 2016-02-18 | 履物、音声出力システム及び出力制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10856602B2 (ja) |
JP (1) | JP6043891B1 (ja) |
CN (1) | CN106061307A (ja) |
WO (1) | WO2016133158A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6142060B1 (ja) * | 2016-10-09 | 2017-06-07 | 好則 神山 | スマートフォン |
JP2018089062A (ja) * | 2016-12-01 | 2018-06-14 | 株式会社エクスプロア | 残像シューズ |
KR102152804B1 (ko) * | 2019-10-08 | 2020-09-07 | (주)지엔인터내셔날 | 제어 가능한 광원을 포함하는 신발 |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017136142A (ja) * | 2016-02-02 | 2017-08-10 | セイコーエプソン株式会社 | 情報端末、動作評価システム、動作評価方法、動作評価プログラム、及び記録媒体 |
US20180325453A1 (en) * | 2016-03-15 | 2018-11-15 | Shenzhen Royole Technologies Co., Ltd. | Shoe and control method thereof |
CN106072974A (zh) * | 2016-07-25 | 2016-11-09 | 天津聚盛龙庄源科技有限公司 | 一种智能卫星定位鞋 |
US10575594B2 (en) * | 2016-11-17 | 2020-03-03 | Samsung Electronics Co., Ltd. | Footwear internal space measuring device and method for providing service thereof |
JP6737505B2 (ja) * | 2017-03-03 | 2020-08-12 | 株式会社ノーニューフォークスタジオ | 歩行教示システム、歩行教示方法 |
US20190082756A1 (en) * | 2017-09-21 | 2019-03-21 | Michael Arno | Led lighted placard system for apparel or gear, and manufacturing method therefore |
WO2019061732A1 (zh) * | 2017-09-26 | 2019-04-04 | 催琥宝(深圳)科技有限公司 | 一种智能灯光鞋 |
US20190174863A1 (en) * | 2017-12-13 | 2019-06-13 | John McClain | Footwear With Kinetically Activated Auditory Effects |
CN208462097U (zh) * | 2018-02-13 | 2019-02-01 | 曾胜克 | 发光装置及具有发光功能的可穿戴物件 |
KR101976635B1 (ko) * | 2018-03-09 | 2019-05-09 | 강민서 | 학습용 신발 |
CN110665204A (zh) * | 2018-07-02 | 2020-01-10 | 瀚谊世界科技股份有限公司 | 具有移动指示的穿戴装置、系统与移动指示方法 |
FR3087098B1 (fr) * | 2018-10-10 | 2020-12-25 | Izome | Chaussure dite connectee apte a communiquer avec l'exterieur |
KR102073910B1 (ko) * | 2018-10-25 | 2020-02-05 | (주)씨지픽셀스튜디오 | 가위바위보게임이 가능한 신발 및 그 신발을 이용한 가위바위보게임 방법 |
CN110664047B (zh) * | 2019-08-30 | 2022-04-12 | 福建省万物智联科技有限公司 | 一种跟随音频震动的智能鞋 |
KR20220100054A (ko) * | 2019-11-22 | 2022-07-14 | 나이키 이노베이트 씨.브이. | 모션 기반의 미디어 생성 |
WO2021243487A1 (zh) * | 2020-05-30 | 2021-12-09 | 深圳二郎神工业设计有限公司 | 舞步鞋 |
CN112471690B (zh) * | 2020-11-23 | 2022-02-22 | 浙江工贸职业技术学院 | 一种用于排舞的多功能舞蹈鞋 |
CN116952303B (zh) * | 2023-07-27 | 2024-04-30 | 浙江卓诗尼鞋业有限公司 | 鞋类多项功能综合检测设备 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07504602A (ja) * | 1993-02-05 | 1995-05-25 | エル・エイ・ギア インコーポレーテッド | 点滅ライト付き履物 |
JP2002035191A (ja) * | 2000-07-31 | 2002-02-05 | Taito Corp | 踊り採点装置。 |
JP2006267711A (ja) * | 2005-03-24 | 2006-10-05 | Xing Inc | 音楽再生装置 |
JP2007090076A (ja) * | 2005-09-29 | 2007-04-12 | Konami Digital Entertainment Inc | ダンスゲーム装置、ダンスゲーム採点方法およびコンピュータ読み取り可能な記憶媒体 |
JP3151948U (ja) * | 2009-04-30 | 2009-07-09 | 深海 蔡 | バイブレーション・マッサージ付き靴 |
JP2013037036A (ja) * | 2011-08-03 | 2013-02-21 | Kyoraku Sangyo Kk | スイング式発光表示装置 |
JP2013529504A (ja) * | 2010-06-22 | 2013-07-22 | ナイキ インターナショナル リミテッド | 変色部を有する履物および色を変化させる方法 |
JP3192015U (ja) * | 2014-05-12 | 2014-07-24 | 株式会社Shindo | Led実装テープおよびそれを装着した衣服 |
JP3193890U (ja) * | 2014-08-13 | 2014-10-23 | 孝文 竹内 | 陳列棚 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2572760A (en) * | 1948-01-15 | 1951-10-23 | Rikelman Nathan | Illuminated shoe device |
CA2106407A1 (en) * | 1991-12-11 | 1993-06-24 | Mark R. Goldston | Athletic shoe having plug-in-module |
US5461188A (en) | 1994-03-07 | 1995-10-24 | Drago; Marcello S. | Synthesized music, sound and light system |
WO1995026652A1 (en) * | 1994-04-01 | 1995-10-12 | Bbc International, Ltd. | Footwear having provisions for accepting modules |
US5615111A (en) * | 1994-05-23 | 1997-03-25 | Solefound, Inc. | Record and playback means for footwear |
US6539336B1 (en) * | 1996-12-12 | 2003-03-25 | Phatrat Technologies, Inc. | Sport monitoring system for determining airtime, speed, power absorbed and other factors such as drop distance |
US5799418A (en) * | 1996-07-24 | 1998-09-01 | Davis; Richard P. | Footwear device for reducing walking related noise |
US5748087A (en) * | 1996-08-01 | 1998-05-05 | Ingargiola; Thomas R. | Remote personal security alarm system |
US6278378B1 (en) * | 1999-07-14 | 2001-08-21 | Reebok International Ltd. | Performance and entertainment device and method of using the same |
GB2352551A (en) * | 1999-07-23 | 2001-01-31 | Bbc Internat | Sound generating electronic shoes with alarm |
JP2001242813A (ja) | 2000-02-29 | 2001-09-07 | Mamoru Chiku | 旗振り状表示装置 |
US7225565B2 (en) * | 2003-03-10 | 2007-06-05 | Adidas International Marketing B.V. | Intelligent footwear systems |
US7178929B2 (en) * | 2004-11-12 | 2007-02-20 | Bbc International, Ltd. | Light and sound producing system |
US7207688B2 (en) * | 2005-08-18 | 2007-04-24 | Wong Wai Yuen | Interactive shoe light device |
KR100702613B1 (ko) * | 2006-05-30 | 2007-04-03 | 주식회사 아이손 | 콘트롤러를 장착한 인공지능신발과 운동량측정방법 |
US7789520B2 (en) * | 2006-09-08 | 2010-09-07 | Kristian Konig | Electroluminescent communication system between articles of apparel and the like |
US7494237B1 (en) * | 2006-12-20 | 2009-02-24 | Cheung James D | Multiple programmed different sequential illumination light sources for footwear |
EP3254663B8 (en) * | 2007-06-18 | 2021-06-30 | SonicSensory Inc. | Vibrating footwear device and entertainment system for use therewith |
DE102008027104A1 (de) * | 2008-06-06 | 2009-12-10 | Cairos Technologies Ag | System und Verfahren zur mobilen Bewertung von Schuhdämpfungseigenschaften |
US20110023331A1 (en) * | 2009-07-29 | 2011-02-03 | Jason Kolodjski | Shoe with action activated electronic audio sound generator |
US8769836B2 (en) * | 2010-06-22 | 2014-07-08 | Nike, Inc. | Article of footwear with color change portion and method of changing color |
IL209331A0 (en) * | 2010-11-15 | 2011-02-28 | Elbit Systems Ltd | Footwear seismic communication system |
JP5703156B2 (ja) | 2010-11-22 | 2015-04-15 | 富士フイルム株式会社 | 熱線遮蔽材 |
CN103442607B (zh) * | 2011-02-07 | 2016-06-22 | 新平衡运动公司 | 用于监视运动表现的系统和方法 |
US20120297960A1 (en) * | 2011-05-29 | 2012-11-29 | Rohan Bader | Sound shoe studio |
CN202222510U (zh) * | 2011-09-14 | 2012-05-23 | 黑金刚(泉州)数控科技有限公司 | 一种可随音乐闪动的亮灯鞋 |
WO2013056263A1 (en) * | 2011-10-14 | 2013-04-18 | Bishop, Roger | Sport performance monitoring apparatus, process, and method of use |
WO2013088096A1 (en) * | 2011-12-13 | 2013-06-20 | Bonnie White | Solar powered l.c.d/l.e.d/o.l.e.d footwear |
WO2014081706A2 (en) * | 2012-11-21 | 2014-05-30 | Wolverine World Wide, Inc. | Indicator system |
US9747781B2 (en) * | 2014-09-26 | 2017-08-29 | Intel Corporation | Shoe-based wearable interaction system |
2016
- 2016-02-18 JP JP2016510864A patent/JP6043891B1/ja active Active
- 2016-02-18 US US15/106,828 patent/US10856602B2/en active Active
- 2016-02-18 WO PCT/JP2016/054692 patent/WO2016133158A1/ja active Application Filing
- 2016-02-18 CN CN201680000481.3A patent/CN106061307A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
US10856602B2 (en) | 2020-12-08 |
US20180199657A1 (en) | 2018-07-19 |
JP6043891B1 (ja) | 2016-12-14 |
CN106061307A (zh) | 2016-10-26 |
JPWO2016133158A1 (ja) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6043891B1 (ja) | 履物、音声出力システム及び出力制御方法 | |
CN107106907B (zh) | 用于确定用户手指位置的信号生成和检测器系统以及方法 | |
CN106502388B (zh) | 一种互动式运动方法及头戴式智能设备 | |
US7890199B2 (en) | Storage medium storing sound output control program and sound output control apparatus | |
JP5692904B2 (ja) | 入力システム、情報処理装置、情報処理プログラム、および指示位置算出方法 | |
JP5271121B2 (ja) | 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法 | |
US9310894B2 (en) | Processing operation signals from a pointing device and/or an input device | |
JP5161182B2 (ja) | 情報処理プログラム及び情報処理装置 | |
JP4895352B2 (ja) | 対象選択プログラム、対象選択装置、対象選択システム、および対象選択方法 | |
CN114945295A (zh) | 基于运动的媒体创作 | |
US8586852B2 (en) | Storage medium recorded with program for musical performance, apparatus, system and method | |
US9092894B2 (en) | Display control device and display control program for grouping related items based upon location | |
JP5607579B2 (ja) | 情報処理システム、情報処理装置、情報処理プログラム、およびコンテンツ再生制御方法 | |
JP5988549B2 (ja) | 位置算出システム、位置算出装置、位置算出プログラム、および位置算出方法 | |
US8352267B2 (en) | Information processing system and method for reading characters aloud | |
WO2018181584A1 (ja) | 情報処理システム、情報処理方法、情報処理プログラム | |
US9417761B2 (en) | Storage medium storing image processing program, image processing apparatus, image processing method and image processing system for displaying a virtual space in which objects are arranged with a virtual camera | |
JP5702585B2 (ja) | 入力判定プログラム、情報処理装置、システム及び情報処理方法 | |
ES2834601T3 (es) | Aparato de audio, procedimiento de accionamiento para aparato de audio y medio de registro legible por ordenador | |
JP2009146284A (ja) | 表示制御プログラムおよび表示制御装置 | |
JP5807089B2 (ja) | 情報処理システム、情報処理装置、情報処理プログラム、およびコンテンツ再生制御方法 | |
CN110665204A (zh) | 具有移动指示的穿戴装置、系统与移动指示方法 | |
KR101203787B1 (ko) | 영상인식 기반 인터페이스를 사용하는 콘텐츠 제작 방법 및 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016510864 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16752544 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16752544 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15106828 Country of ref document: US |