WO2018008217A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number: WO2018008217A1 (application PCT/JP2017/014379)
- Authority: WO (WIPO, PCT)
- Prior art keywords: output, tactile, unit, target, information
Classifications
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- A63F13/25—Output arrangements for video game devices
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
- H04R2201/023—Transducers incorporated in garment, rucksacks or the like
- H04R2400/03—Transducers capable of generating both sound as well as tactile vibration, e.g. as used in cellular phones
- H04R5/02—Spatial or constructional arrangements of loudspeakers
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 describes a technique for outputting a tactile stimulus to a predetermined device when an event occurs in a virtual space. Desirably, the output tactile stimulus differs depending on, for example, the position being touched. In the technique described in Patent Document 1, however, the same tactile stimulus is output regardless of the position information.
- The present disclosure therefore proposes a new and improved information processing apparatus, information processing method, and program capable of adaptively changing the output of a tactile stimulus according to position information.
- According to the present disclosure, there is provided an information processing apparatus including an output control unit that controls output of a tactile stimulus to at least two tactile stimulation units, wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information relating to the tactile output related to that position information.
- According to the present disclosure, there is also provided an information processing method including controlling output of a tactile stimulus to at least two tactile stimulation units, and changing, by a processor, the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information relating to the tactile output related to that position information.
- According to the present disclosure, there is also provided a program for causing a computer to function as an output control unit that controls output of a tactile stimulus to at least two tactile stimulation units, the output control unit changing the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information relating to the tactile output related to that position information.
- FIG. 5 is an explanatory diagram showing an example of the relationship between the output intensities of two tactile stimulation units 200 and the perceived position in the situation shown in FIG. 4.
- FIG. 6 is a graph showing the relationship between the perceived position and the output intensity set for each of two tactile stimulation units 200 according to a comparative example of the present disclosure.
- FIG. 7 is a graph showing the relationship between the perceived position and the perceived intensity when the functions shown in FIG. 6 are applied.
- FIG. 8 is a functional block diagram showing a configuration example of the server 10 according to the embodiment.
- FIG. 9 is an explanatory diagram showing an example of a movement path 220 of the target perceived position.
- FIG. 10 is an explanatory diagram showing another example of the target movement path 220.
- FIG. 11 is an explanatory diagram showing an example in which a range 230 of target perceived positions expands.
- FIG. 12 is an explanatory diagram showing an example in which the range 230 of target perceived positions moves continuously along a target movement path 220.
- FIG. 13 is an explanatory diagram showing an example in which an output intensity adjustment function according to the embodiment is applied in the situation shown in FIG. 4.
- FIG. 16 is a graph showing the relationship between the perceived position and the perceived intensity when the function shown in FIG. 15 is applied.
- An explanatory diagram for describing an example of planar adjustment of the output intensities of a plurality of tactile stimulation units.
- An explanatory diagram showing an example in which triangles formed for two target perceived positions overlap.
- An explanatory diagram showing an example in which a currently moving range 240 of perceived positions is emphasized when vibration is generated.
- A flowchart showing the overall flow of an operation example according to the embodiment.
- A flowchart showing the flow of the "output intensity calculation processing" according to the embodiment.
- A schematic diagram illustrating tactile stimuli and sounds output to a user in Application Example 1.
- A diagram showing an image 40 displayed in Application Example 2 of the embodiment.
- A schematic diagram illustrating a movement path of the perceived position of a tactile stimulus output in Application Example 2.
- A schematic diagram illustrating tactile stimuli and sounds output to a user in Application Example 2.
- A schematic diagram illustrating tactile stimuli and sounds output to a user in Application Example 3.
- In the present specification and drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters after the same reference numeral. For example, a plurality of configurations having substantially the same functional configuration are distinguished as the server 10a and the server 10b as necessary. However, when it is not particularly necessary to distinguish such constituent elements, only the same reference numeral is given; for example, when it is not necessary to distinguish the server 10a and the server 10b, they are simply referred to as the server 10.
- FIG. 1 is an explanatory diagram showing the configuration of the information processing system according to the present embodiment.
- the information processing system includes a server 10, a display device 30, and a communication network 32.
- The user 2 can wear the jacket 20 described later.
- FIG. 2 is a view showing the appearance of the jacket 20.
- As shown in FIG. 2, the jacket 20 includes a plurality of tactile stimulation units 200 and two sound output units 202 inside the jacket 20. For example, a predetermined number (for example, six) of tactile stimulation units 200 are arranged on each of the front side and the back side of the jacket 20. The individual tactile stimulation units 200 are arranged in such a positional relationship that each unit arranged on the front side faces a corresponding unit arranged on the back side.
- Although FIG. 2 shows an example in which the jacket 20 is a sleeveless garment, the jacket 20 is not limited to this example and may have sleeves. In this case, one or more tactile stimulation units 200 may be disposed not only at positions corresponding to the user's chest and abdomen but also at positions corresponding to the user's arms.
- Tactile stimulation unit 200: The tactile stimulation unit 200 outputs a tactile stimulus, such as vibration, in accordance with a control signal received from, for example, the server 10.
- When only one tactile stimulation unit 200 generates vibration, the generated vibration can be perceived only in the peripheral portion 210 of that tactile stimulation unit 200. That is, when the individual tactile stimulation units 200 are arranged apart from each other, the vibrations generated separately by the individual tactile stimulation units 200 can be perceived discretely on the user's body.
- This phantom sensation is an illusory phenomenon in which, when stimuli are presented simultaneously at different positions on the skin, a human perceives only a single stimulus between the presented stimulus positions. For example, when two tactile stimulation units 200 output stimuli simultaneously, the position of the stimulus perceived by the user (hereinafter referred to as the perceived position) is known to lie between the two tactile stimulation units 200.
- FIG. 5 is an explanatory diagram showing an example of the relationship between the output intensity of each of the two tactile stimulation units 200 and the perceived position (an example of the phantom sensation) in the situation shown in FIG. 4.
- As shown in FIG. 5, suppose that the output intensity of the tactile stimulation unit 200a is continuously weakened over time, for example as "1", "0.6", "0", while the output intensity of the tactile stimulation unit 200b is continuously increased, for example as "0", "0.6", "1".
- In this case, the perceived position 50 (perceived by the user) can move continuously from the contact position of the tactile stimulation unit 200a to the contact position of the tactile stimulation unit 200b. In this way, by varying the output intensities of a plurality of tactile stimulation units 200, the range of tactile stimuli that the units can present can be continuously enlarged without changing the arrangement interval of the individual tactile stimulation units 200.
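The cross-fade described above can be sketched in code. The following is a minimal illustration, not the implementation of the disclosure: it assumes the target perceived position is normalized to x in [0, 1] between the contact positions of units 200a and 200b, and uses a simple linear cross-fade (FIG. 6 suggests the actual curves may be nonlinear).

```python
def crossfade_intensities(x: float) -> tuple[float, float]:
    """Output intensities for two tactile stimulation units.

    x is the normalized target perceived position: 0.0 at the contact
    position of unit 200a, 1.0 at the contact position of unit 200b.
    A linear cross-fade is assumed for illustration.
    """
    if not 0.0 <= x <= 1.0:
        raise ValueError("x must lie between the two contact positions")
    return (1.0 - x, x)
```

Sweeping x from 0 to 1 over time weakens unit 200a while strengthening unit 200b, which, through the phantom sensation, moves the perceived position continuously between the two contact positions.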
- Audio output unit 202: The audio output unit 202 outputs audio in accordance with, for example, a control signal received from the server 10. As shown in FIG. 2, one audio output unit 202 can be arranged on each of the left and right sides of the jacket 20. For example, each audio output unit 202 is arranged so as to be positioned on or near the user's shoulder when the jacket 20 is worn. However, the arrangement is not limited to this example: only one audio output unit 202 may be arranged on the jacket 20, or three or more may be arranged.
- Alternatively, instead of being included in the jacket 20, the audio output unit 202 may be disposed in the predetermined space as an independent device, or may be included in a wearable device different from the jacket 20 (for example, headphones or a headset) or in a portable device (for example, a portable music player, a smartphone, or a portable game machine).
- the display device 30 is a device that displays image information.
- the display device 30 projects image information onto the projection target 4 in accordance with a control signal received from the server 10 described later.
- Although FIG. 1 illustrates an example in which the display device 30 is a projector, the display device 30 is not limited to this example and may be, for example, a liquid crystal display (LCD) device or an organic light emitting diode (OLED) device.
- the display device 30 may be included in a portable device such as a tablet terminal or a smartphone, or a wearable device such as an HMD or AR (Augmented Reality) glass. In these cases, the audio output unit 202 and the display device 30 may be included in the same device.
- the server 10 is an example of an information processing device according to the present disclosure.
- the server 10 is a device that controls the output of tactile stimulation to a plurality of tactile stimulation units 200 (or the jacket 20).
- the server 10 controls the generation of vibration for each of the plurality of tactile stimulation units 200 included in the jacket 20.
- For example, the server 10 can instruct the jacket 20 to cause each of the plurality of tactile stimulation units 200 to generate vibration.
- the server 10 may have a function of controlling content reproduction.
- the content includes an image (image information) and a sound (sound information).
- the server 10 controls display of images on the display device 30.
- the server 10 controls the sound output unit 202 to output sound.
- Thereby, the user located in the predetermined space can experience the vibrations generated by the plurality of tactile stimulation units 200 while, for example, watching the video displayed by the display device 30 and/or listening to the audio output by the audio output unit 202.
- The server 10 can communicate with other devices (such as the tactile stimulation unit 200, the audio output unit 202, and the display device 30) via, for example, the communication network 32 described later.
- the communication network 32 is a wired or wireless transmission path for information transmitted from a device connected to the communication network 32.
- For example, the communication network 32 may include a public line network such as a telephone network, the Internet, or a satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), a wide area network (WAN), and the like.
- the communication network 32 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- The intensity at which the user actually perceives vibration from the tactile stimulation unit 200 (hereinafter referred to as the perceived intensity) can decrease according to the distance from the tactile stimulation unit 200.
- the perceived intensity of vibration by the haptic stimulation unit 200 may decrease in inverse proportion to the distance from the haptic stimulation unit 200.
- FIG. 6 is a graph showing the function f0 (x) and the function f1 (x) of the output intensity of the two tactile stimulation units 200 according to this comparative example in the situation shown in FIG.
- In this comparative example, vibration is generated in the tactile stimulation unit 200a with an output intensity following the function f0(x) shown in FIG. 6, and at the same time vibration is generated in the tactile stimulation unit 200b with an output intensity following the function f1(x) shown in FIG. 6.
- Since the function f0(x) and the function f1(x) adjust the output intensities of the tactile stimulation units 200 non-linearly, the sensation that the perceived position moves can be presented to the user more strongly through the phantom sensation.
- In this comparative example, however, as shown in FIG. 7, the perceived intensity declines in the vicinity of the midpoint between the contact position of the tactile stimulation unit 200a and the contact position of the tactile stimulation unit 200b compared with the two contact positions. For this reason, it is difficult in this comparative example to present a tactile stimulus having a desired perceived intensity at a desired perceived position.
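The midpoint dip can be reproduced with a toy model in which the perceived intensity is the sum of each unit's output scaled by a distance falloff. The falloff function `1 / (1 + k*d)` and the constant `k` are illustrative assumptions, not values from the disclosure, which only states that the perceived intensity decreases with distance, roughly in inverse proportion.

```python
def attenuation(d: float, k: float = 2.0) -> float:
    # Assumed falloff of perceived intensity with distance d from a
    # unit's contact position (illustrative model, k is arbitrary).
    return 1.0 / (1.0 + k * d)

def perceived_intensity(x: float, k: float = 2.0) -> float:
    # Two units at normalized positions 0 and 1 driven by a linear
    # cross-fade; the perceived intensity at position x is the sum of
    # the two distance-attenuated contributions.
    out_a, out_b = 1.0 - x, x
    return out_a * attenuation(x, k) + out_b * attenuation(1.0 - x, k)
```

Under this model the perceived intensity equals 1.0 at both contact positions but drops at the midpoint, mirroring the dip shown in FIG. 7.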
- the server 10 according to the present embodiment changes the output of one or more tactile stimulation units 200 corresponding to the predetermined position information in accordance with predetermined position information on the user's body and information related to the tactile output related to the position information.
- the information related to the haptic output includes, for example, a target perceptual intensity of the haptic stimulus or a (target) haptic output value.
- That is, the server 10 can change the output intensities of one or more tactile stimulation units 200 according to the position at which the tactile stimulus is to be applied and the target perceived intensity at that position. Therefore, for example, the user can perceive the tactile stimulus (vibration or the like) with a substantially constant perceived intensity between the contact positions of the plurality of tactile stimulation units 200.
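One way to keep the modeled perceived intensity constant is to rescale a linear cross-fade by a position-dependent gain. The distance falloff `1 / (1 + k*d)` and the constant `k` below are illustrative assumptions, not the disclosure's formula.

```python
def compensated_outputs(x: float, target: float = 1.0, k: float = 2.0) -> tuple[float, float]:
    """Output intensities for units 200a/200b chosen so the modeled
    perceived intensity equals `target` for any normalized position x
    in [0, 1] (falloff 1/(1 + k*d) is an assumed model)."""
    def att(d: float) -> float:
        return 1.0 / (1.0 + k * d)
    base_a, base_b = 1.0 - x, x  # linear cross-fade fixes the position
    # Gain that exactly cancels the midpoint dip of the base cross-fade.
    gain = target / (base_a * att(x) + base_b * att(1.0 - x))
    return base_a * gain, base_b * gain
```

At the midpoint the gain boosts both outputs (to 1.0 each with k = 2), compensating the dip seen in the comparative example while leaving the perceived position unchanged.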
- FIG. 8 is a functional block diagram showing a configuration example of the server 10 according to the present embodiment.
- the server 10 includes a control unit 100, a communication unit 120, and a storage unit 122.
- Control unit 100: The control unit 100 comprehensively controls the operation of the server 10 using hardware, described later, built into the server 10, such as a CPU (Central Processing Unit) 150 and a RAM (Random Access Memory) 154. As shown in FIG. 8, the control unit 100 includes a content reproduction unit 102, a target position/intensity determination unit 104, and an output control unit 106.
- the content reproduction unit 102 controls content reproduction. For example, when the content to be reproduced includes an image, the content reproduction unit 102 performs image display control on the display device 30. In addition, when the content to be reproduced includes audio, the content reproduction unit 102 performs audio output control on the audio output unit 202.
- the content reproduction unit 102 can cause the display device 30 and / or the audio output unit 202 to reproduce the content to be reproduced in association with the output control of the tactile stimulus by the output control unit 106 described later.
- For example, the content reproduction unit 102 reproduces the content in synchronization with the timing of vibration generation by one or more tactile stimulation units 200.
- the type of content to be reproduced may be determined based on predetermined setting information (such as a reproduction list), or may be determined based on a reproduction request input by the user.
- Target position/intensity determination unit 104: The target position/intensity determination unit 104 determines the target perceived position of a tactile stimulus and the target perceived intensity at that perceived position at a predetermined timing.
- the target perceived position can basically be set with respect to the user's body.
- For example, the target position/intensity determination unit 104 determines the target perceived position of the tactile stimulus at a predetermined timing and the target perceived intensity at that perceived position according to the content to be reproduced. More specifically, it determines the target perceived position and the target perceived intensity at each timing in accordance with the image displayed and/or the audio output at each timing during the playback period of the content. Furthermore, when a target movement path is associated with the content in advance, the target position/intensity determination unit 104 determines the target perceived position at each timing during the reproduction period of the content based on the target movement path.
- the target position / intensity determination unit 104 may determine the target perceived position at a predetermined future timing in real time according to the current (target) perceived position and the content currently being played back.
- FIG. 9 is an explanatory diagram showing an example of a target movement path 220 set for the user's body.
- In the example shown in FIG. 9, the target movement path 220 connects, in order, the contact position of the tactile stimulation unit 200a, the contact position of the tactile stimulation unit 200c, the contact position of the tactile stimulation unit 200d, and the contact position of the tactile stimulation unit 200h. In this case, the tactile stimulus can be presented to the user so that the perceived position moves continuously from the contact position of the tactile stimulation unit 200a (the start point) to the contact position of the tactile stimulation unit 200h (the end point).
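Tracking a target perceived position along such a multi-unit path can be sketched as piecewise-linear interpolation over the contact positions. The coordinates and the equal-time-per-segment scheduling below are illustrative assumptions, not values from the disclosure.

```python
def position_on_path(waypoints: list[tuple[float, float]], t: float) -> tuple[float, float]:
    """Target perceived position at normalized time t in [0, 1] along a
    polyline of contact positions (equal time per segment assumed)."""
    if not 0.0 <= t <= 1.0:
        raise ValueError("t must be in [0, 1]")
    if t == 1.0:
        return waypoints[-1]
    segments = len(waypoints) - 1
    seg, frac = divmod(t * segments, 1.0)  # segment index and progress
    (x0, y0), (x1, y1) = waypoints[int(seg)], waypoints[int(seg) + 1]
    return (x0 + frac * (x1 - x0), y0 + frac * (y1 - y0))

# Hypothetical contact positions for units 200a -> 200c -> 200d -> 200h:
path_220 = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
```

Sampling `position_on_path(path_220, t)` at successive timings yields the sequence of target perceived positions that the output control described later converts into per-unit output intensities.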
- the target movement path can be set as a path connecting the first surface of the user's body, the inside of the user's body, and the second surface facing the first surface.
- the first surface may be the front surface of the user, and the second surface may be the back surface of the user.
- the first surface may be a surface on the front side of a predetermined part such as an arm, and the second surface may be a surface on the back side of the part.
- FIG. 10 is an explanatory diagram showing another example of the target movement path 220.
- In the example shown in FIG. 10, the target movement path 220 is a path that connects the contact position of the tactile stimulation unit 200a on the front of the user, the inside of the user's body, and the contact position of the tactile stimulation unit 200b on the back of the user. In this case, the user can be presented with a sensation as if something pierces the body from the front through to the back.
- When a target movement area is associated with the content in advance, the target position/intensity determination unit 104 can determine the set (region) of target perceived positions at each timing during the playback period of the content based on the target movement area. Alternatively, the target position/intensity determination unit 104 may determine the set of target perceived positions at a predetermined future timing in real time according to the current set of (target) perceived positions and the content currently being played back.
- the target movement area can be set as an area that spreads over the user's body as time passes.
- FIG. 11 is an explanatory diagram showing an example of a target movement area 230 set for the user's body.
- FIG. 11 shows an example in which the target movement area 230 is an area where the set of target perceived positions spreads over time starting from the contact position of the tactile stimulation unit 200a.
- the target movement area 230 is an area in which the distance between the contact position of the tactile stimulation unit 200a and the set of target perception positions gradually increases with time.
- the contact position of the haptic stimulation unit 200a (that is, the start point of the target movement region) is an example of a third position in the present disclosure.
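A minimal sketch of selecting the units that fall inside such an expanding region follows. The disc shape, linear growth rate, and coordinates are illustrative assumptions, not details from the disclosure.

```python
import math

def units_in_expanding_region(
    units: dict[str, tuple[float, float]],
    origin: tuple[float, float],
    t: float,
    speed: float = 1.0,
) -> list[str]:
    """Names of units whose contact positions lie inside the target
    movement region at time t: a disc centered on the start point
    whose radius grows linearly with elapsed time (assumed model)."""
    radius = speed * t
    return sorted(
        name
        for name, (px, py) in units.items()
        if math.hypot(px - origin[0], py - origin[1]) <= radius
    )
```

At t = 0 only the unit at the start point (the third position) is active; as t increases, units progressively farther from the start point join the set, matching the spreading region of FIG. 11.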
- the target position / intensity determination unit 104 can determine the target perceptual intensity at each timing during the reproduction period of the content according to the content to be reproduced.
- the target movement path (or target movement area) and the perceived intensity of the target at each position on the target movement path (or target movement area) may be associated in advance.
- In this case, for example, the target position/intensity determination unit 104 first determines a target movement path according to the content to be reproduced, and then determines the target perceived intensity at each timing during the reproduction period of the content based on the target movement path.
- the perceived intensity of each target at each position on the target movement path (or target movement area) may be set to be the same or different for each position.
- the perceived intensity of the target at each position on the target movement path (or target movement area) may be manually set by the user.
- the target position / intensity determination unit 104 may determine the target perception intensity at a predetermined future timing in real time according to the current (target) perception intensity and the content currently being played back.
- Alternatively, a waveform of the target perceived intensity can be registered in advance. In this case, the target position/intensity determination unit 104 can determine the target perceived intensity at the target perceived position in real time based on, for example, the registered waveform and the determined target perceived position. The waveform of the target perceived intensity may be a constant function, or may be registered in association with the content to be played.
- Output control unit 106: The output control unit 106 controls the generation of vibration by the plurality of tactile stimulation units 200 corresponding to the target perceived position, according to the target perceived position and the target perceived intensity determined by the target position/intensity determination unit 104.
- For example, the output control unit 106 first identifies a plurality of (for example, three) tactile stimulation units 200 located in the vicinity of the current target perceived position. The output control unit 106 then determines the output intensity of each of these tactile stimulation units 200 based on the positional relationship between each unit and the target perceived position and on the current target perceived intensity. Thereby, vibration of the current target perceived intensity can be made perceptible to the user at the current target perceived position.
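The selection-and-weighting step can be sketched as follows. The inverse-distance weighting is an assumed scheme chosen for illustration; the disclosure does not specify this exact formula.

```python
import math

def nearby_unit_outputs(
    units: dict[str, tuple[float, float]],
    target_pos: tuple[float, float],
    target_intensity: float,
    n: int = 3,
) -> dict[str, float]:
    """Pick the n tactile stimulation units closest to the target
    perceived position and split the target intensity among them with
    inverse-distance weights (an illustrative assumption)."""
    def dist(p: tuple[float, float]) -> float:
        return math.hypot(p[0] - target_pos[0], p[1] - target_pos[1])
    nearest = sorted(units.items(), key=lambda kv: dist(kv[1]))[:n]
    eps = 1e-6  # avoid division by zero when the target sits on a unit
    weights = {name: 1.0 / (dist(pos) + eps) for name, pos in nearest}
    total = sum(weights.values())
    return {name: target_intensity * w / total for name, w in weights.items()}
```

With hypothetical contact positions, a target sitting on one unit's contact position yields almost the full intensity at that unit and negligible output at the others, while a target between units splits the intensity among them.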
- Whenever the target perceived position and the target perceived intensity determined by the target position/intensity determination unit 104 change over time, the output control unit 106 sequentially adjusts the output intensity of each of the plurality of tactile stimulation units 200 corresponding to the changed target perceived position.
- a target movement path 220 is set on the surface of the user's body, and the target perceptual intensity in the target movement path 220 is set to be constant.
- In this case, the output control unit 106 adjusts the output intensities of the corresponding plurality of tactile stimulation units 200 so that vibration of the same perceived intensity moves continuously through each position on the target movement path 220 as time passes.
- the target movement path 220 is set as a movement path that connects the front and back of the user's body.
- In this case, the output control unit 106 adjusts the output intensities of the corresponding plurality of tactile stimulation units 200 so that the perceived position passes through the inside of the body along the target movement path 220 as time elapses.
- the target moving area 230 is set as an area that spreads in a plane.
- In this case, the output control unit 106 adjusts the output intensities of the corresponding plurality of tactile stimulation units 200 so that the distance between the start point (for example, the contact position of the tactile stimulation unit 200a) and the set (range) of perceived positions increases continuously with time.
- Alternatively, the output control unit 106 adjusts the output intensities of the corresponding tactile stimulation units 200 so that the range 230 of target perceived positions moves continuously along the target movement path 220.
- The output control unit 106 can change the output intensity of a tactile stimulation unit 200 located in the vicinity of the target perceived position (an intensity determined based on the target perceived intensity) according to the distance between that tactile stimulation unit 200 and the target perceived position. For example, when the tactile stimulation unit 200a and the tactile stimulation unit 200b are located in the vicinity of the target perceived position, the output control unit 106 changes the output intensity of the tactile stimulation unit 200a based on the distance between the contact position of the tactile stimulation unit 200a on the user's body and the target perceived position.
- the output control unit 106 changes the output intensity of the haptic stimulation unit 200b based on the distance between the contact position of the haptic stimulation unit 200b on the user's body and the target perceived position.
- the tactile stimulation unit 200a is an example of a first tactile stimulation unit in the present disclosure
- the tactile stimulation unit 200b is an example of a second tactile stimulation unit in the present disclosure.
- More specifically, the output control unit 106 changes the output intensity of the tactile stimulation unit 200a and the output intensity of the tactile stimulation unit 200b based on the positional relationship between the target perceived position and the intermediate position between the contact position of the tactile stimulation unit 200a and the contact position of the tactile stimulation unit 200b.
- the intermediate position is an example of a fourth position in the present disclosure.
- the output control unit 106 may change the output intensities of the tactile stimulation unit 200a and the tactile stimulation unit 200b so that the total of the output intensity of the tactile stimulation unit 200a and the output intensity of the tactile stimulation unit 200b increases as the distance between the intermediate position and the target perceived position decreases.
- specifically, the output control unit 106 may change the output intensity of the tactile stimulation unit 200a so that it increases as the distance between the contact position of the tactile stimulation unit 200a and the target perceived position increases. The same applies to the tactile stimulation unit 200b (that is, the reverse relationship holds).
- the output control unit 106 changes the ratio between the output intensity of the haptic stimulation unit 200a and the output intensity of the haptic stimulation unit 200b based on the positional relationship between the intermediate position and the target perceived position.
- FIG. 13 is an explanatory diagram showing an example of a scene to which the above output intensity adjustment is applied. FIGS. 13A and 13C show situations where the target perceived position 50 coincides with the contact position of the tactile stimulation unit 200a or the contact position of the tactile stimulation unit 200b.
- FIG. 13B shows the timing at which the target perceived position 50 is located at an intermediate position between the two tactile stimulation units 200.
- at the timing shown in FIG. 13A, the output control unit 106 sets the output intensity of the tactile stimulation unit 200a to “1” and the output intensity of the tactile stimulation unit 200b to “0”. At the timing shown in FIG. 13B, the output control unit 106 sets the output intensity of the tactile stimulation unit 200a to “1.5” and the output intensity of the tactile stimulation unit 200b to “1.5”. At the timing shown in FIG. 13C, the output control unit 106 sets the output intensity of the tactile stimulation unit 200a to “0” and the output intensity of the tactile stimulation unit 200b to “1”. That is, when the distance between the intermediate position of the two tactile stimulation units 200 and the target perceived position 50 is smaller (FIG. 13B), the output control unit 106 adjusts the output intensities of the tactile stimulation unit 200a and the tactile stimulation unit 200b so that the sum of the output intensities of the two tactile stimulation units 200 is larger than when that distance is larger (FIGS. 13A and 13C).
- the output control unit 106 obtains a function g0(x), as shown in FIG. 14, by multiplying the function f0(x) shown in FIG. 6 by an adjustment function h0(x). The function g0(x) indicates the output intensity of the tactile stimulation unit 200a when the target perceived position is x.
- the adjustment function h0 (x) is a function that increases the output intensity of the haptic stimulation unit 200a in proportion to the distance between the haptic stimulation unit 200a and the target perceived position.
- similarly, the output control unit 106 obtains a function g1(x), as shown in FIG. 14, by multiplying the function f1(x) shown in FIG. 6 by an adjustment function h1(x) as shown in the following formula (2).
- the function g1 (x) is a function indicating the output intensity of the haptic stimulation unit 200b when the target perceived position is x.
- the adjustment function h1 (x) is a function that increases the output intensity of the haptic stimulation unit 200b in proportion to the distance between the haptic stimulation unit 200b and the target perceived position.
- the function f0(x) and the function f1(x) are not limited to square root functions as shown in FIG. 6, and may be logarithmic functions or linear functions (such as first-order functions).
- FIG. 15 shows a graph of the case where the target perceived intensity is set to “1” over the entire section between the two tactile stimulation units 200, vibration is generated in the tactile stimulation unit 200a using the function g0(x) shown in FIG. 14, and vibration is generated in the tactile stimulation unit 200b using the function g1(x) shown in FIG. 14. As shown in FIG. 15, the perceived intensity can be kept substantially constant over the entire section between the tactile stimulation unit 200a and the tactile stimulation unit 200b. For example, even if the target perceived position is located near the intermediate position, the perceived intensity hardly decreases.
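- the adjustment described above can be sketched in code. In the following illustrative Python sketch, f0(x) = √(1 − x) and f1(x) = √x stand in for the square-root functions of FIG. 6, while h0(x) and h1(x) are assumed linear adjustment functions with an illustrative gain C; the actual adjustment formulas of the specification are not reproduced here:

```python
import math

# Illustrative sketch of the two-actuator output intensity adjustment.
# f0/f1 model the square-root functions of FIG. 6; h0/h1 are assumed
# linear adjustment functions, and the gain C is an illustrative value,
# not taken from the specification.

C = 1.0  # assumed gain of the adjustment functions

def f0(x):
    """Base output of tactile stimulation unit 200a; x in [0, 1] is the
    target perceived position between the two contact positions."""
    return math.sqrt(1.0 - x)

def f1(x):
    """Base output of tactile stimulation unit 200b."""
    return math.sqrt(x)

def h0(x):
    """Grows with the distance x from unit 200a's contact position."""
    return 1.0 + C * x

def h1(x):
    """Grows with the distance (1 - x) from unit 200b's contact position."""
    return 1.0 + C * (1.0 - x)

def g0(x):
    """Adjusted output intensity of unit 200a (g0 = f0 * h0)."""
    return f0(x) * h0(x)

def g1(x):
    """Adjusted output intensity of unit 200b (g1 = f1 * h1)."""
    return f1(x) * h1(x)

for x in (0.0, 0.5, 1.0):
    print(f"x={x:.1f}  g0={g0(x):.3f}  g1={g1(x):.3f}  sum={g0(x) + g1(x):.3f}")
```

as in FIGS. 13A to 13C, each output is 1 or 0 when the target perceived position coincides with a contact position, while the summed output near the intermediate position exceeds 1, compensating for the perceptual dip there.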
- planar adjustment of the output intensity of the plurality of tactile stimulation units 200 will be described. More specifically, an example of adjusting the output intensity of the three tactile stimulation units 200 when the target perceived position is located inside a triangle defined by the contact positions of the three tactile stimulation units 200 will be described.
- for each of the three tactile stimulation units 200, the output control unit 106 changes the output intensity of that tactile stimulation unit 200 based on the distance between its contact position and the target perceived position.
- FIG. 16 shows an example in which the point A is the target perceived position.
- the output control unit 106 first (temporarily) calculates the output intensity of each of the three tactile stimulation units 200 based on the positional relationship between the point A and the contact positions (A0, A1, A2) of the three tactile stimulation units 200 located in the vicinity of the point A. Then, for each of the three tactile stimulation units 200, the output control unit 106 changes (corrects) the calculated output intensity based on the distance between the contact position of that tactile stimulation unit 200 and the target perceived position (L0, L1, or L2 in the example shown in FIG. 16).
- according to this control example, the perceived intensity (actually perceived by the user) can be kept substantially constant at all positions within the triangle. For example, even if the target perceived position is located near the center of gravity of the triangle, the perceived intensity hardly decreases.
- the output control unit 106 may set the parameters of the adjustment functions applied to the individual tactile stimulation units 200 all to the same predetermined value.
- the contact position of the tactile stimulation unit 200 with respect to the user's body may be unknown.
- for example, when the device including the tactile stimulation unit 200 is a belt-type device, the user may wear the device on the wrist or the ankle. Therefore, when the contact position of the tactile stimulation unit 200 is unknown, the output control unit 106 may determine the output intensity of each tactile stimulation unit 200 and the parameter values of the adjustment functions on the assumption that the individual tactile stimulation units 200 are positioned at equal intervals.
- the contact pressure of the tactile stimulation unit 200 against the user's body may be unknown.
- the output control unit 106 may set the parameter of the output function of the haptic stimulation unit 200 to a predetermined value.
- when one tactile stimulation unit 200 fails, the output control unit 106 can change the parameter values of the output intensity adjustment functions of the tactile stimulation units 200 located in the vicinity of that tactile stimulation unit 200 (from the values before the failure). Thereby, even if one tactile stimulation unit 200 breaks down, a gap in the tactile stimulation presentation area for the user can be prevented.
- alternatively, the output control unit 106 may stop the generation of vibration by all the tactile stimulation units 200.
- the output control unit 106 may stop the output of tactile stimulation by the corresponding tactile stimulation unit 200 and change the parameter values of the output intensity adjustment functions of the tactile stimulation units 200 located in its vicinity (from the values before the cancellation).
- the output control unit 106 can further determine the moving speed of the target position. For example, the output control unit 106 may set the moving speed of the target position to a predetermined value or may change it dynamically.
- the output control unit 106 may dynamically change the moving speed of the target position in accordance with the content to be reproduced (by the content reproduction unit 102).
- the output control unit 106 may change the movement speed of the target position based on, for example, the measurement result of the rotation speed of the user's body or the operation speed of the user with respect to a predetermined device.
- the rotation speed of the user's body can be specified based on measurement results from various sensors, such as an acceleration sensor and a gyroscope, that the user carries or wears.
- the various sensors may be included in the jacket 20, the tactile stimulation unit 200, or another device that the user carries or wears.
- the rotational speed of the user's body may be specified by combining measurement results obtained by different sensors (for example, an acceleration sensor included in the tactile stimulation unit 200 and a gyroscope included in the jacket 20).
- the output control unit 106 may dynamically change the output intensities of the plurality of tactile stimulation units 200 based on a predetermined criterion. For example, the output control unit 106 can dynamically change the output intensities of the plurality of tactile stimulation units 200 based on the currently perceived position (or range of perceived positions). As an example, as illustrated in FIG. 18, while continuously moving the target perceived position range 230 along the target movement path 220, the output control unit 106 makes the perceived intensity relatively larger only in the perceived position range 240 currently being moved. As a result, the perceived position currently being moved can be emphasized and presented to the user.
- the output control unit 106 can dynamically change the output intensity of the plurality of tactile stimulation units 200 based on the prediction of the movement direction of the perceived position. For example, the output control unit 106 may relatively reduce the output intensity at the movement source or the movement destination of the perceived position. Thereby, the contrast of the movement of the perceived position can be emphasized and presented to the user.
- the output control unit 106 can dynamically change the output intensities of the plurality of tactile stimulation units 200 based on the positional relationship between a predetermined region in the user's body and the perceived position. For example, the output control unit 106 may control the generation of vibrations for the plurality of tactile stimulation units 200 so that the perceived position moves while avoiding a predetermined region in the user's body.
- the predetermined area may be a predetermined part such as the vicinity of the heart or an injured area.
- the output control unit 106 can also dynamically change the output intensities of the plurality of tactile stimulation units 200 in accordance with the part including the target perceived position. For example, when the target perceived position is included in a highly sensitive part, the output control unit 106 relatively weakens the output intensity of the plurality of tactile stimulation units 200. In addition, when the target perceived position is included in a portion with low sensitivity, the output control unit 106 relatively increases the output intensity of the plurality of tactile stimulation units 200. Alternatively, the output control unit 106 may change the frequency of vibrations generated by the plurality of tactile stimulation units 200 (instead of the output intensity) according to the part including the target perceived position. Accordingly, a desired perceptual intensity can be presented to the user at the perceived position without depending on the site including the perceived position.
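- as one way to realize the part-dependent control above, the output intensity could be divided by a per-part sensitivity factor, as in the following sketch. The sensitivity values are fabricated example data, not taken from the specification:

```python
# Illustrative sketch of scaling output intensity by the sensitivity of
# the body part containing the target perceived position. The table
# holds fabricated example values.

SENSITIVITY = {
    "fingertip": 2.0,  # highly sensitive part -> weaker output
    "forearm": 1.0,
    "back": 0.6,       # less sensitive part -> stronger output
}

def scaled_output(base_output, part):
    """Weaken the output on sensitive parts, strengthen it on insensitive ones."""
    return base_output / SENSITIVITY.get(part, 1.0)

print(scaled_output(1.0, "fingertip"))
print(scaled_output(1.0, "back"))
```

an analogous table could instead select a vibration frequency per part, matching the alternative mentioned above.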
- the output control unit 106 can also dynamically change the output intensities of the plurality of tactile stimulation units 200 according to the moving speed of the target perceived position. For example, the output control unit 106 dynamically changes the peak value (maximum value) of the output intensity of the plurality of tactile stimulation units 200 or the amount of change in the output intensity. As an example, the output control unit 106 decreases the maximum value of the output intensity or the amount of change in the output intensity as the moving speed of the target perceived position increases.
- the characteristics related to the generation of vibration may vary depending on the individual tactile stimulation unit 200. Therefore, for example, when the characteristics of the individual haptic stimulation units 200 are known in advance, the output control unit 106 further sets the output intensity of the individual haptic stimulation units 200 according to the characteristics of the individual haptic stimulation units 200. It may be changed dynamically. For example, the output control unit 106 changes the ratio of the output intensities of the individual haptic stimulation units 200 according to the characteristics of the individual haptic stimulation units 200.
- the output control unit 106 may further dynamically change the output intensity of each tactile stimulation unit 200 (or the parameter values of the output intensity adjustment function) based on the measurement result or estimation result of the contact pressure of each tactile stimulation unit 200.
- the contact pressure of the haptic stimulation unit 200 can be measured or estimated by the following method. For example, when the tactile stimulation unit 200 includes a pressure sensor, the contact pressure of the tactile stimulation unit 200 can be measured by the pressure sensor. Alternatively, the contact pressure of the haptic stimulation unit 200 can be estimated based on the analysis of the change in the current value in the haptic stimulation unit 200.
- the relationship between the contact pressure on the human body (or other object) and the current value in the haptic stimulation unit 200 can be obtained in advance by a developer, for example.
- the contact pressure of the haptic stimulation unit 200 can be estimated based on the measurement result of the current value in the haptic stimulation unit 200 and the relationship.
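- a sketch of this current-based estimation, assuming the previously obtained relationship is stored as a lookup table queried with linear interpolation. The calibration values below are fabricated example data:

```python
# Illustrative sketch of estimating contact pressure from the actuator's
# current draw. The current/pressure pairs are fabricated example data
# standing in for the relationship a developer would measure in advance.

CALIBRATION = [  # (current in mA, contact pressure in kPa) - example data
    (100.0, 0.0),
    (120.0, 5.0),
    (150.0, 12.0),
    (200.0, 25.0),
]

def estimate_pressure(current_ma):
    """Linearly interpolate the pre-measured current/pressure relationship."""
    pts = sorted(CALIBRATION)
    if current_ma <= pts[0][0]:
        return pts[0][1]
    if current_ma >= pts[-1][0]:
        return pts[-1][1]
    for (c0, p0), (c1, p1) in zip(pts, pts[1:]):
        if c0 <= current_ma <= c1:
            t = (current_ma - c0) / (c1 - c0)
            return p0 + t * (p1 - p0)

print(estimate_pressure(135.0))  # between the 120 mA and 150 mA points
```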
- when the tactile stimulation unit 200 is worn on the user's body (for example, when the user wears the jacket 20), the contact pressure of the tactile stimulation unit 200 can be estimated based on the measurement result of the posture of the tactile stimulation unit 200 and / or the posture of the user.
- the posture of the tactile stimulation unit 200 can be measured based on, for example, a gyroscope and an acceleration sensor built in the tactile stimulation unit 200.
- the posture of the user can be measured based on a gyroscope and an acceleration sensor built in another device that the user carries or wears.
- the communication unit 120 transmits and receives information to and from other devices. For example, the communication unit 120 transmits a control signal for the output of the tactile stimulus to each of the plurality of tactile stimulation units 200 (or the jacket 20) according to the control of the output control unit 106. In addition, according to the control of the content reproduction unit 102, the communication unit 120 transmits a control signal for displaying an image to be reproduced to the display device 30, and transmits a control signal for outputting the sound to be reproduced to each of the plurality of audio output units 202 (or the jacket 20).
- Storage unit 122 stores various data and various software.
- FIG. 19 is a flowchart showing an operation example according to this embodiment.
- first, the target position / intensity determination unit 104 of the server 10 determines the target perceived position of the tactile stimulus and the target perceived intensity at that perceived position for the time a unit time (ΔT seconds) after the current time, according to, for example, the content currently being reproduced (S101).
- the output control unit 106 specifies the three tactile stimulation units 200 located in the vicinity of the target perceived position (S103).
- the output control unit 106 performs “output intensity calculation processing” to be described later (S105).
- the process waits until ΔT seconds elapse from the time of S101 (S107).
- when ΔT seconds have elapsed (S107: Yes), the output control unit 106 causes each of the three tactile stimulation units 200 to output a tactile stimulus with the output intensity calculated in S105 (S109).
- the server 10 determines whether or not a predetermined termination condition is satisfied (S111). When it is determined that the predetermined termination condition is not satisfied (S111: No), the server 10 performs the processing subsequent to S101 again. On the other hand, when it is determined that the predetermined end condition is satisfied (S111: Yes), the server 10 ends the operation.
- the predetermined end condition may be that a predetermined time elapses from the start of this operation, the content to be reproduced is ended, or the user inputs an end instruction.
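- the flow of S101 to S111 can be sketched as a control loop. The helper callbacks stand in for the target position / intensity determination unit 104 and the output control unit 106; their names and the ΔT value are assumptions:

```python
import time

# Illustrative sketch of the operation in FIG. 19 (S101-S111).
DELTA_T = 0.02  # unit time ΔT in seconds (illustrative value)

def run_loop(decide_target, find_nearby_units, compute_intensities,
             drive_units, should_end):
    while True:
        t0 = time.monotonic()
        pos, intensity = decide_target()                      # S101
        units = find_nearby_units(pos)                        # S103
        outputs = compute_intensities(units, pos, intensity)  # S105
        remaining = DELTA_T - (time.monotonic() - t0)
        if remaining > 0:                                     # S107: wait for ΔT
            time.sleep(remaining)
        drive_units(units, outputs)                           # S109
        if should_end():                                      # S111
            break
```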
- the output control unit 106 repeats the following processes of S155 to S159 (S153).
- the output control unit 106 (temporarily) calculates the output intensity of the I-th tactile stimulation unit 200 based on the positional relationship between the target perceived position determined in S101 and the contact positions of the three tactile stimulation units 200, and on the target perceived intensity determined in S101 (S155).
- the output control unit 106 changes (corrects) the output intensity calculated in S155 based on the distance between the perceived position of the target and the contact position of the I-th tactile stimulation unit 200 (S157). Then, the output control unit 106 adds “1” to I (S159).
- as described above, the server 10 changes the output intensity of one or more tactile stimulation units 200 corresponding to predetermined position information on the user's body, according to that position information and the target perceived intensity of the tactile stimulus related to that position information. Thereby, the user can be made to perceive a tactile stimulus of a strength adapted to the position information on the user's body. For example, the perceived position can be moved continuously between the contact positions of the plurality of tactile stimulation units 200 while the user perceives the tactile stimulus (vibration or the like) at a substantially constant perceived intensity during the movement.
- FIG. 21 is a diagram illustrating an example of an image 40 displayed by the display device 30 according to Application Example 1.
- the image 40 is an image (for example, a three-dimensional CG (Computer Graphics) image) generated based on the user's viewpoint.
- the image 40 includes a plurality of ball-shaped objects 400.
- the object 400 is an example of a target area in the present disclosure.
- the display is changed from the image 40a to the image 40b so that an animation is displayed in which the object 400a moves to the left front in the image 40, that is, as if the object 400a hits the left side of the user's body.
- the output control unit 106 specifies the position (collision position) on the user's body corresponding to the position of the object 400a in the image 40b.
- the collision position is near the contact position of the tactile stimulation unit 200e.
- the output control unit 106 controls the output of tactile stimulation to the plurality of tactile stimulation units 200 positioned around the set 230 of target perceived positions so that the set 230 of target perceived positions spreads outward. Thereby, a sensation that the impact is transmitted from the collision position to the surroundings can be presented to the user.
- the content playback unit 102 causes the two audio output units 202 to output an impact sound as if the impact sound were heard from the collision position on the user's body, as indicated by the two dashed curves in the figure. For example, since the collision position is located on the left side of the user's body, the content playback unit 102 causes the audio output unit 202b on the left side of the user's body to output the impact sound at a high volume, and causes the audio output unit 202a on the right side of the user's body to output the impact sound at a low volume.
- FIG. 23 is a diagram illustrating another example of the image 40 displayed by the display device 30.
- the image 40 includes a bowl-shaped object 410.
- the object 410 is an example of a target area in the present disclosure.
- the display is changed from the image 40a to the image 40b so that the object 410 moves to the left front in the image 40, that is, an animation is displayed as if the tip of the object 410 hits the left side of the user's body.
- the target position / intensity determination unit 104 determines the movement path 220 of the target perceived position based on the movement locus of the object 410 in the image 40.
- the target position / intensity determination unit 104 first specifies the position (collision position) on the user's body corresponding to the position of the tip of the object 410 in the image 40b. In the example shown in FIG. 25, the collision position is near the contact position of the tactile stimulation unit 200e. Then, the target position / intensity determination unit 104 determines the movement path 220 of the target perceived position based on the specified position and the movement trajectory of the object 410 in the image 40. In the example illustrated in FIG. 24, the movement path 220 of the target perceived position is a path connecting a position in the vicinity of the tactile stimulation unit 200e positioned on the left front of the user and a position in the vicinity of the tactile stimulation unit 200i positioned on the back of the left side of the user. Further, the target position / intensity determination unit 104 determines a range 230 of the target perceived position.
- the output control unit 106 controls the output of tactile stimulation to the plurality of tactile stimulation units 200, such as the tactile stimulation unit 200e and the tactile stimulation unit 200i, so that the target perceived position range 230 perceived by the user moves continuously along the movement path 220 of the target perceived position. As a result, a sensation of the object 410 piercing from the left front of the user toward the left rear, starting from the collision position, can be presented to the user.
- the content playback unit 102 causes the two audio output units 202 to output an impact sound as if the impact sound were heard from the collision position on the user's body, as indicated by the two dashed curves in the figure. For example, the content reproduction unit 102 causes the audio output unit 202b on the left side of the user's body to output the impact sound at a high volume, and causes the audio output unit 202a on the right side of the user's body to output the impact sound at a low volume.
- FIG. 26 is a diagram showing still another example of the image 40 displayed by the display device 30.
- the image 40 includes a monster object 420.
- the object 420 is an example of a target area in the present disclosure.
- the output control unit 106 first specifies the position (collision position) on the user's body corresponding to the position of the object 420 (for example, the position of the hand of the object 420) in the image 40a. In the example shown in FIG. 27, the collision position is near the contact position of the tactile stimulation unit 200a.
- the target position / intensity determination unit 104 determines the movement path 220 of the target perceived position based on the specified position and the movement locus (attack locus) of the object 420 in the image 40.
- the movement path 220 of the target perceived position is a path connecting the vicinity of the contact position of the tactile stimulation unit 200a located on the right side of the user and the vicinity of the contact position of the tactile stimulation unit 200b located on the left side of the user. Further, the target position / intensity determination unit 104 determines a range 230 of the target perceived position.
- the output control unit 106 controls the output of tactile stimulation to the plurality of tactile stimulation units 200, such as the tactile stimulation unit 200a and the tactile stimulation unit 200e, so that the range 230 of the target perceived position perceived by the user moves continuously along the target movement path 220. Accordingly, a sensation of being attacked by the object 420 from the right side toward the left side of the user, starting from the collision position, can be presented.
- the content reproduction unit 102 causes the two audio output units 202 to output an impact sound so that the impact sound is heard along the movement locus 250 of the impact sound as time passes.
- the impact sound can be output in conjunction with the movement of the perceived position along the movement path 220.
- for example, the content reproduction unit 102 changes the ratio between the volume of the impact sound output from the audio output unit 202a on the right side of the user's body and the volume of the impact sound output from the audio output unit 202b on the left side of the user's body, according to the movement of the target perceived position.
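- one possible realization of this left/right volume ratio is an equal-power pan law, sketched below. The pan law itself is an assumption; the specification only states that the ratio changes with the target perceived position:

```python
import math

# Illustrative equal-power panning of the impact sound between the
# right (202a) and left (202b) audio output units. The pan law is an
# assumption, not taken from the specification.

def pan_volumes(x):
    """x = 0.0 at the user's right side, 1.0 at the left side."""
    theta = x * math.pi / 2.0
    right = math.cos(theta)  # volume for audio output unit 202a (right)
    left = math.sin(theta)   # volume for audio output unit 202b (left)
    return right, left

for x in (0.0, 0.5, 1.0):
    r, l = pan_volumes(x)
    print(f"x={x:.1f}  right={r:.2f}  left={l:.2f}")
```

driving x with the current target perceived position would shift the apparent sound source in step with the moving tactile stimulus.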
- the server 10 includes a CPU 150, a ROM (Read Only Memory) 152, a RAM 154, a bus 156, an interface 158, a storage device 160, and a communication device 162.
- the CPU 150 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the server 10 according to various programs. Further, the CPU 150 realizes the function of the control unit 100 in the server 10.
- the CPU 150 is configured by a processor such as a microprocessor.
- the ROM 152 stores programs used by the CPU 150 and control data such as calculation parameters.
- the RAM 154 temporarily stores a program executed by the CPU 150, for example.
- the bus 156 includes a CPU bus and the like.
- the bus 156 connects the CPU 150, the ROM 152, and the RAM 154 to each other.
- the interface 158 connects the storage device 160 and the communication device 162 to the bus 156.
- the storage device 160 is a data storage device that functions as the storage unit 122.
- the storage device 160 includes, for example, a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, or a deletion device that deletes data recorded on the storage medium.
- the communication device 162 is a communication interface composed of a communication device for connecting to the communication network 32, for example. Further, the communication device 162 may be a wireless LAN compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wire communication device that performs wired communication. This communication device 162 functions as the communication unit 120.
- the target perceived position may be located outside the presentation range of the tactile stimulation by all the tactile stimulation units 200.
- when the distance between the arrangement positions of the tactile stimulation units 200 is very large, it may be difficult for the user to perceive a tactile stimulus of the target perceived intensity in the range between the plurality of tactile stimulation units 200.
- in such a case, the server 10 may change the content being reproduced so as to give the user a sensation (illusion) that the perceived position moves outside the presentation range.
- for example, when the target perceived position moves in the left-right direction of the user's body and moves outside the presentation range, the server 10 may move the displayed image (or a specific area in the image) in the same direction as the movement direction of the target perceived position (for example, as described in “3. Application Example”), or may change the ratio of the volumes output on the left and right sides of the user's body.
- in this case, the server 10 may set the output intensity of the corresponding one or more tactile stimulation units 200 to the same output intensity as when the target perceived position is located at the boundary of the range, or may weaken the output intensity according to the distance between the target perceived position and the boundary.
- the server 10 may control the plurality of tactile stimulation units 200 arranged vertically on the user's body to output tactile stimuli in order from top to bottom or from bottom to top. Thereby, a sense of gravity, a rising sensation, or a falling sensation can be presented to the user. For example, the user can be expected to feel as if riding a moving elevator.
- the server 10 may control the output of tactile stimulation to the plurality of tactile stimulation units 200 so as to present the user with a sense that the body (eg, skin) is pulled.
- the server 10 may control the output of tactile stimulation to the one or more tactile stimulation units 200 included in a predetermined housing and / or to the plurality of tactile stimulation units 200 included in the jacket 20, in response to a user operation (such as shaking or turning) on the housing. This can further improve the resolution of spatial and aerial tactile stimulation by using other sensory organs such as somatic sensation.
- the server 10 can also control the output of tactile stimulation to the plurality of tactile stimulation units 200 so as to present the user with the illusion that the perceived position is localized between the plurality of tactile stimulation units 200. .
- for example, the output control unit 106 may sequentially change the output intensities of the two tactile stimulation units 200 so that the perceived position reciprocates between the two tactile stimulation units 200 in a short cycle (in particular, reciprocates near the intermediate position).
- the output control unit 106 first changes the output intensity of the two haptic stimulation units 200 so that the perceived position moves from one contact position of the two haptic stimulation units 200 to the intermediate position. Then, after the perceived position reaches the intermediate position, the output intensities of the two tactile stimulation units 200 may be maintained (fixed) with the same value.
- alternatively, the output control unit 106 may set the output intensity of one of the two tactile stimulation units 200 to “0” or a predetermined small value, or may gradually reduce the output intensity of the corresponding tactile stimulation unit 200. According to these control examples, the user can be expected to obtain a sensation that the perceived position is localized between the two tactile stimulation units 200.
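- the "move to the intermediate position, then hold" control described above can be sketched as a sequence of output intensity pairs. The square-root interpolation and the step counts below are illustrative assumptions:

```python
import math

# Illustrative sketch: move the perceived position from unit 200a's
# contact position (x = 0) to the intermediate position (x = 0.5), then
# keep both output intensities fixed at the same value.

def localization_sequence(steps=5, hold=3):
    seq = []
    for i in range(steps + 1):
        x = 0.5 * i / steps  # perceived position moves toward the midpoint
        seq.append((math.sqrt(1.0 - x), math.sqrt(x)))
    seq.extend([seq[-1]] * hold)  # hold (fix) both intensities at the same value
    return seq

for a, b in localization_sequence():
    print(f"unit 200a = {a:.2f}   unit 200b = {b:.2f}")
```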
- the amplitude of the output intensity waveform of the tactile stimulation unit 200 (determined based on the target perceived intensity waveform) can be adjusted so that the maximum value of the output intensity (amplitude) is equal to or less than a predetermined value. For example, even when the output intensity waveform is multiplied by the output intensity adjustment function (described above), the amplitude of the waveform can be adjusted so that its maximum value does not exceed the predetermined value.
- the amplitude of the output intensity waveform may be adjusted in advance or may be adjusted in real time.
- the parameter value of the output intensity adjustment function (described above) may be adjusted such that the maximum value of the amplitude is equal to or less than a predetermined value.
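The amplitude adjustment described above can be sketched as a simple rescaling. Here the elementwise gain list stands in for the per-sample effect of the output intensity adjustment function, which is an assumption; the function itself is not specified at this point.

```python
def clamp_waveform(samples, gains, max_amplitude):
    """Rescale `samples` so that max |sample * gain| <= max_amplitude,
    where `gains` models the per-sample effect of the output intensity
    adjustment function applied downstream."""
    peak = max(abs(s * g) for s, g in zip(samples, gains))
    if peak <= max_amplitude:
        return list(samples)  # already within the limit
    scale = max_amplitude / peak
    return [s * scale for s in samples]
```

This can be run once in advance over a registered waveform, or per buffer in real time, matching both adjustment timings mentioned above.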
- the perceived intensity of the tactile stimulus may vary depending on the frequency of vibration.
- a human may feel vibration most strongly when the vibration frequency is about 200 Hz, and may feel it more weakly at frequencies above 200 Hz. Therefore, the output control unit 106 may change the frequency of the vibration generated by the plurality of tactile stimulation units 200 (instead of changing the amplitude) based on the target perceived position and the target perceived intensity. Thereby, a tactile stimulus having the target perceived intensity can be presented to the user at the target perceived position (as in the case of changing the amplitude).
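One possible mapping from a relative target perceived intensity to a drive frequency, exploiting the stated peak sensitivity near 200 Hz and the weaker perception above it, is sketched below. The linear falloff and the 400 Hz ceiling are illustrative assumptions, not values from the publication; real vibrotactile sensitivity curves are nonlinear.

```python
def frequency_for_intensity(relative_intensity, peak_hz=200.0, max_hz=400.0):
    # relative_intensity in (0, 1]: 1.0 maps to the most strongly felt
    # frequency (~200 Hz); weaker targets map to higher, more weakly
    # felt frequencies.
    if not 0.0 < relative_intensity <= 1.0:
        raise ValueError("relative_intensity must be in (0, 1]")
    return peak_hz + (1.0 - relative_intensity) * (max_hz - peak_hz)
```

With this stand-in curve, a full-intensity target drives the actuator at 200 Hz, while a half-intensity target is rendered at 300 Hz instead of by halving the amplitude.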
- the foregoing description has mainly taken, as an example, the case where the tactile stimulation unit 200 generates vibration as the tactile stimulus. However, the present disclosure is not limited to this example; the tactile stimulation unit 200 may output temperature, force-sense information, electrical stimulation, or the like as the tactile stimulus.
- the server 10 can present a target temperature at a position between the plurality of tactile stimulation units 200 by individually adjusting the temperatures of the plurality of tactile stimulation units 200 arranged apart from each other on the user's body.
- the server 10 can present the target perceived intensity at a position between the plurality of tactile stimulation units 200 by adjusting the strength of the force-sense information output by the plurality of tactile stimulation units 200.
- the server 10 can present the target perceived intensity at a position between the plurality of tactile stimulation units 200 by adjusting the strength of the electrical stimulation output by the plurality of tactile stimulation units 200.
- the present disclosure is not limited to this example, and the position information may be position information in a space. For example, the position information may be position information in real space.
- a target perceived position (or a region of target perceived positions, a path of the target perceived position, or the like) may be set in real space. Then, when it is measured that the user has moved to the location corresponding to the target perceived position, the server 10 may cause the plurality of tactile stimulation units 200 to generate vibrations, or may change the output intensity of the vibration of each of the plurality of tactile stimulation units 200.
- the position information may also be position information in a virtual space.
- a target perceived position (or a region of target perceived positions, a path of the target perceived position, or the like) can be set in the virtual space. Then, when it is detected that the object corresponding to the user has moved to the target perceived position in the virtual space, the server 10 may cause the plurality of tactile stimulation units 200 to generate vibrations, or may change the output intensity of the vibration of each of the plurality of tactile stimulation units 200.
- the information processing apparatus may be a PC (Personal Computer), a tablet terminal, a mobile phone such as a smartphone, a game machine, a portable music player, a wearable apparatus such as an HMD or a wristwatch-type device, or the jacket 20.
- the example in which the tactile stimulation unit according to the present disclosure is included in the jacket 20 has been mainly described. However, the embodiment is not limited thereto, and the tactile stimulation unit may be included in another type of apparatus; for example, it may be included in a wristwatch-type or wristband-type device, or in a chair.
- each step in the operation of the above-described embodiment does not necessarily have to be processed in the described order. For example, the steps may be processed in an appropriately changed order. The steps may also be processed partly in parallel or individually instead of in time series. Further, some of the described steps may be omitted, or another step may be added.
- An output control unit configured to control the output of tactile stimulation to at least two tactile stimulation units;
- the information processing apparatus, wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
- the output control unit changes the output intensity of the tactile stimulation unit corresponding to the predetermined position information in accordance with the predetermined position information and the information regarding the tactile output related to the position information.
- The information processing apparatus according to (1). (3) The information processing apparatus according to (1) or (2), wherein the predetermined position information is position information on the user's body.
- the information processing apparatus according to (7), wherein the predetermined position information is changed according to a type or volume of sound output to the user.
- the information processing apparatus according to any one of (3) to (8), wherein the predetermined position information is position information that is moved along a target route determined for the body.
- the information processing apparatus according to (9), wherein the predetermined position information is moved on the target route as time elapses.
- the at least two or more tactile stimulation units include a first tactile stimulation unit and a second tactile stimulation unit,
- the information processing apparatus according to (10), wherein the target path is a path that connects the contact position of the first tactile stimulation unit on the body and the contact position of the second tactile stimulation unit on the body.
- the target path is a path that connects a first position on a first surface of the body, an interior of the body, and a second position on a second surface facing the first surface; the information processing apparatus according to any one of (9) to (11).
- the at least two or more tactile stimulation units include a first tactile stimulation unit and a second tactile stimulation unit,
- the first position is a contact position of the first tactile stimulation unit on the first surface
- the information processing apparatus according to (12), wherein the second position is a contact position of the second tactile stimulation unit on the second surface.
- the first surface is the front of the user;
- the predetermined position information is changed so that the distance between the third position on the body and the position indicated by the predetermined position information increases with time, (3) to (14) The information processing apparatus according to any one of the above.
- the at least two or more tactile stimulation units include a first tactile stimulation unit,
- the output control unit changes the output of the first tactile stimulation unit based on the distance between the contact position of the first tactile stimulation unit on the body and the position indicated by the predetermined position information.
- the at least two or more tactile stimulation units further include a second tactile stimulation unit,
- the output control unit changes the output of the second tactile stimulation unit based on the distance between the contact position of the second tactile stimulation unit on the body and the position indicated by the predetermined position information.
- the information processing apparatus according to (17). (19) Controlling the output of tactile stimulation to at least two tactile stimulation units;
- a processor changing the output of the tactile stimulation unit corresponding to the predetermined position information in accordance with the predetermined position information and information regarding the tactile output related to the position information; an information processing method including the above.
- (20) A program for causing a computer to function as an output control unit configured to control the output of tactile stimulation to at least two or more tactile stimulation units, wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
- 10 server 20 jacket 30 display device 32 communication network 100 control unit 102 content reproduction unit 104 target position/intensity determination unit 106 output control unit 120 communication unit 122 storage unit 200 tactile stimulation unit 202 audio output unit
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
1. Configuration of the information processing system
2. Detailed description of the embodiment
3. Application examples
4. Hardware configuration
5. Modified examples
First, the configuration of an information processing system according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing the configuration of the information processing system according to the present embodiment. As shown in FIG. 1, the information processing system includes a server 10, a display device 30, and a communication network 32. In the present embodiment, as shown in FIG. 1, a user 2 can wear a jacket 20, which will be described later.
FIG. 2 is a diagram showing the appearance of the jacket 20. As shown in FIG. 2, the jacket 20 has a plurality of tactile stimulation units 200 and two audio output units 202 inside the jacket 20. For example, a predetermined number (for example, six) of tactile stimulation units 200 can be arranged inside the jacket 20 on each of the front side and the back side of the user. As one example, the individual tactile stimulation units 200 are arranged in a positional relationship in which each tactile stimulation unit 200 arranged on the front side faces a corresponding tactile stimulation unit 200 arranged on the back side. Although FIG. 2 shows an example in which the jacket 20 is a sleeveless garment, the jacket 20 is not limited to this example and may have sleeves. In this case, one or more tactile stimulation units 200 may be arranged not only at positions corresponding to the user's chest and abdomen but also at positions corresponding to both arms of the user.
The tactile stimulation unit 200 outputs a tactile stimulus such as vibration in accordance with, for example, a control signal received from the server 10. In the following, the description will focus on an example in which the tactile stimulation unit 200 generates vibration as the tactile stimulus.
The audio output unit 202 outputs sound in accordance with, for example, a control signal received from the server 10. As shown in FIG. 2, one audio output unit 202 can be arranged on each of the left and right sides of the jacket 20. For example, the audio output units 202 are arranged so as to be positioned on or near the user's shoulders when the jacket 20 is worn. However, the arrangement is not limited to this example; only one audio output unit 202 may be arranged on the jacket 20, or three or more may be arranged. Further, instead of being included in the jacket 20, the audio output unit 202 may be arranged in the predetermined space as an independent device, or may be included in a wearable device different from the jacket 20 (for example, headphones or a headset) or in a portable device (for example, a portable music player, a smartphone, or a portable game machine).
The display device 30 is a device that displays image information. For example, the display device 30 projects image information onto a projection target 4 in accordance with a control signal received from the server 10 described later. Although FIG. 1 shows an example in which the display device 30 is a projector, the display device 30 is not limited to this example and may be a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, or the like. The display device 30 may also be included in a portable device such as a tablet terminal or a smartphone, or in a wearable device such as an HMD or AR (Augmented Reality) glasses. In these cases, the audio output unit 202 and the display device 30 may be included in the same device.
The server 10 is an example of the information processing apparatus according to the present disclosure. The server 10 is a device that controls the output of tactile stimulation to the plurality of tactile stimulation units 200 (or the jacket 20). For example, the server 10 controls the generation of vibration by each of the plurality of tactile stimulation units 200 included in the jacket 20. Alternatively, when the jacket 20 has a function of controlling the plurality of tactile stimulation units 200, the server 10 can instruct the jacket 20 to generate vibration by each of the plurality of tactile stimulation units 200.
The communication network 32 is a wired or wireless transmission path for information transmitted from devices connected to the communication network 32. For example, the communication network 32 may include public line networks such as a telephone network, the Internet, and a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like. The communication network 32 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
The configuration of the information processing system according to the present embodiment has been described above. Incidentally, the intensity at which the user actually perceives the vibration of the tactile stimulation unit 200 (hereinafter referred to as the perceived intensity) can decrease with the distance from the tactile stimulation unit 200. For example, the perceived intensity of the vibration of the tactile stimulation unit 200 can decrease inversely in proportion to the distance from the tactile stimulation unit 200.
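A toy model of this falloff can be sketched as follows, assuming the perceived intensity equals the output intensity at the contact position and decreases inversely in proportion to distance. The exact law and the 1 cm distance scale are assumptions for illustration; the real falloff is device- and body-dependent.

```python
def perceived_intensity(output_intensity, distance_cm):
    # Equal to the output intensity at the contact position (distance 0)
    # and inversely proportional to distance beyond it; the 1 cm scale
    # constant is an assumed value, not taken from the text.
    return output_intensity / (1.0 + distance_cm)
```

Inverting such a model is what lets the output control unit described below choose per-unit output intensities that produce a target perceived intensity at a point away from any single actuator.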
<2-1. Configuration>
Next, the configuration of the server 10 according to the present embodiment will be described in detail. FIG. 8 is a functional block diagram showing a configuration example of the server 10 according to the present embodiment. As shown in FIG. 8, the server 10 includes a control unit 100, a communication unit 120, and a storage unit 122.
The control unit 100 comprehensively controls the operation of the server 10 using hardware built into the server 10, such as a CPU (Central Processing Unit) 150 and a RAM (Random Access Memory) 154, which will be described later. As shown in FIG. 8, the control unit 100 includes a content reproduction unit 102, a target position/intensity determination unit 104, and an output control unit 106.
The content reproduction unit 102 controls the reproduction of content. For example, when the content to be reproduced includes an image, the content reproduction unit 102 controls display of the image on the display device 30. When the content to be reproduced includes sound, the content reproduction unit 102 controls output of the sound from the audio output unit 202.
(2-1-3-1. Determination of the target perceived position)
The target position/intensity determination unit 104 determines a target perceived position of a tactile stimulus at a predetermined timing and a target perceived intensity at that perceived position. Here, the target perceived position can basically be set with respect to the user's body. For example, the target position/intensity determination unit 104 determines, according to the content to be reproduced, the target perceived position of the tactile stimulus at the predetermined timing and the target perceived intensity at that perceived position. As one example, the target position/intensity determination unit 104 determines the target perceived position and the target perceived intensity at each timing according to the image displayed and/or the sound output at each timing during the reproduction period of the content to be reproduced.
For example, when a target movement path is associated in advance with the content to be reproduced, the target position/intensity determination unit 104 determines the target perceived position at each timing during the reproduction period of the content based on the target movement path. Alternatively, the target position/intensity determination unit 104 may determine, in real time, the target perceived position at a predetermined future timing according to the current (target) perceived position and the content currently being reproduced.
Further, when a target movement region is associated in advance with the content to be reproduced, the target position/intensity determination unit 104 can determine a set (region) of target perceived positions at each timing during the reproduction period of the content based on the target movement region. Alternatively, the target position/intensity determination unit 104 may determine, in real time, a set of target perceived positions at a predetermined future timing according to the current set of (target) perceived positions and the content currently being reproduced.
The target position/intensity determination unit 104 can also determine the target perceived intensity at each timing during the reproduction period of the content according to the content to be reproduced.
For example, a target movement path (or target movement region) and the target perceived intensity at each position on the target movement path (or in the target movement region) may be associated in advance. In this case, the target position/intensity determination unit 104 first determines the target movement path according to, for example, the content to be reproduced, and then determines the target perceived intensity at each timing during the reproduction period of the content based on the target movement path. The target perceived intensities at the respective positions on the target movement path (or in the target movement region) may all be set to the same value, or may be set to differ from position to position. The target perceived intensity at each position on the target movement path (or in the target movement region) may also be set manually by the user.
Alternatively, a waveform of the target perceived intensity may be registered in advance. In this case, the target position/intensity determination unit 104 can determine, for example in real time, the target perceived intensity at the determined target perceived position based on the waveform of the target perceived intensity and the determined target perceived position. The waveform of the target perceived intensity may be a constant function, or may be registered in association with the content to be reproduced.
(2-1-4-1. Continuous movement of the target position)
The output control unit 106 controls the generation of vibration by the plurality of tactile stimulation units 200 corresponding to the target perceived position, according to the target perceived position and the target perceived intensity determined by the target position/intensity determination unit 104. For example, the output control unit 106 first identifies a plurality of (for example, three) tactile stimulation units 200 located in the vicinity of the current target perceived position. The output control unit 106 then determines the output intensity of each of the plurality of tactile stimulation units 200 based on the positional relationship between each of the plurality of tactile stimulation units 200 and the target perceived position, and on the current target perceived intensity. Thereby, for example, a vibration of the current target perceived intensity can be perceived by the user at the current target perceived position.
For example, as shown in FIG. 9, suppose that a target movement path 220 is set on the surface of the user's body and that the target perceived intensity along the target movement path 220 is set to be constant. In this case, the output control unit 106 adjusts the output intensities of the corresponding plurality of tactile stimulation units 200 so that a vibration of the same perceived intensity moves continuously over the positions on the target movement path 220 as time elapses.
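The continuous movement along a target movement path can be sketched as linear interpolation along a polyline of body-surface waypoints; the 2D coordinates, the constant speed, and the waypoint representation are illustrative assumptions, not details from the publication.

```python
import math

def position_on_path(waypoints, speed, elapsed):
    """Target perceived position after `elapsed` seconds of movement at
    `speed` along a polyline of body-surface waypoints (2D here)."""
    remaining = speed * elapsed
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if seg > 0.0 and remaining <= seg:
            t = remaining / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        remaining -= seg
    return waypoints[-1]  # past the end of the path: clamp to the last point
```

Evaluating this at each control tick yields the continuously moving target perceived position; the per-unit output intensities are then derived from that position and the (constant) target perceived intensity.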
The output control unit 106 can also change the output intensity of a tactile stimulation unit 200 (determined based on the target perceived intensity) based on the distance between the target perceived position and the tactile stimulation unit 200 located in the vicinity of the target perceived position. For example, when a tactile stimulation unit 200a and a tactile stimulation unit 200b are located in the vicinity of the target perceived position, the output control unit 106 changes the output intensity of the tactile stimulation unit 200a based on the distance between the contact position of the tactile stimulation unit 200a on the user's body and the target perceived position. Similarly, the output control unit 106 changes the output intensity of the tactile stimulation unit 200b based on the distance between the contact position of the tactile stimulation unit 200b on the user's body and the target perceived position. Here, the tactile stimulation unit 200a is an example of the first tactile stimulation unit in the present disclosure, and the tactile stimulation unit 200b is an example of the second tactile stimulation unit in the present disclosure.
Hereinafter, the above functions will be described in more detail. First, an example of adjusting the output intensities of two tactile stimulation units 200 when the target perceived position is located between the contact positions of the two tactile stimulation units 200 will be described. For example, the output control unit 106 changes the output intensity of the tactile stimulation unit 200a and the output intensity of the tactile stimulation unit 200b based on the positional relationship between the target perceived position and the intermediate position between the contact position of the tactile stimulation unit 200a and the contact position of the tactile stimulation unit 200b. Here, the intermediate position is an example of the fourth position in the present disclosure.
Next, an example of planar adjustment of the output intensities of a plurality of tactile stimulation units 200 will be described. More specifically, an example of adjusting the output intensities of three tactile stimulation units 200 when the target perceived position is located inside the triangle defined by the contact positions of the three tactile stimulation units 200 will be described.
When a plurality of target perceived positions are set, the triangles configured for the individual target perceived positions may be determined so as to overlap each other. For example, as shown in FIG. 17, the triangle defined by the contact positions (A0, A1, A2) of the three tactile stimulation units 200 located in the vicinity of a first target perceived position (point A) may overlap the triangle defined by the contact positions (A0, A2, A3) of the three tactile stimulation units 200 located in the vicinity of a second target perceived position (point B).
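One plausible way to realize the triangle-based adjustment (the exact weighting is not spelled out at this point in the publication) is to weight the three units by the barycentric coordinates of the target perceived position within the triangle of contact positions:

```python
def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of point p in the triangle (a, b, c).
    Inside the triangle all three weights lie in [0, 1] and sum to 1."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return wa, wb, 1.0 - wa - wb
```

Splitting a target perceived intensity among the three units in proportion to these weights places the strongest contribution on the nearest actuator and degrades gracefully as the target crosses into an adjacent, possibly overlapping, triangle.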
Here, a modified example of the above-described method of adjusting the output intensities of the plurality of tactile stimulation units 200 will be described. For example, when the perceived position (or the sensitivity characteristics) is unknown (for example, cannot be identified), the output control unit 106 may set all the parameters of the above adjustment function applied to the individual tactile stimulation units 200 to a predetermined value (the same value).
The output control unit 106 can further determine the movement speed of the target position. For example, the output control unit 106 may set the movement speed of the target position to a predetermined value, or may change it dynamically.
The output control unit 106 may also dynamically change the output intensities of the plurality of tactile stimulation units 200 based on a predetermined criterion. For example, the output control unit 106 can dynamically change the output intensities of the plurality of tactile stimulation units 200 based on the perceived position (or the range of perceived positions) currently moving. As one example, as shown in FIG. 18, the output control unit 106 continuously moves the range 230 of target perceived positions along the target movement path 220 while making only the target perceived intensity in the range 240 of currently moving perceived positions relatively larger. Thereby, the currently moving perceived position can be presented to the user with greater emphasis.
The communication unit 120 transmits and receives information to and from other devices. For example, under the control of the output control unit 106, the communication unit 120 transmits control signals for the output of tactile stimulation to each of the plurality of tactile stimulation units 200 (or to the jacket 20). Under the control of the content reproduction unit 102, the communication unit 120 also transmits a control signal for displaying the image to be reproduced to the display device 30, and transmits control signals for outputting the sound to be reproduced to each of the plurality of audio output units 202 (or to the jacket 20).
The storage unit 122 stores various data and various kinds of software.
The configuration according to the present embodiment has been described above. Next, an example of the operation according to the present embodiment will be described with reference to FIG. 19. FIG. 19 is a flowchart showing an operation example according to the present embodiment.
As shown in FIG. 19, first, the target position/intensity determination unit 104 of the server 10 determines the target perceived position of the tactile stimulus, and the target perceived intensity at that perceived position, as of a unit time (ΔT seconds) after the current time, according to, for example, the content currently being reproduced (S101).
Here, the flow of the "output intensity calculation process" in S105 will be described with reference to FIG. 20. As shown in FIG. 20, first, the output control unit 106 sets "1" to a variable I indicating the number of the tactile stimulation unit 200 to be processed among the three tactile stimulation units 200 identified in S103 (S151).
As described above, the server 10 according to the present embodiment changes the output intensity of one or more tactile stimulation units 200 corresponding to predetermined position information on the user's body according to the predetermined position information and the target perceived intensity of the tactile stimulus related to the position information. Therefore, the user can be made to perceive a tactile stimulus having an intensity adapted to the position information on the user's body. For example, the user can be made to perceive a tactile stimulus (such as vibration) whose perceived position moves continuously between the contact positions of the plurality of tactile stimulation units 200 and whose perceived intensity remains substantially constant during the movement.
<3-1. Application example 1>
Next, application examples of the above-described embodiment will be described. First, application example 1 of the present embodiment will be described with reference to FIGS. 21 and 22. FIG. 21 is a diagram showing an example of an image 40 displayed by the display device 30 according to application example 1. The image 40 is an image generated with the user's viewpoint as a reference (for example, a three-dimensional CG (Computer Graphics) image). In the example shown in FIG. 21, the image 40 includes a plurality of ball-shaped objects 400. The object 400 is an example of the target region in the present disclosure.
Next, application example 2 of the present embodiment will be described with reference to FIGS. 23 to 25. FIG. 23 is a diagram showing another example of the image 40 displayed by the display device 30. In the example shown in FIG. 23, the image 40 includes a spear-shaped object 410. The object 410 is an example of the target region in the present disclosure.
Next, application example 3 of the present embodiment will be described with reference to FIGS. 26 and 27. FIG. 26 is a diagram showing still another example of the image 40 displayed by the display device 30. In the example shown in FIG. 26, the image 40 includes a monster object 420. The object 420 is an example of the target region in the present disclosure.
Next, the hardware configuration of the server 10 according to the present embodiment will be described with reference to FIG. 28. As shown in FIG. 28, the server 10 includes a CPU 150, a ROM (Read Only Memory) 152, a RAM 154, a bus 156, an interface 158, a storage device 160, and a communication device 162.
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field to which the present disclosure belongs can conceive various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
For example, the target perceived position may be located outside the range in which all the tactile stimulation units 200 can present tactile stimuli. Further, when the distance between the arrangement positions of the tactile stimulation units 200 is very large, it may be difficult to make the user perceive a tactile stimulus at the target perceived intensity in the range between the plurality of tactile stimulation units 200.
As another modified example, the server 10 (output control unit 106) may control the plurality of tactile stimulation units 200 arranged in a vertical (up-down) array on the user's body so as to output tactile stimuli sequentially from top to bottom or from bottom to top. Thereby, a sense of gravity, a sense of rising, or a sense of descending can be presented to the user. For example, the user can be expected to obtain a sensation as if riding in a moving elevator.
As another modified example, the server 10 may control the output of tactile stimulation to the plurality of tactile stimulation units 200 so as to present the user with a sensation that the body (for example, the skin) is being pulled.
As another modified example, the server 10 may control the output of tactile stimulation to the one or more tactile stimulation units 200 and/or to the plurality of tactile stimulation units 200 included in the jacket 20 in response to a user operation (such as shaking or turning) on a predetermined housing having the one or more tactile stimulation units 200. This can further improve the resolution of spatial and mid-air tactile stimulation that also makes use of other sensory modalities such as somatic sensation.
In the above description, an example of presenting movement of the perceived position by using the phantom sensation has been described, but the present disclosure is not limited to this example. For example, the server 10 can also control the output of tactile stimulation to the plurality of tactile stimulation units 200 so as to present the user with an illusion that the perceived position is localized between the plurality of tactile stimulation units 200.
As another modified example, the amplitude of the output intensity waveform of the tactile stimulation unit 200 (determined based on the waveform of the target perceived intensity) may be adjusted so that the maximum value of the output intensity (amplitude) is equal to or less than a predetermined value. For example, the amplitude of the output intensity waveform may be adjusted so that, even if the output intensity waveform is multiplied by the output intensity adjustment function (described above), the maximum amplitude does not exceed the predetermined value. The amplitude of the output intensity waveform may be adjusted in advance or in real time. Alternatively, the parameter values of the output intensity adjustment function (described above) may be adjusted so that the maximum amplitude is equal to or less than the predetermined value.
In general, the perceived intensity of a tactile stimulus may also differ depending on the vibration frequency. For example, a human feels vibration most strongly when the vibration frequency is about 200 Hz, and may feel vibration more weakly as the frequency becomes higher than 200 Hz. Therefore, the output control unit 106 may change the frequency of the vibration generated by the plurality of tactile stimulation units 200 (instead of changing the amplitude) based on the target perceived position and the target perceived intensity. Thereby, a tactile stimulus of the target perceived intensity can be presented to the user at the target perceived position (as in the case of changing the amplitude).
In the above-described embodiment, the description has focused on the example in which the tactile stimulation unit 200 generates vibration as the tactile stimulus, but the tactile stimulation unit 200 is not limited to this example and may output temperature, force-sense information, electrical stimulation, or the like as the tactile stimulus. For example, the server 10 can present a target temperature at a position between the plurality of tactile stimulation units 200 by individually adjusting the temperatures of the plurality of tactile stimulation units 200 arranged apart from each other on the user's body. The server 10 can also present a target perceived intensity at a position between the plurality of tactile stimulation units 200 by individually adjusting the strength of the force-sense information output by the plurality of tactile stimulation units 200. Similarly, the server 10 can present a target perceived intensity at a position between the plurality of tactile stimulation units 200 by individually adjusting the strength of the electrical stimulation output by the plurality of tactile stimulation units 200.
In the above-described embodiment, the description has focused on the example in which the position information in the present disclosure is position information on the user's body, but the position information is not limited to this example and may be position information in a space. For example, the position information may be position information in real space. In this case, a target perceived position (or a region of target perceived positions, a path of the target perceived position, or the like) can be set in real space. Then, when it is measured that, for example, the user wearing the jacket 20 has moved to the location corresponding to the target perceived position, the server 10 may cause the plurality of tactile stimulation units 200 to generate vibrations or may change the output intensity of the vibration of each of the plurality of tactile stimulation units 200.
In the above-described embodiment, the example in which the information processing apparatus in the present disclosure is the server 10 has been described, but the information processing apparatus is not limited to this example. For example, the information processing apparatus may be a PC (Personal Computer), a tablet terminal, a mobile phone such as a smartphone, a game machine, a portable music player, a wearable apparatus such as an HMD or a wristwatch-type device, or the jacket 20.
Each step in the operation of the above-described embodiment does not necessarily have to be processed in the described order. For example, the steps may be processed in an appropriately changed order. The steps may also be processed partly in parallel or individually instead of in time series. Further, some of the described steps may be omitted, or another step may be added.
(1)
An information processing apparatus including:
an output control unit configured to control output of tactile stimulation to at least two or more tactile stimulation units,
wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
(2)
The information processing apparatus according to (1), wherein the output control unit changes the output intensity of the tactile stimulation unit corresponding to the predetermined position information in accordance with the predetermined position information and the information regarding the tactile output related to the position information.
(3)
The information processing apparatus according to (1) or (2), wherein the predetermined position information is position information on a user's body.
(4)
The information processing apparatus according to (3), wherein the predetermined position information is changed according to an image displayed in association with the output of the tactile stimulation by the tactile stimulation unit.
(5)
The information processing apparatus according to (4), wherein the predetermined position information and the information regarding the tactile output related to the position information are changed according to the position of a target region in the image.
(6)
The information processing apparatus according to (5), wherein, in response to movement of the target region in the image, the predetermined position information is moved in a direction corresponding to the movement direction of the target region.
(7)
The information processing apparatus according to any one of (3) to (6), wherein the predetermined position information is changed according to sound information output in association with the output of the tactile stimulation by the tactile stimulation unit.
(8)
The information processing apparatus according to (7), wherein the predetermined position information is changed according to the type or volume of the sound output to the user.
(9)
The information processing apparatus according to any one of (3) to (8), wherein the predetermined position information is position information that is moved along a target path determined for the body.
(10)
The information processing apparatus according to (9), wherein the predetermined position information is moved along the target path as time elapses.
(11)
The information processing apparatus according to (10), wherein the at least two or more tactile stimulation units include a first tactile stimulation unit and a second tactile stimulation unit, and
the target path is a path connecting the contact position of the first tactile stimulation unit on the body and the contact position of the second tactile stimulation unit on the body.
(12)
The information processing apparatus according to any one of (9) to (11), wherein the target path is a path connecting a first position on a first surface of the body, the interior of the body, and a second position on a second surface facing the first surface.
(13)
The information processing apparatus according to (12), wherein the at least two or more tactile stimulation units include a first tactile stimulation unit and a second tactile stimulation unit,
the first position is the contact position of the first tactile stimulation unit on the first surface, and
the second position is the contact position of the second tactile stimulation unit on the second surface.
(14)
The information processing apparatus according to (12) or (13), wherein the first surface is the front of the user, and
the second surface is the back of the user.
(15)
The information processing apparatus according to any one of (3) to (14), wherein the predetermined position information is changed such that the distance between a third position on the body and the position indicated by the predetermined position information increases as time elapses.
(16)
The information processing apparatus according to (15), wherein the third position is the contact position, on the body, of any of the at least two or more tactile stimulation units.
(17)
The information processing apparatus according to any one of (3) to (16), wherein the at least two or more tactile stimulation units include a first tactile stimulation unit, and
the output control unit changes the output of the first tactile stimulation unit based on the distance between the contact position of the first tactile stimulation unit on the body and the position indicated by the predetermined position information.
(18)
The information processing apparatus according to (17), wherein the at least two or more tactile stimulation units further include a second tactile stimulation unit, and
the output control unit changes the output of the second tactile stimulation unit based on the distance between the contact position of the second tactile stimulation unit on the body and the position indicated by the predetermined position information.
(19)
An information processing method including:
controlling output of tactile stimulation to at least two or more tactile stimulation units; and
changing, by a processor, the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
(20)
A program for causing a computer to function as:
an output control unit configured to control output of tactile stimulation to at least two or more tactile stimulation units,
wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
20 jacket
30 display device
32 communication network
100 control unit
102 content reproduction unit
104 target position/intensity determination unit
106 output control unit
120 communication unit
122 storage unit
200 tactile stimulation unit
202 audio output unit
Claims (20)
- An information processing apparatus comprising an output control unit configured to control output of tactile stimulation to at least two or more tactile stimulation units, wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
- The information processing apparatus according to claim 1, wherein the output control unit changes the output intensity of the tactile stimulation unit corresponding to the predetermined position information in accordance with the predetermined position information and the information regarding the tactile output related to the position information.
- The information processing apparatus according to claim 1, wherein the predetermined position information is position information on a user's body.
- The information processing apparatus according to claim 3, wherein the predetermined position information is changed according to an image displayed in association with the output of the tactile stimulation by the tactile stimulation unit.
- The information processing apparatus according to claim 4, wherein the predetermined position information and the information regarding the tactile output related to the position information are changed according to the position of a target region in the image.
- The information processing apparatus according to claim 5, wherein, in response to movement of the target region in the image, the predetermined position information is moved in a direction corresponding to the movement direction of the target region.
- The information processing apparatus according to claim 3, wherein the predetermined position information is changed according to sound information output in association with the output of the tactile stimulation by the tactile stimulation unit.
- The information processing apparatus according to claim 7, wherein the predetermined position information is changed according to the type or volume of the sound output to the user.
- The information processing apparatus according to claim 3, wherein the predetermined position information is position information that is moved along a target path determined for the body.
- The information processing apparatus according to claim 9, wherein the predetermined position information is moved along the target path as time elapses.
- The information processing apparatus according to claim 10, wherein the at least two or more tactile stimulation units include a first tactile stimulation unit and a second tactile stimulation unit, and the target path is a path connecting the contact position of the first tactile stimulation unit on the body and the contact position of the second tactile stimulation unit on the body.
- The information processing apparatus according to claim 9, wherein the target path is a path connecting a first position on a first surface of the body, the interior of the body, and a second position on a second surface facing the first surface.
- The information processing apparatus according to claim 12, wherein the at least two or more tactile stimulation units include a first tactile stimulation unit and a second tactile stimulation unit, the first position is the contact position of the first tactile stimulation unit on the first surface, and the second position is the contact position of the second tactile stimulation unit on the second surface.
- The information processing apparatus according to claim 12, wherein the first surface is the front of the user and the second surface is the back of the user.
- The information processing apparatus according to claim 3, wherein the predetermined position information is changed such that the distance between a third position on the body and the position indicated by the predetermined position information increases as time elapses.
- The information processing apparatus according to claim 15, wherein the third position is the contact position, on the body, of any of the at least two or more tactile stimulation units.
- The information processing apparatus according to claim 3, wherein the at least two or more tactile stimulation units include a first tactile stimulation unit, and the output control unit changes the output of the first tactile stimulation unit based on the distance between the contact position of the first tactile stimulation unit on the body and the position indicated by the predetermined position information.
- The information processing apparatus according to claim 17, wherein the at least two or more tactile stimulation units further include a second tactile stimulation unit, and the output control unit changes the output of the second tactile stimulation unit based on the distance between the contact position of the second tactile stimulation unit on the body and the position indicated by the predetermined position information.
- An information processing method including: controlling output of tactile stimulation to at least two or more tactile stimulation units; and changing, by a processor, the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
- A program for causing a computer to function as an output control unit configured to control output of tactile stimulation to at least two or more tactile stimulation units, wherein the output control unit changes the output of the tactile stimulation unit corresponding to predetermined position information in accordance with the predetermined position information and information regarding a tactile output related to the position information.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018525939A JP6922908B2 (ja) | 2016-07-07 | 2017-04-06 | Information processing device, information processing method, and program
US16/302,187 US10976821B2 (en) | 2016-07-07 | 2017-04-06 | Information processing device, information processing method, and program for controlling output of a tactile stimulus to a plurality of tactile stimulus units |
EP17823825.9A EP3483701B1 (en) | 2016-07-07 | 2017-04-06 | Information processing device, information processing method, and program |
KR1020187037422A KR102427212B1 (ko) | 2016-07-07 | 2017-04-06 | Information processing apparatus, information processing method, and program
CN201780041103.4A CN109416584B (zh) | 2016-07-07 | 2017-04-06 | Information processing device, information processing method, and program
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016134717 | 2016-07-07 | ||
JP2016-134717 | 2016-07-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018008217A1 true WO2018008217A1 (ja) | 2018-01-11 |
Family
ID=60912056
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/014379 WO2018008217A1 (ja) | 2016-07-07 | 2017-04-06 | Information processing device, information processing method, and program
Country Status (6)
Country | Link |
---|---|
US (1) | US10976821B2 (ja) |
EP (1) | EP3483701B1 (ja) |
JP (1) | JP6922908B2 (ja) |
KR (1) | KR102427212B1 (ja) |
CN (1) | CN109416584B (ja) |
WO (1) | WO2018008217A1 (ja) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019163260A1 (ja) * | 2018-02-20 | 2019-08-29 | Sony Corporation | Information processing device, information processing method, and program
WO2019187527A1 (ja) * | 2018-03-27 | 2019-10-03 | Sony Corporation | Information processing device, information processing method, and program
WO2019244716A1 (ja) * | 2018-06-19 | 2019-12-26 | Sony Corporation | Information processing device, information processing method, and program
WO2020008856A1 (ja) * | 2018-07-02 | 2020-01-09 | Sony Corporation | Information processing device, information processing method, and recording medium
WO2020054415A1 (ja) * | 2018-09-11 | 2020-03-19 | Sony Corporation | Information processing device, information processing method, and recording medium
WO2020100583A1 (ja) * | 2018-11-14 | 2020-05-22 | Sony Corporation | Information processing device, information processing method, and storage medium
WO2020162210A1 (ja) * | 2019-02-04 | 2020-08-13 | Sony Corporation | Information processing device, information processing method, and program
JPWO2020008716A1 (ja) * | 2018-07-03 | 2021-08-02 | Sony Group Corporation | Encoding device, encoding method, decoding device, decoding method, transmission system, reception device, and program
CN113874815A (zh) * | 2019-05-28 | 2021-12-31 | Sony Group Corporation | Information processing device, information processing method, and program
DE112021000532T5 (de) | 2020-01-16 | 2022-12-01 | Sony Group Corporation | Informationsverarbeitungsvorrichtung, informationsverarbeitungsendgerät und programm |
DE112021000541T5 (de) | 2020-01-16 | 2022-12-01 | Sony Group Corporation | Informationsverarbeitungsvorrichtung und informationsverarbeitungsendgerät |
DE112021000549T5 (de) | 2020-01-16 | 2022-12-01 | Sony Group Corporation | Informationsverarbeitungsvorrichtung, informationsverarbeitungsendgerät und programm |
DE112021000578T5 (de) | 2020-01-17 | 2022-12-01 | Sony Group Corporation | Informationsverarbeitungsvorrichtung und informationsverarbeitungsendgerät |
DE112021006311T5 (de) | 2020-12-04 | 2023-10-12 | Sony Group Corporation | Informationsverarbeitungseinrichtung, informationsverarbeitungsverfahren, programm und informationsverarbeitungssystem |
DE112021006306T5 (de) | 2020-12-04 | 2023-11-16 | Sony Group Corporation | Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, programm und informationsverarbeitungssystem |
DE112021006304T5 (de) | 2020-12-04 | 2023-11-16 | Sony Group Corporation | Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, programm und informationsverarbeitungssystem |
US11922797B2 (en) | 2018-06-28 | 2024-03-05 | Sony Corporation | Encoding device, encoding method, decoding device, decoding method, and program |
US12019804B2 (en) | 2018-11-14 | 2024-06-25 | Sony Group Corporation | Information processing system, tactile presentation apparatus, tactile presentation method, and storage medium |
WO2024135778A1 (ja) * | 2022-12-21 | 2024-06-27 | JVCKenwood Corporation | Biofeedback detection device, biological sensation extension device, biological sensation extension method, and program
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102639118B1 (ko) * | 2015-09-08 | 2024-02-22 | Sony Group Corporation | Information processing apparatus, method, and computer program
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
WO2019013056A1 (ja) * | 2017-07-10 | 2019-01-17 | Sony Corporation | Information processing device, information processing method, and program
EP3654665B1 (en) | 2017-07-10 | 2024-09-18 | Sony Group Corporation | Information processing device, information processing method and program |
US11259121B2 (en) | 2017-07-21 | 2022-02-22 | Cirrus Logic, Inc. | Surface speaker |
US20200384358A1 (en) * | 2017-12-19 | 2020-12-10 | Sony Corporation | Information processing apparatus, information processing method, and program |
US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US11269415B2 (en) * | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
US10726683B1 (en) | 2019-03-29 | 2020-07-28 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using a two-tone stimulus |
US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
US12035445B2 (en) | 2019-03-29 | 2024-07-09 | Cirrus Logic Inc. | Resonant tracking of an electromagnetic load |
US10828672B2 (en) | 2019-03-29 | 2020-11-10 | Cirrus Logic, Inc. | Driver circuitry |
US10992297B2 (en) | 2019-03-29 | 2021-04-27 | Cirrus Logic, Inc. | Device comprising force sensors |
US11283337B2 (en) | 2019-03-29 | 2022-03-22 | Cirrus Logic, Inc. | Methods and systems for improving transducer dynamics |
US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
CN114008569A (zh) | 2019-06-21 | 2022-02-01 | Cirrus Logic International Semiconductor Ltd. | Method and apparatus for configuring multiple virtual buttons on a device
US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007048268A (ja) * | 2005-07-15 | 2007-02-22 | Sca:Kk | Tactile information transmission device using mechanical vibration of a shape memory alloy as the information transmission means
JP2009070263A (ja) * | 2007-09-14 | 2009-04-02 | Japan Science & Technology Agency | Penetrating tactile sensation presentation device
JP2015166890A (ja) | 2014-03-03 | 2015-09-24 | Sony Corporation | Information processing device, information processing system, information processing method, and program
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001005601A (ja) * | 1999-06-17 | 2001-01-12 | Sony Corp | Electrical stimulation device, force/tactile-sense presentation device using electrical stimulation, and control method therefor
JP4023214B2 (ja) * | 2002-05-20 | 2007-12-19 | Seiko Epson Corporation | Tactile/force-sense presentation device and tactile/force-sense presentation system
US20040095311A1 (en) * | 2002-11-19 | 2004-05-20 | Motorola, Inc. | Body-centric virtual interactive apparatus and method |
KR20080048837A (ko) * | 2006-11-29 | 2008-06-03 | 삼성전자주식회사 | 촉각 피드백을 출력하는 장치 및 방법 |
US20110080273A1 (en) * | 2008-06-13 | 2011-04-07 | Waseda University | Sensation Presenting System and Sensation Presenting Device |
US8540571B2 (en) * | 2010-03-31 | 2013-09-24 | Immersion Corporation | System and method for providing haptic stimulus based on position |
US20110267294A1 (en) * | 2010-04-29 | 2011-11-03 | Nokia Corporation | Apparatus and method for providing tactile feedback for user |
TWI492096B (zh) * | 2010-10-29 | 2015-07-11 | Au Optronics Corp | 立體影像互動系統及其位置偏移補償方法 |
JP5920862B2 (ja) * | 2011-03-08 | 2016-05-18 | Sony Interactive Entertainment Inc. | Information processing device, information processing method, computer program, and information processing system
WO2013089490A1 (ko) * | 2011-12-15 | 2013-06-20 | Hanyang University Industry-University Cooperation Foundation | Apparatus and method for providing tactile sensation in conjunction with a display device
US9046925B2 (en) * | 2012-09-11 | 2015-06-02 | Dell Products L.P. | Method for using the GPU to create haptic friction maps |
US9330544B2 (en) * | 2012-11-20 | 2016-05-03 | Immersion Corporation | System and method for simulated physical interactions with haptic effects |
US9164587B2 (en) * | 2013-11-14 | 2015-10-20 | Immersion Corporation | Haptic spatialization system |
US9671826B2 (en) * | 2013-11-27 | 2017-06-06 | Immersion Corporation | Method and apparatus of body-mediated digital content transfer and haptic feedback |
WO2016088246A1 (ja) * | 2014-12-05 | 2016-06-09 | Fujitsu Limited | Tactile sensation providing system and tactile sensation providing device |
KR102188157B1 (ko) * | 2015-12-11 | 2020-12-07 | Kolon Industries, Inc. | Tactile stimulus providing apparatus and driving method therefor |
US10198086B2 (en) * | 2016-10-27 | 2019-02-05 | Fluidity Technologies, Inc. | Dynamically balanced, multi-degrees-of-freedom hand controller |
2017
- 2017-04-06 US US16/302,187 patent/US10976821B2/en active Active
- 2017-04-06 CN CN201780041103.4A patent/CN109416584B/zh active Active
- 2017-04-06 KR KR1020187037422A patent/KR102427212B1/ko active IP Right Grant
- 2017-04-06 EP EP17823825.9A patent/EP3483701B1/en active Active
- 2017-04-06 JP JP2018525939A patent/JP6922908B2/ja active Active
- 2017-04-06 WO PCT/JP2017/014379 patent/WO2018008217A1/ja unknown
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019163260A1 (ja) * | 2018-02-20 | 2019-08-29 | Sony Corporation | Information processing device, information processing method, and program |
US11334226B2 (en) | 2018-02-20 | 2022-05-17 | Sony Corporation | Information processing device, information processing method, and program |
EP3757721A4 (en) * | 2018-02-20 | 2021-04-21 | Sony Corporation | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM |
WO2019187527A1 (ja) * | 2018-03-27 | 2019-10-03 | Sony Corporation | Information processing device, information processing method, and program |
JP7238886B2 (ja) | 2018-03-27 | 2023-03-14 | Sony Group Corporation | Information processing device, information processing method, and program |
JPWO2019187527A1 (ja) * | 2018-03-27 | 2021-04-22 | Sony Corporation | Information processing device, information processing method, and program |
WO2019244716A1 (ja) * | 2018-06-19 | 2019-12-26 | Sony Corporation | Information processing device, information processing method, and program |
US11709550B2 (en) | 2018-06-19 | 2023-07-25 | Sony Corporation | Information processing apparatus, method for processing information, and program |
JPWO2019244716A1 (ja) * | 2018-06-19 | 2021-06-24 | Sony Group Corporation | Information processing device, information processing method, and program |
US11922797B2 (en) | 2018-06-28 | 2024-03-05 | Sony Corporation | Encoding device, encoding method, decoding device, decoding method, and program |
WO2020008856A1 (ja) * | 2018-07-02 | 2020-01-09 | Sony Corporation | Information processing device, information processing method, and recording medium |
US11276282B2 (en) | 2018-07-02 | 2022-03-15 | Sony Corporation | Information processing apparatus, information processing method, and recording medium |
US11823557B2 (en) | 2018-07-03 | 2023-11-21 | Sony Corporation | Encoding apparatus, encoding method, decoding apparatus, decoding method, transmission system, receiving apparatus, and program |
JPWO2020008716A1 (ja) * | 2018-07-03 | 2021-08-02 | Sony Group Corporation | Encoding device, encoding method, decoding device, decoding method, transmission system, receiving device, and program |
JP7351299B2 (ja) | 2018-07-03 | 2023-09-27 | Sony Group Corporation | Encoding device, encoding method, decoding device, decoding method, and program |
JPWO2020054415A1 (ja) * | 2018-09-11 | 2021-09-09 | Sony Group Corporation | Information processing device, information processing method, and recording medium |
JP7396288B2 (ja) | 2018-09-11 | 2023-12-12 | Sony Group Corporation | Information processing device, information processing method, and recording medium |
CN112673334A (zh) * | 2018-09-11 | 2021-04-16 | Sony Corporation | Information processing device, information processing method, and recording medium |
US12053101B2 (en) | 2018-09-11 | 2024-08-06 | Sony Corporation | Information processing device, information processing method, and recording medium |
CN112673334B (zh) * | 2018-09-11 | 2024-06-25 | Sony Corporation | Information processing device, information processing method, and recording medium |
WO2020054415A1 (ja) * | 2018-09-11 | 2020-03-19 | Sony Corporation | Information processing device, information processing method, and recording medium |
CN112969984A (zh) * | 2018-11-14 | 2021-06-15 | Sony Group Corporation | Information processing device, information processing method, and storage medium |
US12019804B2 (en) | 2018-11-14 | 2024-06-25 | Sony Group Corporation | Information processing system, tactile presentation apparatus, tactile presentation method, and storage medium |
WO2020100583A1 (ja) * | 2018-11-14 | 2020-05-22 | Sony Corporation | Information processing device, information processing method, and storage medium |
JPWO2020162210A1 (ja) * | 2019-02-04 | 2021-12-02 | Sony Group Corporation | Information processing device, information processing method, and program |
JP7512901B2 (ja) | 2019-02-04 | 2024-07-09 | Sony Group Corporation | Information processing device, information processing method, and program |
US12102909B2 (en) | 2019-02-04 | 2024-10-01 | Sony Group Corporation | Information processing apparatus and information processing method |
WO2020162210A1 (ja) * | 2019-02-04 | 2020-08-13 | Sony Corporation | Information processing device, information processing method, and program |
CN113874815A (zh) * | 2019-05-28 | 2021-12-31 | Sony Group Corporation | Information processing device, information processing method, and program |
DE112021000532T5 (de) | 2020-01-16 | 2022-12-01 | Sony Group Corporation | Information processing device, information processing terminal, and program |
US11983324B2 (en) | 2020-01-16 | 2024-05-14 | Sony Group Corporation | Information processing device, information processing terminal, and program |
DE112021000549T5 (de) | 2020-01-16 | 2022-12-01 | Sony Group Corporation | Information processing device, information processing terminal, and program |
DE112021000541T5 (de) | 2020-01-16 | 2022-12-01 | Sony Group Corporation | Information processing device and information processing terminal |
US11941177B2 (en) | 2020-01-17 | 2024-03-26 | Sony Group Corporation | Information processing device and information processing terminal |
DE112021000578T5 (de) | 2020-01-17 | 2022-12-01 | Sony Group Corporation | Information processing device and information processing terminal |
DE112021006304T5 (de) | 2020-12-04 | 2023-11-16 | Sony Group Corporation | Information processing device, information processing method, program, and information processing system |
DE112021006306T5 (de) | 2020-12-04 | 2023-11-16 | Sony Group Corporation | Information processing device, information processing method, program, and information processing system |
DE112021006311T5 (de) | 2020-12-04 | 2023-10-12 | Sony Group Corporation | Information processing device, information processing method, program, and information processing system |
WO2024135778A1 (ja) * | 2022-12-21 | 2024-06-27 | JVCKenwood Corporation | Biofeedback detection device, biological sensation augmentation device, biological sensation augmentation method, and program |
Also Published As
Publication number | Publication date |
---|---|
EP3483701B1 (en) | 2023-11-01 |
EP3483701A1 (en) | 2019-05-15 |
US10976821B2 (en) | 2021-04-13 |
JP6922908B2 (ja) | 2021-08-18 |
KR20190025844A (ko) | 2019-03-12 |
CN109416584B (zh) | 2022-08-02 |
CN109416584A (zh) | 2019-03-01 |
JPWO2018008217A1 (ja) | 2019-04-18 |
US20190196596A1 (en) | 2019-06-27 |
EP3483701A4 (en) | 2019-07-03 |
KR102427212B1 (ko) | 2022-07-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018008217A1 (ja) | Information processing device, information processing method, and program | |
US10792569B2 (en) | Motion sickness monitoring and application of supplemental sound to counteract sickness | |
JP6994556B2 (ja) | Wireless head-mounted display using different rendering and sound localization | |
US10324531B2 (en) | Haptic feedback using a field of view | |
WO2020138107A1 (ja) | Video distribution system, video distribution method, and video distribution program for live-distributing a video containing animation of a character object generated based on the movements of a distributing user | |
JP2020004395A (ja) | Real-world haptic interactions for a virtual reality user | |
US20150325027A1 (en) | Method and system for reducing motion sickness in virtual reality ride systems | |
JP6908053B2 (ja) | Information processing device, information processing method, and program | |
JP2015231443A (ja) | Program and game system | |
JP2017500989A (ja) | Variable audio parameter setting | |
TW201610757A (zh) | User reality experience generation system with isolation from external stimuli | |
WO2014013483A1 (en) | System and method for social dancing | |
WO2019163260A1 (ja) | Information processing device, information processing method, and program | |
US20200201437A1 (en) | Haptically-enabled media | |
JP2017182130A (ja) | Information processing device, information processing method, and program | |
WO2019244716A1 (ja) | Information processing device, information processing method, and program | |
CN113906368A (zh) | Modifying audio based on physiological observations | |
JP2021189674A (ja) | Computer program, server device, terminal device, and method | |
KR101992618B1 (ko) | Method for providing MCI concentration-enhancement content using a memorization technique based on preferred environmental conditions | |
Kapralos et al. | Serious games: Customizing the audio-visual interface | |
JP2024011187A (ja) | Information processing device | |
JP6127792B2 (ja) | Sound reproduction device and sound reproduction program | |
WO2018234318A1 (en) | REDUCING VIRTUAL DISEASE IN VIRTUAL REALITY APPLICATIONS | |
Harvey et al. | The Effect of Discretised and Fully Converged Spatialised Sound on Directional Attention and Distraction. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018525939 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17823825 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20187037422 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017823825 Country of ref document: EP Effective date: 20190207 |