JP2017063916A - Apparatus, method and program for determining force sense to be presented - Google Patents

Apparatus, method and program for determining force sense to be presented Download PDF

Info

Publication number
JP2017063916A
JP2017063916A (application JP2015190727A)
Authority
JP
Japan
Prior art keywords
force sense
content
state
device
presentation device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2015190727A
Other languages
Japanese (ja)
Other versions
JP6509698B2 (en)
Inventor
慎也 高椋
Shinya Takamuku
智浩 雨宮
Tomohiro Amamiya
裕章 五味
Hiroaki Gomi
Original Assignee
日本電信電話株式会社
Nippon Telegr & Teleph Corp <Ntt>
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社 (Nippon Telegr & Teleph Corp <Ntt>)
Priority to JP2015190727A priority Critical patent/JP6509698B2/en
Publication of JP2017063916A publication Critical patent/JP2017063916A/en
Application granted
Publication of JP6509698B2 publication Critical patent/JP6509698B2/en
Application status: Active


Abstract

To provide a game system that presents a force sense corresponding to output content and allows the user to have a realistic virtual experience.
To let the user experience a more realistic fish-pulling sensation in a fishing game, the force sense (vibration magnitude) to be presented by a pseudo force sense presentation device is determined according to the state of the output content (e.g., a fish) and the relative positional relationship between the pseudo force sense presentation device and the apparatus.
[Selection] Figure 1

Description

  The present invention relates to a technique for causing a user to perceive a pseudo force sensation, and more particularly to a technique for determining a force sensation to be perceived.

  Various tactile feedback technologies have been used in home and arcade game machines. However, tactile feedback remains a difficult problem in the field of mobile games, which have become popular in recent years.

  For example, for a fishing game on a mobile terminal device, a technique has been proposed that presents a fish-pulling sensation using a built-in vibrator (see, for example, Patent Document 1). However, a normal vibrator can present only a vibration sensation, and the vibration is generated at the hand. Such a technique therefore gives an odd impression, as if the fish were in the hand rather than at the tip of the fishing line. Moreover, presenting the sensation of a fish pulling the line (a traction sensation) by conventional means requires a device that applies an actual traction force to the user. Such a device cannot be used unless it is fixed somewhere (otherwise the device itself moves when it pulls), so it is difficult to mount on a mobile terminal device.

  As an effective approach to this problem, the use of a pseudo force sense presentation device (see, for example, Non-Patent Document 1) can be considered. This device presents a sensation (force sense) as if a directional external force were applied to a body part, by applying asymmetric vibration to the user's body. Since it can be realized with a small vibrator, it is promising for force sense presentation on mobile terminal devices.

JP 2001-352414 A

Tomohiro Amemiya, Shinya Takamuku, Sho Ito, Hiroaki Gomi, "Buru-Navi3," a device that creates a sensation of being pulled by the fingers, NTT Technical Journal, September 2014, Vol. 26, No. 9, pp. 23-26.

  However, it is not easy to present a force sense that corresponds to the output content and thereby provide a realistic virtual experience. An object of the present invention is to present a force sense corresponding to the output content and to allow the user to have a realistic virtual experience.

  In the present invention, the force sense to be presented by the pseudo force sense presentation device is determined according to the state of the output content and the relative positional relationship between the pseudo force sense presentation device and the device.

  Thereby, it is possible to present a sense of force corresponding to the output content and to experience a realistic virtual experience.

FIG. 1 is a block diagram illustrating a system configuration of the embodiment. FIG. 2 is a diagram illustrating a use state of the system according to the embodiment. FIG. 3 is a diagram illustrating the “state” of the content according to the embodiment and “vibration”, “video”, and “sound” corresponding thereto. FIG. 4 is a diagram illustrating a force sense presented in the embodiment. FIG. 5 is a flowchart illustrating the state transition of the embodiment. FIG. 6 is a flowchart illustrating the state transition of the embodiment.

Embodiments of the present invention will be described below.
[Overview]
First, an outline of the present embodiment will be described.
In the present embodiment, the force sense to be presented by the pseudo force sense presentation device is determined according to the state of the content output from a device and the relative positional relationship between the device that outputs the content and the pseudo force sense presentation device. Thereby, a force sense corresponding to the output content can be presented from the pseudo force sense presentation device, and a realistic virtual experience can be provided. For example, in the "fishing game" described in Non-Patent Document 1, the position information of the controller can be obtained only while the controller is touching the touch panel, and the presented force sense is determined using only that two-dimensional position information. It is therefore impossible to present a force sense corresponding to the three-dimensional movement of the controller (including movement of the controller toward or away from the screen). In the "dog walking game" described in Non-Patent Document 1, the position information of the controller is obtained with a Web camera, but only the component parallel to the screen (two-dimensional position information) is used, so it is likewise impossible to present a force sense corresponding to the three-dimensional movement of the controller. In the present embodiment, since the force sense is determined according to the relative positional relationship between the content output device and the pseudo force sense presentation device, a force sense corresponding to the three-dimensional movement of the controller can be presented. Furthermore, in the "fishing game" and "dog walking game" of Non-Patent Document 1, the force sense is presented without considering the state of the displayed fish or dog itself (states other than position).
A realistic virtual experience therefore cannot be provided. In the present embodiment, the force sense is determined according to both the "content state" and the "relative positional relationship," so a realistic virtual experience can be provided. Further, in Non-Patent Document 1, a force sense is given continuously without considering the state of the content itself; pseudo force sense presentation devices have the specific problem that the efficiency of force sense presentation decreases when the same vibration is applied continuously. In the present embodiment, since the force sense is determined according to the "content state," the frequency of continuously applying the same vibration is reduced, and the decline in force-sense presentation efficiency can be suppressed. Furthermore, by using the relative positional relationship between the device that outputs the content and the pseudo force sense presentation device, the position estimation accuracy of the pseudo force sense presentation device can be improved compared with specifying its position relative to an external reference (such as an AR marker fixed in a room, as in Non-Patent Document 1), and a highly accurate virtual experience can be provided.

  Examples of “content” are video content such as moving images, acoustic content such as music, and combination content of video content and acoustic content. The video content may be 2D video content, 3D video content, or video content using human illusion. Examples of video content are fish animations and water ripples in a fishing game. The audio content may be monaural content, stereo content, or multi-channel content. An example of acoustic content is the sound of waves generated by the movement of fish.

  The "content state" means the "state" of the content. It may be a state that changes (transitions) with time, or a state corresponding to the type of content that does not change with time. The state of the video content includes "states" other than the position (coordinates on the display screen) within the "device" that outputs (displays) the content. In other words, the state of the video content includes elements that do not depend on the position within the "device": the state of the video content may differ even when the position within the "device" is the same, and may be the same even when the position differs. The state of the video content may or may not correspond to the position within the "device" that outputs the content. Specific examples of the state of the video content will be described later. The state of the acoustic content may or may not correspond to the state of the video content; it will also be described later.

  Examples of the “device” that outputs content are a mobile terminal device, a stationary information processing device, and the like. Examples of the mobile terminal device are a smart phone terminal device, a tablet terminal device, a notebook personal computer, a portable game machine, and the like. Examples of the stationary information processing apparatus are a desktop personal computer, a display or a television apparatus to which a computer such as a game machine or a stick personal computer is connected.

  The "pseudo force sense presentation device" is a device that presents a force sense by vibration. The force sense presented by the "pseudo force sense presentation device" is a "pseudo force sense": although the device is actually merely vibrating, the user perceives it as if a steady traction force were applied in one direction of the vibration. "Traction force" means a pulling force. The "pseudo force sense presentation device" includes a vibrator that performs asymmetric vibration (asymmetric force perturbation), and by controlling the waveform and amplitude of the force that the vibrator presents, a force sense of a desired strength in a desired direction can be perceived. For example, when this vibrator is controlled to perform a periodic force perturbation whose waveform applies a large force in direction A for a short time and a small force in the opposite direction B for a long time, the user holding the "pseudo force sense presentation device" perceives a pseudo force sense in direction A. This exploits human perceptual characteristics, and is a phenomenon arising from proprioceptive and tactile sensations involved in gripping (see, for example, Non-Patent Document 1 and Reference Document 1 "Patent No. 4551448"). That is, by controlling the pattern of asymmetric vibration of the vibrator (the asymmetry of the vibrator's force perturbation), the direction and strength (clarity) of the presented force sense can be controlled. For example, when an apparatus including the actuator disclosed in Non-Patent Document 1 is used as the "pseudo force sense presentation device," the direction and strength of the force sense can be controlled by controlling the cycle and reversal ratio of the current supplied to the actuator.
When the frequency of the periodic force perturbation is controlled to include a frequency component at or near 80 Hz, the user holding the "pseudo force sense presentation device" perceives a strong force sense. In the configuration of Non-Patent Document 1, supplying a current including a 40 Hz frequency component to the actuator realizes a periodic force perturbation including a frequency component at or near 80 Hz. Conversely, when the frequency of this periodic motion is controlled not to include a frequency component near 80 Hz, the user perceives only a weak force sense, or no force sense at all. That is, the perceived strength of the force sense can be controlled by changing the vibration period of the vibrator. It can also be controlled by the amplitude of the vibrator's force waveform (vibration force waveform): to present a strong force sense, drive the vibrator so that the force perturbation has a large amplitude; to present a weak force sense, drive it so that the amplitude is smaller. The "amplitude of the force waveform (vibration force waveform)" means the maximum absolute value of the force output from the vibrator. In the configuration of Non-Patent Document 1, the magnitude of the force sense can be controlled by controlling the amplitude of the current supplied to the actuator. In short, the "pseudo force sense presentation device" is a device that presents a force sense by vibration, and changes at least one of the vibration period, the asymmetry, and the amplitude of the force waveform in response to a change in the "content state" and/or the "relative positional relationship between the content output device and the pseudo force sense presentation device." Preferably, the "pseudo force sense presentation device" changes at least one of the vibration period, the asymmetry, and the amplitude of the force waveform in accordance with a change in the "content state."
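The asymmetric force perturbation described above (a brief strong force in direction A balanced by a long weak force in the opposite direction B) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the sample rate and parameter values are hypothetical.

```python
def asymmetric_force_waveform(period_s, asymmetry, amplitude, sample_rate=8000):
    """One cycle of an asymmetric force waveform: a brief strong force in
    direction A (positive), then a long weak force in the opposite
    direction B (negative). `asymmetry` is the fraction of the cycle spent
    in the strong phase. The net impulse per cycle is zero, yet the brief
    strong phase is perceived as a steady traction in direction A."""
    n = int(period_s * sample_rate)          # samples per cycle
    n_pull = max(1, int(n * asymmetry))      # short, strong phase
    n_push = n - n_pull                      # long, weak phase
    f_push = -amplitude * n_pull / n_push    # chosen so the cycle sums to zero
    return [amplitude] * n_pull + [f_push] * n_push

# Example: a 12.5 ms cycle (~80 Hz) with a 20% strong phase
w = asymmetric_force_waveform(period_s=0.0125, asymmetry=0.2, amplitude=1.0)
```

Lengthening `period_s` or shrinking `amplitude` correspondingly weakens the perceived force sense, mirroring the two control knobs described above.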

  The "relative positional relationship between the device that outputs the content and the pseudo force sense presentation device" may be the relative position and/or relative inclination of the "pseudo force sense presentation device" (for example, a predetermined position of the pseudo force sense presentation device) with reference to the "device" that outputs the "content" (for example, a predetermined position of the device), or the relative position and/or relative inclination of the "device" (for example, a predetermined position of the device) with reference to the "pseudo force sense presentation device." For example, the "relative positional relationship" may be the relative position and/or relative inclination of the "pseudo force sense presentation device" with reference to the "screen" of the "device" displaying the video content, or the relative position and/or relative inclination of the "screen" with reference to the "pseudo force sense presentation device." Preferably, the "relative position" and the "relative inclination" include a component in the "normal direction" of the "screen." That is, a change in the "relative positional relationship" may be a change in the "normal direction component" of the predetermined position and/or inclination of the "pseudo force sense presentation device" relative to a predetermined position of the "screen," or a change in the "normal direction component" of the predetermined position (coordinates) and/or orientation of the "screen" relative to a predetermined position of the "pseudo force sense presentation device." The "device" that outputs the "content" may have a sensor for obtaining the "relative positional relationship"; an example of such a sensor is a non-contact position sensor such as a Web camera or another image sensor. Other devices may also include sensors for obtaining the "relative positional relationship."
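As a sketch of how the "relative positional relationship" could be represented, the controller position measured in a common frame (for example, the camera frame) can be re-expressed in a screen-fixed frame whose third axis is the screen normal. The orthonormal basis vectors here are assumed inputs from a prior calibration, not something specified by the patent.

```python
def relative_position_in_screen_frame(p_controller, p_screen,
                                      screen_x, screen_y, screen_normal):
    """Express the controller position in the screen's coordinate frame.
    All inputs are 3-vectors in a common (e.g. camera) frame;
    screen_x / screen_y / screen_normal form the screen's orthonormal basis.
    The third component of the result is the normal-direction distance
    from the screen, the component the embodiment prefers to use."""
    d = [pc - ps for pc, ps in zip(p_controller, p_screen)]
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return (dot(d, screen_x), dot(d, screen_y), dot(d, screen_normal))

# Controller 0.3 m in front of a screen at the origin facing +z
rel = relative_position_in_screen_frame((0.1, -0.2, 0.3), (0.0, 0.0, 0.0),
                                        (1, 0, 0), (0, 1, 0), (0, 0, 1))
```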

  (A) The state of the "content" may change according to the relative positional relationship between the "device" that determines the force sense to be presented by the pseudo force sense presentation device and the "pseudo force sense presentation device" (or between a position designated by the orientation of the "pseudo force sense presentation device" and the output position of the "content"), and (B) time, that is, according to the "relative positional relationship" and "time." Thereby, "content" whose state changes according to the "three-dimensional space" corresponding to the "relative positional relationship" and to "time" is expressed, and a realistic virtual experience can be provided.

  Preferably, the "device" that determines the force sense presented by the pseudo force sense presentation device has an "output unit" that outputs the "content," and determines the force sense such that the larger the relative distance between (A) the "device" and the "pseudo force sense presentation device," or between (B) a position designated by the orientation of the "pseudo force sense presentation device" and the output position of the content, the greater the perceived traction force toward the "output unit." This makes possible a virtual experience of being pulled back with a force so strong that it feels like being torn away (for example, a fishing scene). Preferably, the "output unit" is a "screen" that displays video content, and the "device" determines the force sense such that the larger the normal-direction component of the "relative distance" with respect to the "screen," the greater the perceived traction force toward the "screen." Thereby, a force sense corresponding to the normal-direction component of the "screen" can be presented.
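The preferred rule above (the greater the screen-normal distance, the greater the perceived traction toward the screen) might be sketched as a simple clamped gain. The gain and clamp values are hypothetical, chosen only to illustrate the monotonic mapping.

```python
def traction_amplitude(rel_pos_screen_frame, gain=2.0, max_amp=1.0):
    """Force-sense amplitude pulling toward the screen: proportional to the
    normal-direction distance (third component of the screen-frame relative
    position), clamped to the vibrator's maximum drivable amplitude."""
    normal_dist = abs(rel_pos_screen_frame[2])
    return min(max_amp, gain * normal_dist)
```

Backing the controller away from the screen increases the amplitude until the vibrator's limit is reached, matching the "pulled back harder the farther you pull" fishing sensation.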

  The screen for displaying the video content may be vibrated in synchronization with the force sense presented from the “pseudo force sense presentation device”. Thereby, a more realistic virtual experience can be experienced.

  As a modification of the above, the force sense to be presented by the pseudo force sense presentation device may be determined according to the state of the output content and at least the relative positional relationship between a position designated by the orientation of the pseudo force sense presentation device and the output position of the content. The "position designated by the orientation of the pseudo force sense presentation device" may be a position outside the "pseudo force sense presentation device" or a position inside it, and may be determined by the orientation of the "pseudo force sense presentation device" and the "output position of the content." The "content output position" may be a position on the "device" that outputs the content (such as its screen), or a position outside the "device" (such as an external display screen). For example, the "pseudo force sense presentation device" may include a pointing device such as a laser pointer, and the position pointed to by the pointing device may serve as the "position designated by the orientation of the pseudo force sense presentation device." This configuration can obtain the same effects as described above.

[First Embodiment]
Next, a first embodiment will be described with reference to the drawings. In the present embodiment, an example in which the present invention is applied to a fishing game will be described. However, this does not limit the present invention.

<Configuration>
As illustrated in FIGS. 1 and 2, the system according to the present embodiment includes a controller 11, a driving device 12, and a terminal device 13. The controller 11 and the driving device 12 are wired for wired communication, and the driving device 12 and the terminal device 13 are configured for wireless communication. In the example of FIG. 2, the user 10 holds the controller 11 in the right hand, wears the wristwatch-type driving device 12 on the left arm, and holds the terminal device 13 in the left hand. However, this is merely an example, and the present invention is not limited to such usage. For example, in FIG. 2 the driving device 12 is worn on the hand opposite to the one holding the controller 11 for clarity of the drawing, but the driving device 12 may be worn on the hand holding the controller 11; in general this is desirable because the wiring and the like can be shortened.

<Controller 11>
The controller 11 includes a pseudo force sense presentation device 111 that presents a fish pulling sensation by asymmetric vibration. For example, the controller 11 is a device in which the pseudo force sense presentation device 111 is put in a case made of a material such as ABS, and the shape of the case can be regarded as a fishing rod. As described above, the pseudo force sense presentation device 111 presents a pseudo force sense by asymmetric vibration. Since the function of the pseudo force sense presentation device 111 can be realized by a small vibrator, force sense presentation can be realized in a form that does not greatly impair the portability of a device such as a mobile terminal device. For the pseudo force sense presentation device 111, for example, an apparatus disclosed in Non-Patent Document 1 can be used.

<Drive device 12>
The driving device 12 includes a driving signal receiving unit 122 and a driving signal generating unit 121. The drive signal reception unit 122 is a wireless communication device, and the drive signal generation unit 121 generates a drive signal for driving the pseudo force sense presentation device 111. The driving device 12 is, for example, configured by a general-purpose or dedicated computer, including a wireless communication device such as a Bluetooth (registered trademark) interface, a processor (hardware processor) such as a CPU (central processing unit), and memory such as random-access memory (RAM) and read-only memory (ROM), executing a predetermined program. The computer may include a single processor and memory or a plurality of processors and memories. The program may be installed in the computer, or may be recorded in ROM or the like in advance. In addition, some or all of the processing units may be configured using electronic circuitry that realizes the processing functions without a program, instead of circuitry, such as a CPU, that realizes the functional configuration by reading a program. Electronic circuitry constituting one device may include a plurality of CPUs.

<Configuration of Terminal Device 13>
The terminal device 13 includes a state recognition unit 131 (sensor), a state transition unit 132, a signal generation unit 133 (determination unit), a video display unit 134 (output unit), an acoustic presentation unit 135, and a vibration signal transmission unit 136. The terminal device 13 is, for example, a device configured by a computer as described above executing a predetermined program. Some or all of the processing units may be configured using an electronic circuit that realizes a processing function without using a program. Examples of the terminal device 13 are a mobile terminal device and a stationary information processing device.

<< Video display unit 134 >>
The video display unit 134 is a screen that displays a game video that is video content. For example, a display built in the mobile terminal device or a display of a stationary information processing apparatus can be used as the video display unit 134.

≪Sound presentation part 135≫
The acoustic presentation unit 135 is a speaker that reproduces game sound that is acoustic content. For example, a speaker built in the mobile terminal device or a speaker of a stationary information processing apparatus can be used as the sound presentation unit 135.

≪State recognition unit 131≫
The state recognition unit 131 is a non-contact position sensor that acquires the relative position (relative positional relationship) of the controller 11 with respect to the video display unit 134 (for example, its screen). For example, the function of the state recognition unit 131 can be realized by a CPU executing a predetermined program together with a camera whose position relative to the video display unit 134 is fixed (for example, a built-in camera of the mobile terminal device). Various algorithms can be used to recognize the position and orientation of the controller 11. For example, by tracking a colored part of the imaged controller 11 with a particle filter algorithm, the controller 11 can be tracked even when objects of colors that are difficult to distinguish from the controller are present. Further, to facilitate recognition of the controller 11, a marker or the like may be attached to its lower part (the end directed toward the video display unit 134).

<< State Transition Unit 132 >>
Based on the relative position of the controller 11 with respect to the video display unit 134 and on time, the state transition unit 132 changes the state of the fishing game (the position and posture of the fish, the waves on the water surface, whether a fish is caught on the fishhook, the set timers, random numbers, and so on). That is, the state transition unit 132 changes the state of the content according to the relative positional relationship between the video display unit 134 and the controller 11, and time.

<< Signal Generation Unit 133 >>
The signal generation unit 133 generates a video signal, an acoustic signal, and a vibration signal that respectively specify the game video, the game sound, and the vibration according to the state of the fishing game described above. Characteristically, the signal generation unit 133 determines the force sense to be presented by the pseudo force sense presentation device 111 according to the state of the content displayed on the video display unit 134 and the relative positional relationship between the controller 11 and the terminal device 13 (the relative positional relationship between the pseudo force sense presentation device 111 and the terminal device 13), and generates a vibration signal for causing the pseudo force sense presentation device 111 to present that force sense.

<< Vibration signal transmitter 136 >>
The vibration signal transmission unit 136 is a wireless communication device, and transmits a vibration signal to the driving device 12. For example, the Bluetooth (registered trademark) interface of the mobile terminal device can be used as the vibration signal transmission unit 136.

<Operation>
Next, the operation of this embodiment will be described. In the fishing game of this embodiment, for example as illustrated in FIG. 3, the state of the fish video content is handled as six states: a state where a fish not on the fishhook swims freely (FREE state), a state where a fish bites the fishhook (HIT state), a state where a fish holding the fishhook descends underwater (SINK state), a state where a fish holding the fishhook rises toward the water surface (RISE state), a state where a fish firmly on the fishhook is pulled toward the water surface (CATCH state), and a state where the fish has been caught (GOAL state). By dividing the state of the fish video content in this way and reproducing vibration (force sense), game video, and game sound according to each case, the series of states constituting the fishing experience is expressed. Each of these states changes according to the relative positional relationship between the terminal device 13 and the controller 11 (pseudo force sense presentation device 111), and time. That is, the state of the video content 134b is changed based on the relative positional relationship between the controller 11 and the fish video content 134b on the video display unit 134, information on its change over time, and the game state at that time; the game video is changed accordingly, and the interaction with the fish is expressed by changing the presented force sense. In other words, the force sense presented by the pseudo force sense presentation device 111, the game video displayed on the video display unit 134, and the game sound presented by the sound presentation unit 135 are controlled (determined) according to the state of the content output by the terminal device 13 and the relative positional relationship between the pseudo force sense presentation device 111 and the terminal device 13.
In the example of the present embodiment, in addition to the fish video content 134b, a cursor image 134a corresponding to the fishhook and a ripple image corresponding to the water surface are displayed as part of the game video. Thereby, the reality, intelligibility, entertainment value, artistry, and so on of the game can be improved. In particular, when the terminal device 13 instructs the controller 11 to present a force sense, the video display unit 134 may display ripple video content in synchronization with the instruction. In addition, by reproducing the acoustic content of the sound of waves generated by the fish from the sound presentation unit 135 disposed near the video display unit 134, the illusion that the fish actually exists there can be given.

  In addition, in the fishing game of the present embodiment, for example as shown in FIG. 4, video contents of a plurality of types of fish can be handled, and by changing the vibration pattern (at least one of the vibration period, asymmetry, amplitude of the force waveform, and so on) according to the type of fish, the difference in pulling sensation between different types of fish can be expressed. For example, the ayu, being small, generally pulls weakly; the carp pulls strongly with a fluttering vibration sensation; and the black bass expresses its monotonous pull through a reduced strength of the pulling sensation. Expressions may also be implemented in which a small creature other than a fish, such as an insect, produces only a small pull when hooked, or in which no vibration occurs when an object other than a fish, such as garbage or a yo-yo, is hooked.
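The species-dependent vibration patterns could be organized as a simple lookup table mapping each catch to its vibration parameters. The species names and parameter values below are invented for illustration and are not taken from the patent.

```python
# Hypothetical per-species vibration parameters: (period_s, asymmetry, amplitude).
# None means no vibration (non-fish objects such as garbage or a yo-yo).
FISH_VIBRATION = {
    "ayu":        {"period_s": 0.0125, "asymmetry": 0.2, "amplitude": 0.3},  # small fish: weak pull
    "carp":       {"period_s": 0.0125, "asymmetry": 0.3, "amplitude": 1.0},  # strong, fluttering pull
    "black_bass": {"period_s": 0.0125, "asymmetry": 0.2, "amplitude": 0.5},  # monotonous, moderate pull
    "insect":     {"period_s": 0.0125, "asymmetry": 0.2, "amplitude": 0.1},  # tiny pull when hooked
    "garbage":    None,
}

def vibration_params(species):
    """Return the vibration pattern for a hooked object, or None if it
    should not vibrate. Unknown objects default to no vibration."""
    return FISH_VIBRATION.get(species)
```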

  Next, the operation of the fishing game of this embodiment will be described with reference to FIGS. 5 and 6. The state recognition unit 131 continuously or intermittently acquires the relative positional relationship between the terminal device 13 (video display unit 134) and the controller 11 (pseudo force sense presentation device 111), and sends the acquired relative positional relationship to the state transition unit 132.

  The state transition unit 132 changes the state of the content based on the input relative positional relationship, the time, and the state of the content at each time (“FREE state”, “HIT state”, “SINK state”, “RISE state”, “CATCH state”, or “GOAL state”). The initial state is the “FREE state”. The state transition unit 132 outputs the content state at each time. The signal generation unit 133 generates a video signal, an acoustic signal, and a vibration signal that respectively specify the game video, game sound, and vibration according to the input state.
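The six-state transition logic described in this and the following paragraphs can be summarized in a minimal Python sketch. The condition names (`near_fish`, `in_range`, `timer_elapsed`, `switch`, `at_surface`) are hypothetical shorthands for the determinations made in steps S132b through S132s, with `switch` standing for the "random integer is a multiple of 30" check.

```python
from enum import Enum, auto

class ContentState(Enum):
    FREE = auto()
    HIT = auto()
    SINK = auto()
    RISE = auto()
    CATCH = auto()
    GOAL = auto()

def next_state(state, near_fish=False, in_range=True,
               timer_elapsed=False, switch=False, at_surface=False):
    """One polling step of the content-state transitions described in the text."""
    if state is ContentState.FREE:                     # S132a, S132b
        return ContentState.HIT if near_fish else ContentState.FREE
    if state is ContentState.HIT:                      # S132c-S132e
        if not in_range:
            return ContentState.FREE                   # fish escaped
        return ContentState.SINK if timer_elapsed else ContentState.HIT
    if state is ContentState.SINK:                     # S132f-S132i
        if not in_range:
            return ContentState.CATCH                  # pulled away at the right moment
        return ContentState.RISE if switch else ContentState.SINK
    if state is ContentState.RISE:                     # S132j-S132n
        if not in_range:
            return ContentState.FREE                   # fish escaped
        return ContentState.SINK if switch else ContentState.RISE
    if state is ContentState.CATCH:                    # S132p, S132q
        return ContentState.GOAL if at_surface else ContentState.CATCH
    # GOAL (S132r, S132s): reset once the controller leaves the range
    return ContentState.FREE if not in_range else ContentState.GOAL
```

This is only a summary of the control flow; the signal generation performed in each state is omitted.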

  In the “FREE state”, the signal generation unit 133 generates and outputs a video signal of “video in which fish swim freely” and a video signal of the “cursor video” corresponding to the fishhook. The display coordinates of this “cursor video” correspond to the relative positional relationship between the terminal device 13 (video display unit 134) and the controller 11 (pseudo force sense presentation device 111); if the relative positional relationship changes, the display coordinates of the “cursor video” also change. For example, the display coordinates of the “cursor video” may change according to only the displacement of the component of the relative positional relationship horizontal to the video display unit 134 (screen), or according to a change in at least one of the component horizontal to the video display unit 134 (screen) and the component normal to it. The video signal of the “video in which fish swim freely” and the video signal of the “cursor video” are input to the video display unit 134, which accordingly displays the video content 134b of the “video in which fish swim freely” and the video content 134a of the “cursor video”. In the “FREE state”, the pseudo force sense presentation device 111 does not present a force sense (step S132a).
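For instance, the mapping from the relative positional relationship to the cursor's display coordinates could be as simple as the following sketch, assuming only the screen-parallel displacement is used; the screen size, scale factor, and function name are illustrative assumptions.

```python
def cursor_coords(rel_pos, screen_w=320, screen_h=240, scale=100.0):
    """Map the controller's screen-parallel displacement rel_pos = (x, y)
    (e.g. in metres) to cursor display coordinates in pixels, centred on
    the screen and clamped to its edges."""
    x = min(max(screen_w / 2 + rel_pos[0] * scale, 0), screen_w - 1)
    y = min(max(screen_h / 2 + rel_pos[1] * scale, 0), screen_h - 1)
    return (x, y)
```

A variant using the screen-normal component as well could, for example, additionally scale the cursor size with distance.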

  The state transition unit 132 determines whether the display coordinates of the fish video content 134b and the display coordinates of the cursor video content 134a corresponding to the relative positional relationship are close to each other (whether the fish and the cursor have approached). For example, the state transition unit 132 determines whether the display coordinates of the cursor video content 134a are within a predetermined range of the display coordinates of the fish video content 134b (display coordinates at a predetermined position of the fish video content 134b). For example, this determination may be performed by defining the “predetermined range” as a certain area extending from the mouth of the fish in the video content 134b along a vector indicating the orientation of the displayed video content 134b. If the display coordinates of the cursor video content 134a are within the predetermined range of the display coordinates of the fish video content 134b, it is determined that the fish and the cursor have approached; otherwise, it is determined that they have not (step S132b). When it is determined that the fish and the cursor have not approached, the state transition unit 132 maintains the “FREE state” (step S132a). On the other hand, when it is determined that the fish and the cursor have approached, the state transition unit 132 transitions the content state to the “HIT state”.
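Such a proximity check might be sketched as follows, taking the "predetermined range" as a rectangular strip extending from the fish's mouth along its on-screen heading; the reach and half-width values, and the function name, are assumptions for illustration.

```python
def cursor_near_fish(cursor, mouth, heading, reach=40.0, radius=15.0):
    """Hit test: is `cursor` inside a strip of length `reach` and
    half-width `radius` extending from the fish's `mouth` along the
    unit vector `heading` (the fish's on-screen orientation)?"""
    dx, dy = cursor[0] - mouth[0], cursor[1] - mouth[1]
    along = dx * heading[0] + dy * heading[1]         # projection onto heading
    across = abs(-dx * heading[1] + dy * heading[0])  # perpendicular distance
    return 0.0 <= along <= reach and across <= radius
```

The result of this check is what drives the FREE → HIT transition in step S132b.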

  In the “HIT state”, the signal generation unit 133 generates and outputs a vibration signal for causing the pseudo force sense presentation device 111 to present a force sense that makes a traction force perceived at intervals of 50 ms, a video signal of “ripples generated by the fish moving near the fishhook”, and an acoustic signal of “wave sound generated by the movement of the fish”. For example, this vibration signal presents a “weak traction force” when the fish is an “ayu”, and a “medium-strength short traction force” when the fish is a “carp” or a “black bass” (FIG. 4). The vibration signal is sent to the vibration signal transmission unit 136 and transmitted. The vibration signal is received by the drive signal receiver 122 of the drive device 12 and sent to the drive signal generation unit 121. The drive signal generation unit 121 generates a drive signal for presenting a force sense according to the vibration signal and supplies the drive signal to the pseudo force sense presentation device 111. The pseudo force sense presentation device 111 presents a force sense corresponding to the supplied drive signal (a force sense corresponding to the force sense pattern specified by the drive signal). The video signal of “ripples generated by the fish moving near the fishhook” is input to the video display unit 134, which accordingly displays the video content 134b of “ripples moving together with the fish near the fishhook”. The acoustic signal of “wave sound generated by the movement of the fish” is input to the acoustic presentation unit 135, which accordingly outputs the “wave sound generated by the movement of the fish” (step S132c).

  The state transition unit 132 monitors the input relative positional relationship and determines whether the controller 11 is located within a predetermined range from the video display unit 134 (step S132d). The “predetermined range” here may be the same as the “predetermined range” in step S132b, or may be a wider or narrower range. When it is determined that the controller 11 has gone out of the predetermined range of the video display unit 134 (the controller 11 has moved away from the video display unit 134), the state transition unit 132 returns the processing to step S132a, thereby transitioning from the “HIT state” to the “FREE state”. This expresses that the fish has escaped. On the other hand, when the controller 11 is located within the predetermined range from the video display unit 134, the state transition unit 132 determines whether a certain time has elapsed since the transition to the “HIT state” (step S132e). When it is determined that the certain time has not elapsed, the state transition unit 132 returns the process to step S132d. On the other hand, when it is determined that the certain time has elapsed, the state transition unit 132 transitions the content state to the “SINK state”.

  In the “SINK state”, the signal generation unit 133 generates and outputs a vibration signal for presenting a traction force that pulls the controller 11 in the direction of the fish video content 134b, a video signal of “video in which the fish holding the fishhook is descending in the water”, and an acoustic signal of “wave sound generated by the movement of the fish”. For example, this vibration signal presents a “weak sustained traction force” when the fish is an “ayu”, and a “medium-strength sustained traction force and vibration” when the fish is a “carp” or a “black bass” (FIG. 4). The vibration signal is sent to the vibration signal transmission unit 136 and transmitted to the drive signal generation unit 121, and the pseudo force sense presentation device 111 presents a force sense according to the drive signal generated by the drive signal generation unit 121 from the vibration signal. The video signal of the “video in which the fish holding the fishhook is descending in the water” is input to the video display unit 134, which accordingly displays the corresponding video content 134b. The acoustic signal of “wave sound generated by the movement of the fish” is input to the acoustic presentation unit 135, which accordingly outputs the “wave sound generated by the movement of the fish” (step S132f).

  The state transition unit 132 monitors the input relative positional relationship and determines whether the controller 11 is located within a predetermined range from the video display unit 134 (step S132g). The “predetermined range” here may be the same as the “predetermined range” in steps S132b and S132d, or may be a wider or narrower range. When it is determined that the controller 11 has not gone out of the predetermined range of the video display unit 134, the state transition unit 132 generates a random integer (step S132h) and determines whether the random integer is a multiple of 30 (step S132i). If the random integer is not a multiple of 30, the state transition unit 132 returns the process to step S132f and maintains the “SINK state”. On the other hand, when the random integer is a multiple of 30, the state transition unit 132 transitions the content state to the “RISE state”.
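The "multiple of 30" check can be read as triggering the SINK → RISE switch with probability roughly 1/30 per polling step. A sketch, assuming the random integer is drawn uniformly from a range whose size is a multiple of 30 (the embodiment does not specify the range):

```python
import random

def should_switch(rng=random):
    """Return True when a freshly drawn random integer is a multiple of 30.
    Drawing uniformly from [0, 899] makes the switch probability exactly
    1/30 per polling step (the range itself is an illustrative assumption)."""
    return rng.randint(0, 899) % 30 == 0
```

The same draw serves both directions of the random SINK ↔ RISE alternation described in steps S132h through S132n.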

  In the “RISE state”, the signal generation unit 133 generates and outputs a video signal of “video in which the fish holding the fishhook is rising toward the water surface”. This video signal is input to the video display unit 134, which accordingly displays the video content 134b of the “video in which the fish holding the fishhook is rising toward the water surface”. In the “RISE state”, the pseudo force sense presentation device 111 does not present a force sense (step S132j).

  The state transition unit 132 monitors the input relative positional relationship and determines whether the controller 11 is located within a predetermined range from the video display unit 134 (step S132k). The “predetermined range” here may be the same as the “predetermined range” in steps S132b, S132d, and S132g, or may be a wider or narrower range. When it is determined that the controller 11 has not gone out of the predetermined range of the video display unit 134, the state transition unit 132 generates a random integer (step S132m) and determines whether the random integer is a multiple of 30 (step S132n). If the random integer is not a multiple of 30, the state transition unit 132 returns the process to step S132j and maintains the “RISE state”. On the other hand, when the random integer is a multiple of 30, the state transition unit 132 returns the process to step S132f, thereby transitioning the content state to the “SINK state”.

  Through steps S132f to S132n described above, the state of the fish video content 134b changes randomly between the “SINK state” and the “RISE state”. Here, when it is determined in step S132g in the “SINK state” that the controller 11 has moved out of the predetermined range of the video display unit 134, the state transition unit 132 advances the processing to step S132p, thereby transitioning to the “CATCH state”. On the other hand, when it is determined in step S132k in the “RISE state” that the controller 11 has gone out of the predetermined range of the video display unit 134, the state transition unit 132 returns the process to step S132a, thereby transitioning from the “RISE state” to the “FREE state”. This expresses that the fish has escaped.

  In the “CATCH state”, the signal generation unit 133 generates and outputs a vibration signal for presenting a traction force that pulls the controller 11 strongly in the direction of the fish video content 134b, a video signal of “video in which the fish holding the fishhook is rising toward the water surface”, and an acoustic signal of “wave sound generated by the movement of the fish”. For example, this vibration signal presents a “medium-strength sustained traction force” when the fish is an “ayu”, a “strong sustained traction force and vibration” when the fish is a “carp”, and a “medium-strength sustained traction force and vibration” when the fish is a “black bass” (FIG. 4). The vibration signal is sent to the vibration signal transmission unit 136 and transmitted to the drive signal generation unit 121, and the pseudo force sense presentation device 111 presents a force sense according to the drive signal generated by the drive signal generation unit 121 from the vibration signal. The video signal of “video in which the fish holding the fishhook is rising toward the water surface” is input to the video display unit 134, which accordingly displays the corresponding video content 134b. The acoustic signal of “wave sound generated by the movement of the fish” is input to the acoustic presentation unit 135, which accordingly outputs the “wave sound generated by the movement of the fish”. In the “CATCH state”, the distance between the fish and the water surface decreases while the distance of the controller 11 from the video display unit 134 (for example, the distance of the component normal to the screen serving as the video display unit 134) is equal to or greater than a predetermined value. At this time, a force sense may be presented in which the larger the relative distance between the terminal device 13 and the pseudo force sense presentation device 111 (for example, the relative distance of the component normal to the screen serving as the video display unit 134), the larger the traction force pulling the controller 11 strongly in the direction of the video display unit 134 (fish video content 134b) (step S132q). Here, when the fish has not reached the water surface, the state transition unit 132 maintains the “CATCH state” in step S132p. On the other hand, when the fish reaches the water surface (when the distance between the fish and the water surface becomes 0), the state transition unit 132 causes the process to proceed to step S132r, thereby transitioning from the “CATCH state” to the “GOAL state”.
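A sketch of this CATCH-state behavior: once a threshold distance is exceeded, the presented traction grows with the screen-normal distance between the terminal device and the pseudo force sense presentation device. The threshold, gain, and cap values are illustrative assumptions, as is presenting zero extra pull below the threshold.

```python
def catch_traction(distance, threshold=0.15, gain=4.0, max_force=1.0):
    """CATCH-state traction sketch: `distance` is the screen-normal
    relative distance (e.g. in metres). Below `threshold` no extra pull
    is presented (and the fish does not rise); above it, the traction
    grows linearly with the excess distance, capped at `max_force`
    (normalized units)."""
    if distance < threshold:
        return 0.0
    return min(gain * (distance - threshold), max_force)
```

Larger pulls of the controller away from the screen thus map to a stronger perceived tug, consistent with the fish being "reeled in".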

  In the “GOAL state”, a video signal of “video showing that the fish has been caught” and an acoustic signal of a “melody sound notifying that the fish has been caught” are generated and output. The video signal is input to the video display unit 134, which accordingly displays the video content 134b of the “video showing that the fish has been caught”. The acoustic signal is input to the acoustic presentation unit 135, which accordingly outputs the “melody sound notifying that the fish has been caught” (step S132r).

  The state transition unit 132 monitors the input relative positional relationship and determines whether the controller 11 is located within a predetermined range from the video display unit 134 (step S132s). The “predetermined range” here may be the same as the “predetermined range” in steps S132b, S132d, S132g, and S132k, or may be a wider or narrower range. When it is determined that the controller 11 has not gone out of the predetermined range of the video display unit 134, the state transition unit 132 returns the process to step S132r, thereby maintaining the “GOAL state”. On the other hand, when it is determined that the controller 11 has gone out of the predetermined range of the video display unit 134, the state transition unit 132 returns the process to step S132a, thereby transitioning from the “GOAL state” to the “FREE state”.

  As described above, by controlling the pseudo force sense presentation device 111 so as to apply a traction force in the “SINK state”, in which the fish holding the fishhook descends in the water, and not to apply the traction force in the “RISE state”, in which the fish rises, an illusion can be generated as if a fishing line under tension existed between the video content 134b and the controller 11. In addition, by providing a state transition that increases the traction force (SINK state → CATCH state) when the controller 11 is pulled away from the video display unit 134 (screen) at an appropriate timing (in the SINK state), the realistic feeling of pulling in a fish can be expressed.

<Features of this embodiment>
In this embodiment, the force sense presented by the pseudo force sense presentation device 111 is determined according to the state of the output fish video content 134b and the relative positional relationship between the pseudo force sense presentation device 111 and the terminal device 13. Thus, a fishing game can be realized in which a force sense corresponding to the output fish video content 134b is presented from the pseudo force sense presentation device 111 and a realistic virtual experience can be enjoyed.

  In the fishing game of Non-Patent Document 1, a weight is hung from a pseudo force sense presentation device by a thread, and the relative positional relationship is detected by bringing the weight into contact with a touch-sensor-equipped screen that displays the video content of the fish. However, a mobile terminal device has a small screen, and this operation is difficult, particularly on a mobile terminal device that shakes or tilts. In this embodiment, since the force sense is determined according to the relative positional relationship between the pseudo force sense presentation device 111 and the terminal device 13, the operation is simpler than the method of Non-Patent Document 1, and the system is fully usable even on a mobile terminal device with a small screen.

  Further, in the fishing game of Non-Patent Document 1, the presented force sense could not be changed according to the relative position (for example, the relative distance or relative direction) in the normal direction of the screen between the video content of the fish and the pseudo force sense presentation device. In this embodiment, since the force sense is determined according to the relative positional relationship between the pseudo force sense presentation device 111 and the terminal device 13, the force sense to be presented can be changed according to the relative position in the normal direction of the screen between the video content of the fish and the pseudo force sense presentation device, that is, according to the three-dimensional positional relationship between the video content and the pseudo force sense presentation device. Thereby, the realism of the fishing game can be raised.

  Furthermore, the fishing game of Non-Patent Document 1 also had a problem specific to pseudo force sense presentation devices: if the same vibration is applied continuously, the tactile receptors adapt, and the force sense intensity experienced by the user attenuates over time. In this embodiment, the content state changes according to the relative positional relationship between the terminal device 13 and the pseudo force sense presentation device 111 and the time, and the force sense changes according to the content state. Therefore, attenuation of the force sense intensity due to adaptation of the tactile receptors can be prevented. In addition, by changing the vibration frequency, vibration intensity, or vibration pattern for presenting the force sense according to changes in the length of the virtual fishing line and the type of fish, the force sense that can be experienced in the game can be made richer.

  Further, by controlling the force sense to be presented based on the three-dimensional relative positional relationship between the fish video content 134b in the game video and the controller 11 so as to reproduce the natural law that the tension of the fishing line is proportional to its elongation, the illusion of being pulled by the fish in the screen can be given.

[Second Embodiment]
The force sense to be presented by the pseudo force sense presentation device may be determined according to the state of the output content and the relative positional relationship between the position specified by the orientation of the pseudo force sense presentation device and the output position of the content. For example, in the first embodiment, the relative position between the controller and the terminal device is calculated by photographing the controller with a state recognition unit such as a camera of the terminal device. Instead, the controller (pseudo force sense presentation device) may also have a pointing function, and the relative position between the pointed position and the content output from the terminal device may be used in place of the relative position between the controller and the terminal device.

  In this case, as illustrated in FIG. 1 and the like, the controller 11 of the first embodiment is replaced with a controller 21, and the terminal device 13 is replaced with a terminal device 23. The controller 21 has a pointer 212, such as a laser pointer, in addition to the pseudo force sense presentation device 111. The terminal device 23 is the same as the terminal device 13 except that it includes a state recognition unit 231 instead of the state recognition unit 131. The position pointed to by the pointer 212 of the controller 21 (a position specified by at least the orientation of the pseudo force sense presentation device 111 (its relative direction with respect to the terminal device 23), for example, a position specified by the orientation and position of the pseudo force sense presentation device 111 (its relative position with respect to the terminal device 23)) is sent to the state recognition unit 231. The state recognition unit 231 sends the relative positional relationship between that position and the video content displayed on the video display unit 134 (for example, a coordinate position indicating the mouth portion of the fish), for example, the distance between the two coordinate values of the pointed position and the position of the video content, to the state transition unit 132. The subsequent processing is the same as in the first embodiment. In this case, the content state may change according to the time and the relative positional relationship between the position specified by the orientation of the pseudo force sense presentation device 111 and the output position of the content, or a force sense may be presented in which the larger the relative distance between the position specified by the orientation of the pseudo force sense presentation device 111 and the output position of the content, the larger the perceived traction force toward the video display unit 134.

  The controller 21 does not need to have a pointer such as a laser pointer. For example, a marker pattern may be attached to the surface of the controller as described in the first embodiment, and the relative orientation of the pseudo force sense presentation device 111 with respect to the terminal device 23 (for example, the angle formed by the normal direction of the video display unit 134 of the terminal device 23 and the direction indicated by the pseudo force sense presentation device 111) may be detected based on how the marker pattern appears to a camera (state recognition unit 231) provided in the terminal device 23. As another method, each of the pseudo force sense presentation device 111 and the terminal device 23 may include a sensor for detecting tilt, and the detection result of the sensor detecting the tilt of the pseudo force sense presentation device 111 may be transmitted to the terminal device 23 by communication, whereby the terminal device 23 obtains the relative orientation between the pseudo force sense presentation device 111 and the terminal device 23.

  When the terminal device 23 obtains the relative orientation between the pseudo force sense presentation device 111 and the terminal device 23, the position designated by the pseudo force sense presentation device 111 may be specified based only on the information of the obtained orientation, or may be specified based on the relative position between the pseudo force sense presentation device 111 and the terminal device 23 in addition to the obtained orientation. When based only on the obtained orientation, for example, a relative orientation that points to a predetermined position (for example, the central portion) of the video display unit 134 of the terminal device 23 is determined in advance, and a position away from that predetermined position of the video display unit 134, in a direction and by a distance corresponding to the difference (difference in direction, difference in angle) between the obtained orientation and the predetermined orientation, may be regarded as the position designated by the pseudo force sense presentation device 111. On the other hand, when based on the relative position between the pseudo force sense presentation device 111 and the terminal device 23 in addition to the obtained orientation, a straight line may be extended in a predetermined direction from a predetermined position on the pseudo force sense presentation device 111 (including inside the device and in its vicinity), based on the position and orientation of the pseudo force sense presentation device 111, and the portion where this straight line intersects the video display unit 134 may be regarded as the position specified by the orientation of the pseudo force sense presentation device 111.
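The "position and orientation" variant amounts to a ray–screen intersection. A sketch, placing the screen in the plane z = 0 with the device at positive z in front of it (the coordinate conventions and function name are assumptions):

```python
def pointed_position(device_pos, device_dir, plane_z=0.0):
    """Intersect the ray from `device_pos` along `device_dir` with the
    screen plane z = plane_z. Returns the (x, y) intersection on the
    screen, or None if the ray is parallel to the screen or points
    away from it."""
    px, py, pz = device_pos
    dx, dy, dz = device_dir
    if dz == 0:
        return None          # ray parallel to the screen plane
    t = (plane_z - pz) / dz
    if t < 0:
        return None          # screen is behind the device
    return (px + t * dx, py + t * dy)
```

The returned coordinates would then be compared with the display coordinates of the video content, as in the proximity determination of the first embodiment.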

[Third Embodiment]
As a modification of the first embodiment and the second embodiment, the video display unit 134 that is a screen may be vibrated in synchronization with the force sense presented from the pseudo force sense presentation device 111. In this case, the terminal device 13 of the first embodiment is replaced with the terminal device 33. The terminal device 33 is the same as the terminal device 13 except that it includes a signal generation unit 333 instead of the signal generation unit 133 and further includes a vibration unit 337. The vibration unit 337 is disposed in the vicinity of the video display unit 134 and vibrates the video display unit 134 based on the input control signal.

  As described above, the signal generation unit 333 vibrates the video display unit 134, which is a screen, in synchronization with the force sense presented from the pseudo force sense presentation device 111, in accordance with the content state determined by the state transition unit 132. Thereby, the presence of the fish displayed on the screen can be enhanced.

[Other variations]
The present invention is not limited to the above-described embodiments. For example, in each of the above-described embodiments, the video content is displayed on the screen of the terminal device. However, the terminal device may have a projection function, and a projector projection surface projected by the terminal device may be used in place of the screen. In this case, the region projected by the projector may be acquired in advance by calibration, and the relative distance between the projected region and the controller may be used as the “relative positional relationship”. When the projector projection surface is used in place of the screen, the pointed position on the projector projection surface may be used as the “position specified by the orientation of the pseudo force sense presentation device” in the second embodiment.

  In each of the above-described embodiments, the drive device and the controller are separate housings, but the drive device and the controller may be included in one housing.

  In the above-described embodiments, the relative positional relationship between the controller (pseudo force sense presentation device) and the terminal device is recognized only by the state recognition unit (for example, a camera) of the terminal device. However, the relative positional relationship between the terminal device and the controller (pseudo force sense presentation device) may instead be obtained from a sensor (for example, an acceleration sensor) built into the controller, or from the drive device attached to the arm holding the controller, and input to the state recognition unit.

  Information representing the vibration signal corresponding to each state of the content (vibration period, asymmetry, force waveform amplitude, etc.) may be shared in advance between the terminal device and the drive device. In this case, the terminal device only needs to transmit information representing the state of the content to the drive device, and the drive device can obtain the vibration signal based on this information and generate the drive signal.

  In the above-described embodiments, six states, the “FREE state”, “HIT state”, “SINK state”, “RISE state”, “CATCH state”, and “GOAL state”, are exemplified as states of the content. However, these are examples and do not limit the present invention. The “content state” is not limited to a state that changes with time according to the progress of the game; for example, the type of fish or the size of the fish (fry, adult fish, and the like) may be used as the state. Conversely, the content state may be limited to only information that changes over time in the game. Furthermore, the display position of the content (not the relative position between the controller and the terminal device but, for example, the display coordinates of the fish video content within the screen serving as the video display unit of the terminal device) may or may not be used as an example of the “content state”.

  In the above-described embodiments, a fishing game for catching fish is taken as an example, but the present invention can be applied to other uses: for example, a tug-of-war game, other types of games such as a game that simulates the feel of pulling on riding reins with a pseudo force sense, or a skill-acquisition application for positioning in a predetermined task.

  In the above-described embodiments, an example was shown in which the relative position between the controller and the terminal device is extracted as the relative positional relationship. However, instead of the controller, the hand (or fingers) holding the controller may be the detection target, and the relative position between the hand and the terminal device may be used as the relative positional relationship. Alternatively, the combination of the controller and the hand (or fingers) may be the detection target, and the relative position between them and the terminal device may be used as the relative positional relationship.

  Further, the terminal device or the controller may include a gyro sensor, and the orientation of the controller relative to the screen of the terminal device (or its change over time), as detected by the terminal device or the controller, may be used as a trigger for changes in the content state (for example, the “FREE state”, “HIT state”, “CATCH state”, and so on).

  In addition, the various processes described above are not only executed in time series according to the description, but may also be executed in parallel or individually according to the processing capability of the apparatus that executes the processes, or as otherwise required. Needless to say, other modifications are possible without departing from the spirit of the present invention.

  When the above-described terminal device is realized by a computer, processing contents of functions that the terminal device should have are described by a program. By executing this program on a computer, the processing functions of the terminal device are realized on the computer. The program describing the processing contents can be recorded on a computer-readable recording medium. An example of a computer-readable recording medium is a non-transitory recording medium. Examples of such a recording medium are a magnetic recording device, an optical disk, a magneto-optical recording medium, a semiconductor memory, and the like.

  This program is distributed, for example, by selling, transferring, or lending a portable recording medium such as a DVD or CD-ROM on which the program is recorded. Alternatively, the program may be distributed by storing it in a storage device of a server computer and transferring it from the server computer to another computer via a network.

  A computer that executes such a program first stores, for example, the program recorded on a portable recording medium, or the program transferred from a server computer, in its own storage device. When executing the processing, the computer reads the program stored in its own storage device and executes processing according to the read program. As another execution form, the computer may read the program directly from the portable recording medium and execute processing according to it, or it may sequentially execute processing according to the program each time the program is transferred to it from the server computer. The above-described processing may also be executed by a so-called ASP (Application Service Provider) type service that realizes the processing functions only through execution instructions and result acquisition, without transferring the program from the server computer to the computer.

  In the above embodiment, the processing functions of the apparatus are realized by executing a predetermined program on a computer; however, at least a part of these processing functions may be realized by hardware.

11, 21 Controller
12 Driving device
13, 23, 33 Terminal device

Claims (10)

  1. An apparatus for determining a force sense to be presented by a pseudo force sense presentation device, the apparatus comprising:
    a determination unit configured to determine the force sense to be presented by the pseudo force sense presentation device according to a state of content output by the apparatus and a relative positional relationship between the pseudo force sense presentation device and the apparatus.
  2. An apparatus for determining a force sense to be presented by a pseudo force sense presentation device, the apparatus comprising:
    a determination unit configured to determine the force sense to be presented by the pseudo force sense presentation device according to a state of the content being output and at least a relative positional relationship between a position specified by the orientation of the pseudo force sense presentation device and an output position of the content.
  3. The apparatus according to claim 1 or 2, wherein the state of the content changes in response to A) the relative positional relationship between the apparatus and the pseudo force sense presentation device, or the relative positional relationship between the position specified by the orientation of the pseudo force sense presentation device and the output position of the content, and B) time.
  4. The apparatus according to claim 1 or 2, comprising:
    an output unit for outputting the content,
    wherein the determination unit determines the force sense such that a traction force toward the output unit is perceived more strongly as A) the relative distance between the apparatus and the pseudo force sense presentation device, or B) the relative distance between the position specified by the orientation of the pseudo force sense presentation device and the output position of the content, becomes larger.
  5. The apparatus according to claim 1 or 2, wherein the pseudo force sense presentation device is a device that presents the force sense by vibration and changes at least one of the period, asymmetry, and amplitude of the force waveform of the vibration in accordance with a change in the state of the content.
  6. The apparatus according to any one of claims 1 to 5, comprising:
    a screen for displaying the content; and
    a vibration unit that vibrates the screen in synchronization with the force sense presented by the pseudo force sense presentation device.
  7. The apparatus according to any one of claims 1 to 6, comprising a sensor for obtaining the relative positional relationship.
  8. A method for determining a force sense to be presented by a pseudo force sense presentation device, the method determining the force sense to be presented by the pseudo force sense presentation device according to a state of content output by an apparatus and a relative positional relationship between the pseudo force sense presentation device and the apparatus.
  9. A method for determining a force sense to be presented by a pseudo force sense presentation device, the method determining the force sense to be presented by the pseudo force sense presentation device according to a state of the content being output and a relative positional relationship between a position specified by the orientation of the pseudo force sense presentation device and an output position of the content.
  10. A program for causing a computer to function as the apparatus according to any one of claims 1 to 7.
JP2015190727A 2015-09-29 2015-09-29 Apparatus, method, and program for determining force sense to be presented Active JP6509698B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015190727A JP6509698B2 (en) 2015-09-29 2015-09-29 Apparatus, method, and program for determining force sense to be presented


Publications (2)

Publication Number Publication Date
JP2017063916A true JP2017063916A (en) 2017-04-06
JP6509698B2 JP6509698B2 (en) 2019-05-08

Family

ID=58490688

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015190727A Active JP6509698B2 (en) 2015-09-29 2015-09-29 Apparatus, method, and program for determining force sense to be presented

Country Status (1)

Country Link
JP (1) JP6509698B2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11169557A (en) * 1997-12-12 1999-06-29 Namco Ltd Image forming device and information storage medium
WO2002073385A1 (en) * 2001-03-08 2002-09-19 National Institute Of Advanced Industrial Science And Technology Method and unit for presenting inner force sense using gyro


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Wiiリモコン大百科/釣りマスター", デンゲキニンテンドーDS, vol. 第6巻第18号, JPN6018032146, 1 December 2006 (2006-12-01), JP *
五味裕章他3名: "力覚呈示ガジェットの研究開発", 画像ラボ, vol. 第26巻第7号, JPN6018032147, 10 July 2015 (2015-07-10), JP, pages 第41-44頁 *
宮本渓他1名: "赤外線を用いた座標取得方法を利用したARコンテンツ", 第72回(平成22年)全国大会講演論文集(4), JPN6018044785, 8 March 2010 (2010-03-08), JP, pages 第4-427〜4-428頁 *
雨宮智浩他3名: "指でつまむと引っ張られる感覚を生み出す装置「ぶるなび3」", NTT技術ジャーナル, vol. 第26巻第9号, JPN6018032136, 1 September 2014 (2014-09-01), JP, pages 第23-26頁 *

Also Published As

Publication number Publication date
JP6509698B2 (en) 2019-05-08

Similar Documents

Publication Publication Date Title
US9881420B2 (en) Inferential avatar rendering techniques in augmented or virtual reality systems
TWI470534B (en) Three dimensional user interface effects on a display by using properties of motion
RU2691589C2 (en) Non-visual feedback of visual change in a method and a tracking device
KR101643020B1 (en) Chaining animations
JP6092506B2 (en) Tactilely enhanced interactivity of interactive content
US8553049B2 (en) Information-processing apparatus and information-processing method
KR20140093970A (en) System and method for augmented and virtual reality
JP6001542B2 (en) System for enabling video capture of interactive applications on mobile devices
US20140364215A1 (en) Methods for Rendering Interactive Content to a Head Mounted Display
US20090221368A1 (en) Method and system for creating a shared game space for a networked game
US10137374B2 (en) Method for an augmented reality character to maintain and exhibit awareness of an observer
JP6335613B2 (en) System and method for tactile enabled adaptive and multi-faceted display
US20120229248A1 (en) Multipurpose controller for electronic devices, facial expressions management and drowsiness detection
CN102008823B (en) Method and system for controlling movements of objects in a videogame
KR20170026567A (en) Three dimensional contextual feedback
US9833705B2 (en) Storage medium storing information processing program, information processing apparatus and information processing method
JP6316387B2 (en) Wide-area simultaneous remote digital presentation world
US8540571B2 (en) System and method for providing haptic stimulus based on position
US9905052B2 (en) System and method for controlling immersiveness of head-worn displays
US9367136B2 (en) Holographic object feedback
KR20120112720A (en) Simulation of three dimensional motion using haptic actuators
US9599821B2 (en) Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
US20050093887A1 (en) Image processing system, image processing apparatus, and display apparatus
US7536655B2 (en) Three-dimensional-model processing apparatus, three-dimensional-model processing method, and computer program
JP6539351B2 (en) Sensory feedback system and method for guiding a user in a virtual reality environment

Legal Events

Date Code Title Description
A621 Written request for application examination; JAPANESE INTERMEDIATE CODE: A621; Effective date: 20170829
A131 Notification of reasons for refusal; JAPANESE INTERMEDIATE CODE: A131; Effective date: 20180821
A521 Written amendment; JAPANESE INTERMEDIATE CODE: A523; Effective date: 20181011
A131 Notification of reasons for refusal; JAPANESE INTERMEDIATE CODE: A131; Effective date: 20181120
A521 Written amendment; JAPANESE INTERMEDIATE CODE: A523; Effective date: 20181207
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model); JAPANESE INTERMEDIATE CODE: A01; Effective date: 20190402
A61 First payment of annual fees (during grant procedure); JAPANESE INTERMEDIATE CODE: A61; Effective date: 20190403
R150 Certificate of patent or registration of utility model; Ref document number: 6509698; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150