WO2012053141A1 - Terminal device and information processing system - Google Patents
Terminal device and information processing system
- Publication number
- WO2012053141A1 (PCT application PCT/JP2011/004538)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- terminal device
- unit
- image
- contact
- information processing
- Prior art date
Classifications
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0412—Digitisers, e.g. for touch screens or touch pads, structurally integrated in a display
- A63F13/235—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console, using a wireless connection, e.g. infrared or piconet
- A63F2300/1043—Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals, characterized by constructional details
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat or touch pad, using a touch screen
- A63F2300/301—Features of games using an electronically generated display having two or more dimensions, characterized by output arrangements for receiving control signals generated by the game device, using an additional display connected to the game console, e.g. on the controller
Description
- The present invention relates to a terminal device that operates as an input device or an output device, and to an information processing system using the terminal device.
- A conventional game controller has an input interface such as a cross key and buttons, and is connected to the game device by wire to transmit the user's operation data to the game device.
- The user holds the left and right grips of the controller with both hands and operates the cross key and buttons with the fingers.
- In recent years, controllers that can wirelessly transmit and receive data to and from game devices have become popular owing to advances in wireless technology. Since a wireless controller requires no wiring to the game device, the user can play the game from any position.
- Although the wireless controller disclosed in Patent Document 1 transmits and receives data wirelessly, its input interface, such as the cross key and buttons, is unchanged. Therefore, there is no difference in operability between the conventional wired controller and the wireless controller disclosed in Patent Document 1.
- By providing the controller with a new input interface, the range of user operations can be expanded, and the variety of game applications that take advantage of that operability can be increased.
- Conventionally, the controller is used as an input device that accepts operations by the user.
- In addition, by providing the controller with a function as an output device that outputs image data from the game device, the development of unprecedented game applications can be expected.
- Such a new input interface facilitates the development of unprecedented applications not only in the game field but also for processing devices in other fields.
- An object of the present invention is to provide a terminal device that realizes new operability and can be used not only as an input device but also as an output device.
- This terminal device may be used as an input device or as an output device.
- Another object of the present invention is to provide an information processing system using this terminal device.
- A terminal device according to the present invention is a portable terminal device having a curved surface shape or a substantially curved surface shape, and includes: a detection unit that detects contact with the surface of the terminal device; a display unit that displays an image; a transmission unit that transmits a detection value from the detection unit; a reception unit that receives image data for generating an image; and a control unit that generates an image to be displayed on the display unit using the received image data.
- This information processing system includes a terminal device and an information processing device. The terminal device includes: a detection unit that detects contact with the surface of the terminal device; a display unit that displays an image; a transmission unit that transmits a detection value from the detection unit to the information processing device; a reception unit that receives image data for generating an image from the information processing device; and a control unit that generates an image to be displayed on the display unit using the received image data.
- The information processing device includes: a receiving unit that receives the detection value from the terminal device; an application processing unit that generates image data based on the detection value; and a transmission unit that transmits the image data to the terminal device.
- According to the present invention, a terminal device that realizes new operability can be provided.
- FIG. 1 is a diagram showing the use environment of an information processing system according to an embodiment of the present invention.
- FIG. 2 is a diagram showing an example of the external configuration of a terminal device.
- FIG. 3 is a diagram showing a cross section of the terminal device shown in FIG. 2.
- FIG. 4 is a diagram showing the structure inside the inner shell.
- FIG. 5 is a diagram showing another example of the external configuration of the terminal device.
- FIG. 6 is a diagram showing a cross section of the terminal device shown in FIG. 5.
- FIG. 7 is a diagram showing the functional blocks of the terminal device.
- FIG. 8 is a diagram showing the functional blocks of the information processing device.
- FIG. 9 is a diagram for explaining the process of specifying the relative position of the terminal device with respect to the output device.
- FIG. 10 is a diagram showing a state in which the terminal device is held by the user.
- FIG. 11(a) is a diagram showing a character displayed on the screen of the output device; FIG. 11(b) is a diagram showing the terminal device being pressed against the character displayed on the screen; FIG. 11(c) is a diagram showing the character displayed on the terminal device.
- FIG. 12(a) is a diagram showing a state in which the terminal device is pinched by the user's fingers; FIG. 12(b) is an explanatory diagram showing the contact state detected by the touch sensors when the terminal device is pinched by five fingers.
- FIG. (a) is a diagram showing a game image displayed on the screen of the output device; (b) is a diagram showing a game image displayed on the terminal device.
- FIG. 1 shows a use environment of an information processing system 1 according to an embodiment of the present invention.
- The information processing system 1 includes an information processing device 10, an output device 20, and a terminal device 100.
- The output device 20 is, for example, a television, and outputs images and sound.
- The information processing device 10 is connected to the output device 20 and generates the images and sound to be output from the output device 20.
- The information processing device 10 and the output device 20 may be connected by wire or wirelessly.
- The terminal device 100 has a wireless communication function and can transmit and receive data by establishing a wireless link with the information processing device 10.
- The information processing system 1 may be a game system that provides the user with an environment for playing a game.
- In that case, the information processing device 10 is a game device that executes game software, and the terminal device 100 is a controller device with which the user provides input to the game.
- The information processing device 10 executes the game software based on operation information input by the user through the terminal device 100, and outputs image data and audio data indicating the execution result to the output device 20.
- The technique shown in the present embodiment can be used not only in games but also in information processing systems including processing devices that execute other types of applications.
- The portable terminal device 100 has a curved surface shape or a substantially curved surface shape.
- Here, a substantially curved surface means a surface that approximates a curved surface by arranging a plurality of planes side by side.
- The terminal device 100 may have a substantially spherical shape as shown in FIG. 1, or may have an oval shape.
- The terminal device 100 only needs to have a curved surface shape or a substantially curved surface shape in a part thereof, and the entire shape need not be a curved surface or a substantially curved surface; however, it is preferable that at least half of the entire outer surface has a curved surface shape or a substantially curved surface shape.
- The center of gravity is preferably set so that the terminal device 100 maintains its posture when placed.
- For example, when the terminal device 100 is spherical, the center of gravity of the terminal device 100 is set to be positioned at the center of the sphere, so that the terminal device 100 does not roll and maintains its posture when it is placed.
- When the terminal device 100 has a shape other than a sphere, for example an oval shape, the weight balance is preferably set so that the center of gravity is located at the bottom rather than the top, so that the posture is maintained when the bottom is placed on a flat surface.
- Alternatively, a center-of-gravity adjusting mechanism may dynamically set the center of gravity so that the posture when placed is maintained.
- The surface of the terminal device 100 constitutes an operation surface operated by the user. Therefore, the operation surface is preferably formed as a curved surface or a substantially curved surface in order to provide a smooth operation feeling. Further, as shown in FIG. 1, the terminal device 100 preferably has a size that can be held by the user with one hand. Thereby, the user can operate the operation surface while holding the terminal device 100 with one hand, and can also freely operate the operation surface with the other hand while holding the terminal device 100 with one hand.
- The terminal device 100 operates as an input device that is operated by the user and transmits operation input data to the information processing device 10.
- The terminal device 100 includes a contact sensor that detects contact with its surface. When the surface of the terminal device 100 comes into contact with any object, such as the user's finger or the display (screen) of the output device 20, the contact sensor detects the contact position. At this time, not only the position but also the pressure may be detected.
- The contact sensor detects contact with the surface of the terminal device 100 when the user changes the position or strength of a finger touching the terminal device 100, or when the terminal device 100 is pressed against the screen of the output device 20. The terminal device 100 transmits the detection value to the information processing device 10.
- The contact sensor is provided continuously over the curved surface portion of the terminal device 100, so that the user can obtain a smooth operation feeling by moving a finger over the curved surface.
- The contact sensor is preferably provided so that it can detect contact over the entire surface of the terminal device 100, whereby the user can perform input operations using the entire surface of the terminal device 100.
- The terminal device 100 transmits input data generated based on the user's contact operation to the information processing device 10.
- Note that the contact sensor only needs to be provided on the operation surface.
- In that case, the operation surface is preferably formed as a continuous surface on the terminal device 100 so that the user can easily operate it.
- The terminal device 100 also has a motion sensor that generates sensor values for detecting its posture.
- The motion sensor has a triaxial acceleration sensor and a triaxial angular velocity sensor.
- The motion sensor may further include a triaxial geomagnetic sensor.
- The information processing device 10 can process changes in the posture of the terminal device 100 as input data to the application being executed. Therefore, the terminal device 100 periodically transmits the detection values of the motion sensor to the information processing device 10.
- The terminal device 100 also includes a display unit and operates as an output device that displays images.
- The terminal device 100 receives image data for generating an image from the information processing device 10, and displays the image on the display unit based on the image data.
- The terminal device 100 may also generate an image from information stored in its own storage device. For example, a game character or the like is displayed on the display unit.
- A display object such as a game character is drawn based on the detection values of the contact sensor, the motion sensor, and the like.
- This display control may be executed by the information processing device 10, or may be executed by a control unit mounted on the terminal device 100.
- The display unit is provided on the curved surface portion of the terminal device 100.
- For example, the display unit is formed of an EL (electroluminescence) panel that can be flexibly bent, and the display panel is attached to the curved surface portion.
- Here, the display unit is configured by combining a plurality of display panels, but it may be configured by a single display panel.
- The display unit may also be configured by bonding a plurality of liquid crystal panels to the respective faces of a substantially curved polyhedron.
- Here, a substantially curved polyhedron refers to a polyhedron having 20 or more faces.
- By continuously disposing a plurality of display panels so that they cover the entire surface of the terminal device 100 without gaps, the terminal device 100 can display an image over its entire surface.
- As a result, the display object can be interlocked with changes in the state of the terminal device 100 detected by the contact sensor and the motion sensor.
- In this way, the terminal device 100 operates as an input/output device that acquires the user's operations as sensor values and displays images reflecting those sensor values. Since the terminal device 100 has both the sensor value acquisition function and the image display function, it can provide an intuitive operation feeling to the user.
- The terminal device 100 thus has a touch panel function in which the display unit and the contact sensor overlap each other.
- The display unit may also be a screen onto which a projector arranged inside the terminal device 100 projects light so that the user can visually recognize the image from the outside.
- In this case, the interior of the terminal device 100 is hollow, and the display unit is formed of a transparent or translucent material that transmits the light from the projector to the outside.
- A core on which a processor is mounted is provided at the center of the space inside the terminal device 100.
- A plurality of projectors are provided on the core.
- FIG. 2 shows an example of the external configuration of the terminal device 100.
- The terminal device 100 includes a display unit 112 that displays an image on its surface.
- The display unit 112 is configured by combining a plurality of display panels 110a to 110v (hereinafter referred to as "display panels 110" when they are not distinguished).
- A plurality of display panels 110 are also provided on the back side of the terminal device 100 shown in FIG. 2.
- By covering the entire outer surface with the plurality of display panels 110 without gaps, the terminal device 100 can display an image over its entire surface.
- The terminal device 100 includes a control unit that generates the image data to be displayed on each display panel 110.
- The display panels 110 are formed of EL panels, liquid crystal panels, or the like.
- A panel that can be curved, such as an EL panel, is formed into a curved surface and attached directly to the surface of the sphere.
- The plurality of display panels 110 may also be arranged on the faces of a truncated icosahedron, in which 20 regular hexagons and 12 regular pentagons are combined as in a soccer ball.
- In that case, the surface of the terminal device 100 is protected by a transparent material such as resin and forms a curved surface shape or a substantially curved surface shape.
- For example, even when the display panels 110 are liquid crystal panels attached to the respective faces of a truncated icosahedron, the surface of the terminal device 100 is covered with resin so as to have a curved shape, so that the user can obtain a smooth operation feeling when moving a finger over the surface.
- FIG. 3 shows a cross section of the terminal device 100 shown in FIG.
- An inner core 140 is disposed in the center of the inner space of the terminal device 100.
- The inner core 140 is provided with a control unit that controls the processing of the terminal device 100, a communication unit that communicates with external devices, sensors for detecting the movement and posture of the terminal device 100, and the like.
- A transparent protective layer 130 covering the whole of the terminal device 100 forms the outer shell, and a display unit 112 consisting of a plurality of display panels 110 provided without gaps is disposed on the inner surface of the protective layer 130.
- Inside the display unit 112, a contact detection unit 122 that detects contact with the surface of the terminal device 100 is provided.
- The contact detection unit 122 includes a plurality of touch sensors 120.
- The plurality of touch sensors 120 are supported by an inner shell 134, which is a hollow sphere, and are preferably provided so as to be able to detect contact over the entire surface of the terminal device 100.
- For this purpose, the plurality of touch sensors 120 are arranged on the inner shell 134 without gaps.
- The contact detection unit 122 may include other sensors, such as pressure sensors.
- The inner shell 134 is preferably formed of a flexible resin material.
- The inner shell 134 is supported by rod-shaped support members 132 extending from the inner core 140. Thereby, the inside of the inner shell 134 becomes a hollow space, and the weight of the terminal device 100 can be reduced.
- The support members 132 are preferably formed of an elastic material, such as springs, so that the user can crush or deform the terminal device 100 to some extent. Accordingly, the user can recognize that an input operation has been performed because the surface of the spherical terminal device 100 is recessed, and an unprecedented input interface can be realized.
- The inner core 140 and the display unit 112 are electrically connected by, for example, wiring provided inside the support members 132.
- Wiring between the inner core 140 and the contact detection unit 122 may be provided in the same way. Since the terminal device 100 shown in FIG. 3 has a hollow structure, such wiring can be provided in this space.
- FIG. 4 shows the inner structure of the inner shell 134.
- In this example, six support members 132 are provided from the inner core 140 toward the inner shell 134, and are configured to contract in the length direction when pressed.
- The number of support members 132 may be more than six; it is sufficient that, while the spherical shape of the terminal device 100 is maintained, the pressed portion deforms when pressed by the user's finger or the like and returns to the original spherical shape when the finger is released.
- The terminal device 100 may instead have a solid structure filled with an elastic material.
- The hollow structure is superior in terms of weight reduction, while the solid structure has the advantage that the shape can be maintained stably.
- FIG. 5 shows another example of the external configuration of the terminal device 100.
- The terminal device 100 of this example has projectors inside and projects images onto the display unit 112 from the inside.
- The display unit 112 is formed of a material that reflects the light from the projectors so that a user outside can see the projected image.
- Therefore, the terminal device 100 of this example needs to have a hollow structure.
- FIG. 6 shows a cross section of the terminal device 100 shown in FIG.
- An inner core 140 is disposed in the center of the inner space of the terminal device 100.
- The inner core 140 includes a control unit, a communication unit, various sensors, and the like.
- A plurality of projectors 142 are provided on the surface of the inner core 140.
- A display unit 112 serving as a transparent or translucent projection surface is provided on the inner side of the transparent protective layer 130. Further, as in the configuration shown in FIG. 3, a contact detection unit 122 is formed inside the display unit 112.
- Here, the inner shell 134 and the contact detection unit 122 are formed of a transparent material so that the image light passes through them.
- A plurality of projectors 142 are provided on the inner core 140, and by combining the light output from the projectors 142, the image light reaches the entire display unit 112.
- FIG. 7 shows functional blocks of the terminal device 100.
- The terminal device 100 includes the display unit 112 and the contact detection unit 122 on its surface, and includes a control unit 200, a communication unit 150, and a motion sensor 160 in the inner core 140.
- Each component of the terminal device 100 is supplied with power from a battery (not shown).
- For example, the battery is charged by wireless power feeding.
- The communication unit 150 includes a transmission unit 152 and a reception unit 154, and transmits and receives data to and from the information processing device 10 using a predetermined communication protocol such as IEEE 802.11 or IEEE 802.15.1.
- The motion sensor 160 is a detection unit that detects data for determining the movement and posture of the terminal device 100, and includes a triaxial acceleration sensor 162, a triaxial angular velocity sensor 164, and a triaxial geomagnetic sensor 166.
- The control unit 200 receives the detection values from the contact detection unit 122 and the motion sensor 160 and supplies them to the communication unit 150.
- The plurality of touch sensors 120 each supply a detection value to the control unit 200. At this time, each touch sensor 120 may send its detection value to the control unit 200 together with an identification number for identifying itself, so that the identification number and the detection value reach the control unit 200 in association with each other.
- Alternatively, the control unit 200 may identify each touch sensor 120 by the port to which its detection value is input and associate the identification number with the detection value. In this way, the control unit 200 associates the identification number of each touch sensor 120 with its detection value and transmits the result from the transmission unit 152 to the information processing device 10.
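- As a rough illustration of the association just described, the following Python sketch pairs hypothetical touch-sensor identification numbers with detection values before handing them to the transmission unit; the sensor-reading callables and the payload format are assumptions for illustration, not details taken from this publication.

```python
# Sketch: pairing touch-sensor identification numbers with detection values
# before transmission. All names and the payload format are illustrative.

def read_touch_sensors(sensors):
    """Collect (identification number, detection value) pairs.

    `sensors` maps an identification number to a callable that returns the
    current detection value (e.g. 1 for contact, 0 for no contact).
    """
    return [(sensor_id, read()) for sensor_id, read in sensors.items()]

def build_detection_packet(samples):
    """Serialize the pairs into a simple payload for the transmission unit."""
    return ";".join(f"{sensor_id}:{value}" for sensor_id, value in samples)

# Example with three hypothetical sensors.
sensors = {0: lambda: 1, 1: lambda: 0, 2: lambda: 1}
packet = build_detection_packet(read_touch_sensors(sensors))
print(packet)  # "0:1;1:0;2:1" -> handed to the transmission unit 152
```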
- The reception unit 154 receives image data from the information processing device 10 at a predetermined cycle.
- For example, the reception cycle is set to 10 milliseconds, the same as the transmission cycle.
- When the control unit 200 receives the image data from the reception unit 154, it generates the images to be displayed on the display unit 112 and displays them on the respective display panels 110.
- The drawing data of the images to be displayed on the display panels 110 may be generated by the control unit 200 or by the information processing device 10.
- In the configuration shown in FIG. 6, when an image is projected from the projectors 142, the control unit 200 provides the image data to the projectors 142.
- The terminal device 100 may operate as an output device that includes not only the display unit 112 but also a speaker.
- In that case, the reception unit 154 receives audio data from the information processing device 10, and the speaker outputs the audio.
- The audio data is compressed together with the image data and transmitted, and the control unit 200 outputs the drawing data to the display unit 112 and the audio data to the speaker.
- Furthermore, the reception unit 154 may receive a vibrator drive signal from the information processing device 10, and the control unit 200 may drive a vibrator.
- FIG. 8 shows functional blocks of the information processing apparatus 10.
- The information processing device 10 includes a communication unit 30, a terminal information processing unit 40, and a control unit 50.
- The communication unit 30 includes a transmission unit 32 and a reception unit 34, and transmits and receives data to and from the terminal device 100 using a predetermined communication protocol such as IEEE 802.11 or IEEE 802.15.1.
- The terminal information processing unit 40 includes a posture specifying unit 42 and a position specifying unit 44.
- The posture specifying unit 42 specifies the posture and movement of the terminal device 100, and the position specifying unit 44 specifies the position of the terminal device 100 in space. Specifically, the position specifying unit 44 specifies the position on the screen of the output device 20 when the terminal device 100 touches the display screen of the output device 20.
- The control unit 50 includes a key setting unit 52 and an application processing unit 54.
- The key setting unit 52 has a function of setting operation keys at arbitrary positions on the operation surface of the terminal device 100 when the terminal device 100 is used as a game controller.
- The application processing unit 54 has a function of acquiring sensor information and terminal information from the terminal device 100 and the terminal information processing unit 40 and reflecting them in the processing of the application.
- Each element described here as a functional block performing various processes can be implemented, in terms of hardware, by a CPU (central processing unit), memory, and other LSIs, and, in terms of software, by a program loaded into memory. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, by software only, or by a combination thereof, and they are not limited to any one of these.
- The reception unit 34 receives the detection values of the motion sensor 160 from the terminal device 100 and passes them to the posture specifying unit 42.
- The posture specifying unit 42 holds in advance, as a reference value, the detection value of the triaxial acceleration sensor 162 when the terminal device 100 is at rest in its reference posture. If the terminal device 100 is stationary, the posture specifying unit 42 specifies the current posture from the difference between the received detection value and the reference value. When the terminal device 100 is moving, the posture specifying unit 42 specifies the current posture from the detection value of the triaxial acceleration sensor 162 and the detection value of the triaxial angular velocity sensor 164.
- The posture specifying unit 42 also specifies the movement of the terminal device 100 from the detection values of the triaxial acceleration sensor 162 and the triaxial angular velocity sensor 164.
- The specified movement of the terminal device 100 includes its moving direction and moving amount.
- The posture specifying unit 42 may also specify the posture and movement of the terminal device 100 in consideration of the detection value of the geomagnetic sensor 166.
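- The following Python sketch illustrates one way the posture could be estimated from triaxial acceleration and angular-velocity detection values as described above: the gravity direction is used while the terminal device is stationary, and the angular velocity is integrated while it is moving. The fusion rule, thresholds, and axis conventions are illustrative assumptions, not details from this publication.

```python
# Sketch: estimating posture from three-axis acceleration and angular-velocity
# values, in the spirit of the posture specifying unit 42.
import math

GRAVITY = 9.8          # m/s^2, reference magnitude when the device is at rest
REST_TOLERANCE = 0.3   # deviation from gravity treated as "stationary"

def is_stationary(accel):
    ax, ay, az = accel
    return abs(math.sqrt(ax*ax + ay*ay + az*az) - GRAVITY) < REST_TOLERANCE

def tilt_from_accel(accel):
    """Roll/pitch (radians) from the gravity direction, valid at rest."""
    ax, ay, az = accel
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay*ay + az*az))
    return roll, pitch

def update_posture(posture, accel, gyro, dt):
    """posture = (roll, pitch, yaw); gyro gives angular velocity in rad/s."""
    roll, pitch, yaw = posture
    if is_stationary(accel):
        # At rest: take the orientation directly from the gravity vector.
        roll, pitch = tilt_from_accel(accel)
    else:
        # Moving: integrate the angular-velocity detection values.
        roll += gyro[0] * dt
        pitch += gyro[1] * dt
        yaw += gyro[2] * dt
    return (roll, pitch, yaw)

posture = (0.0, 0.0, 0.0)
posture = update_posture(posture, accel=(0.0, 0.0, 9.8), gyro=(0, 0, 0), dt=0.01)
print(posture)
```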
- Since the terminal device 100 shown in the embodiment has a spherically symmetric shape, the user cannot tell the current posture of the terminal device 100 even after rotating or tilting it. Therefore, it is important for the posture specifying unit 42 to grasp the posture of the terminal device 100 in real time using the detection values of the motion sensor 160 when executing an application described later.
- The position specifying unit 44 specifies the position of the terminal device 100.
- In the present embodiment, the position specifying unit 44 only needs to be able to specify the relative positional relationship between the terminal device 100 and the output device 20.
- Specifically, it is sufficient that the position on the screen of the output device 20 can be specified when the terminal device 100 touches the screen of the output device 20. Further, it is preferable that the position specifying unit 44 can also specify the distance between the terminal device 100 and the screen of the output device 20.
- For example, a camera that captures the screen of the output device 20 may be arranged so that the position specifying unit 44 can specify the position at which the terminal device 100 touches the screen.
- In this case, the position specifying unit 44 specifies the contact position on the screen of the output device 20 at that time.
- The position specifying unit 44 may determine whether or not the terminal device 100 has touched some object based on the detection value of the contact detection unit 122, and, when contact is detected by this contact determination, may specify the contact position on the screen of the output device 20 from the image captured by the camera.
- FIG. 9 is a diagram for explaining the process in which the position specifying unit 44 specifies the relative position of the terminal device 100 with respect to the output device 20 from the image captured by the camera.
- FIG. 9(a) shows a state in which a camera 60 is arranged at the upper right corner of the screen of the output device 20.
- The position specifying unit 44 virtually sets an X axis in the horizontal direction of the screen and a Y axis in the vertical direction of the screen; the coordinates of the upper right corner of the screen are set to (0, 0), the upper left corner to (Xmax, 0), the lower left corner to (Xmax, Ymax), and the lower right corner to (0, Ymax).
- The arrangement position of the camera 60 is not limited to the upper right corner of the screen of the output device 20; it may be any place where the entire screen of the output device 20 falls within the angle of view of the camera 60 so that the camera 60 can capture the entire screen.
- For example, the camera 60 may be installed at the upper center of the screen of the output device 20.
- FIG. 9(a) also shows a state in which the user presses the terminal device 100 against the screen of the output device 20.
- FIG. 9(b) shows the image captured by the camera 60.
- The position specifying unit 44 specifies, from the captured image, the coordinates (x, y) in the XY space at which the terminal device 100 is located.
- When the position specifying unit 44 detects contact of the terminal device 100 based on the detection value of the contact detection unit 122, it specifies the contact point between the terminal device 100 and the output device 20 in the captured image, and thereby obtains the coordinates (x, y) as the contact position on the screen of the output device 20.
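- A minimal sketch of the coordinate conversion implied above: given the pixel positions of the screen corners in the captured image and the pixel position of the terminal device 100, the position specifying unit could interpolate the (x, y) contact position in the screen coordinate system defined earlier. The axis-aligned assumption and all numeric values are illustrative.

```python
# Sketch: converting the terminal device's pixel position in the captured image
# into the (X, Y) screen coordinates defined above. Assumes the screen appears
# axis-aligned in the image; corner detection itself is outside this sketch.

def image_to_screen(px, py, corners, xmax, ymax):
    """corners = pixel positions of the screen's upper-right, upper-left,
    and lower-right corners in the captured image."""
    (urx, ury), (ulx, uly), (lrx, lry) = corners
    # Fraction of the way from the upper-right corner toward the upper-left
    # corner (X axis) and toward the lower-right corner (Y axis).
    fx = (px - urx) / (ulx - urx)
    fy = (py - ury) / (lry - ury)
    return fx * xmax, fy * ymax

# Hypothetical values: screen corners found at these pixels, terminal at (300, 250).
corners = ((640, 40), (40, 40), (640, 440))
x, y = image_to_screen(300, 250, corners, xmax=1920, ymax=1080)
print(round(x), round(y))  # approximate contact position (x, y) on the screen
```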
- The position specifying unit 44 can also specify the relative position of the terminal device 100 with respect to the output device 20 from the captured image of the camera 60 alone, without using the result of the contact determination.
- For example, the position specifying unit 44 may detect whether the terminal device 100 is in contact with the screen of the output device 20 by analyzing the captured image.
- The position specifying unit 44 may also be configured to derive the distance between the terminal device 100 and the screen of the output device 20 by image analysis.
- Alternatively, a camera having an optical axis parallel to the screen may be separately installed, and the position specifying unit 44 may determine contact, or derive the separation distance, from the image captured by that camera.
- The position at which the terminal device 100 touches the screen may also be specified by configuring the screen of the output device 20 as a touch panel.
- As described above, the position specifying unit 44 determines whether or not the terminal device 100 has touched some object based on the detection value of the contact detection unit 122. If the timing at which the terminal device 100 is determined to be in contact based on the detection value of the contact detection unit 122 coincides with the timing at which the touch panel of the output device 20 is touched, the position specifying unit 44 determines that the terminal device 100 has touched the position on the touch panel of the output device 20 where that touch occurred. Contact between the terminal device 100 and the screen of the output device 20 may also be determined using not only the coincidence of the timing but also information on the contact areas.
- As described above, the terminal device 100 periodically transmits to the information processing device 10 detection value data in which the identification number of each touch sensor 120 is associated with its detection value.
- The information processing device 10 holds a table in which the identification number of each touch sensor 120 is associated with the position of that touch sensor 120 in the terminal device 100. This table is held by the control unit 50 and used by the control unit 50 to generate data to be reflected in the processing of the application, but it may also be held by the position specifying unit 44.
- By holding this table, the position specifying unit 44 can specify the position, in the terminal device 100, of a touch sensor 120 that has detected contact.
- The position specifying unit 44 derives the contact area on the terminal device 100 from the plurality of touch sensors 120 that have newly detected contact. Since the terminal device 100 is held by the user's fingers, touch sensors 120 that detect contact with the fingers exist even before the terminal device 100 contacts the output device 20; the position specifying unit 44 therefore specifies the touch sensors 120 that have newly detected contact, and thereby specifies the contact area on the terminal device 100 at the moment the terminal device 100 contacts the output device 20. The position specifying unit 44 then compares the area touched on the touch panel of the output device 20 with the area newly contacted on the terminal device 100, and determines contact between the terminal device 100 and the screen of the output device 20. By using a touch panel, the contact position on the screen can be specified easily.
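- As an illustration of the contact determination just described, the sketch below matches the timing of a newly detected contact on the terminal device with a touch event on the output device's touch panel and compares the two contact areas; the time window and area tolerance are assumed values, not taken from this publication.

```python
# Sketch: deciding that the terminal device touched the touch-panel screen by
# matching event timing and comparing contact areas. Constants are illustrative.

TIMING_WINDOW = 0.05   # seconds within which the two events count as simultaneous
AREA_TOLERANCE = 0.5   # relative difference allowed between the two contact areas

def new_contact_sensors(previous_on, current_on):
    """Touch sensors that report contact now but did not in the previous cycle
    (i.e. contact not caused by the fingers already holding the device)."""
    return current_on - previous_on

def is_screen_contact(terminal_time, terminal_area, panel_time, panel_area):
    if abs(terminal_time - panel_time) > TIMING_WINDOW:
        return False
    if panel_area == 0:
        return False
    return abs(terminal_area - panel_area) / panel_area <= AREA_TOLERANCE

prev_on = {3, 4, 9}                    # sensors already covered by the gripping hand
curr_on = {3, 4, 9, 21, 22, 23}        # three sensors newly report contact
fresh = new_contact_sensors(prev_on, curr_on)
print(is_screen_contact(terminal_time=10.02, terminal_area=len(fresh) * 1.2,
                        panel_time=10.03, panel_area=4.0))
```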
- In this way, the position specifying unit 44 specifies the position on the screen when the terminal device 100 touches the screen of the output device 20.
- Usage examples of the terminal device 100 in the information processing system 1 will now be described. Note that the processing and functions of the information processing device 10 described in one usage example can also be applied to other usage examples.
- FIG. 10 shows a state where the terminal device 100 is held by the user.
- The palm and fingers come into contact with the surface of the terminal device 100.
- The touch sensors 120 located under the contact areas supply detection values indicating contact to the control unit 200.
- The control unit 200 associates the identification number of each touch sensor 120 with its detection value to generate detection value data, and the transmission unit 152 transmits the detection value data to the information processing device 10.
- The key setting unit 52 specifies the type of each finger from the detection values obtained by the touch sensors 120.
- As described above, the control unit 50 holds a table in which the identification number of each touch sensor 120 is associated with the position of that touch sensor 120 in the terminal device 100. Accordingly, when the key setting unit 52 receives the detection value data, it can reproduce the contact state of the terminal device 100 by developing the included identification numbers and detection values on a virtual sphere. The key setting unit 52 detects the palm from the reproduced contact state.
- When the key setting unit 52 reproduces the contact state on the virtual sphere from the detection value data, five continuous elongated contact areas and a large continuous contact area located near the ends of the five contact areas are detected. First, the key setting unit 52 identifies the large contact area as the palm. Next, among the five elongated contact areas extending from the palm, the two contact areas extending from both ends of the palm are specified. The key setting unit 52 compares these two contact areas in the width (short-side) direction, and specifies the thicker one as the thumb and the thinner one as the little finger.
- The key setting unit 52 may instead compare the lengths, in the longitudinal direction, of the contact areas at both ends of the elongated areas extending from the palm and specify, for example, the shorter one as the thumb and the other as the little finger. Further, the key setting unit 52 may compare the areas of the contact areas at both ends and specify, for example, the one with the larger area as the thumb and the one with the smaller area as the little finger. By specifying the thumb and the little finger in this way, the fingers of the three contact areas between the thumb and the little finger can be specified, and the tip position of each finger can also be specified.
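- The palm/thumb/little-finger identification described above could be sketched as follows; the representation of a contact region (area, width, order around the palm) is an illustrative assumption rather than something specified in this publication.

```python
# Sketch: identifying the palm, thumb and little finger from contact regions
# reproduced on the virtual sphere, following the comparisons described above.

def identify_fingers(regions):
    """`regions` is a list of dicts with 'area', 'width' and 'order', where
    'order' runs across the five elongated regions extending from the palm."""
    palm = max(regions, key=lambda r: r["area"])          # largest region
    fingers = sorted((r for r in regions if r is not palm),
                     key=lambda r: r["order"])
    first, last = fingers[0], fingers[-1]                  # regions at both ends
    # The thicker end region is taken as the thumb, the thinner one as the
    # little finger; the three regions in between follow from that.
    thumb, little = (first, last) if first["width"] > last["width"] else (last, first)
    labels = {id(palm): "palm", id(thumb): "thumb", id(little): "little"}
    inner = ["index", "middle", "ring"]
    if thumb is last:
        inner = list(reversed(inner))
    for name, region in zip(inner, fingers[1:-1]):
        labels[id(region)] = name
    return {labels[id(r)]: r for r in regions}

regions = [
    {"area": 40.0, "width": 0.0, "order": -1},   # palm
    {"area": 6.0, "width": 2.0, "order": 0},     # thick end region -> thumb
    {"area": 4.0, "width": 1.2, "order": 1},
    {"area": 4.2, "width": 1.3, "order": 2},
    {"area": 3.8, "width": 1.1, "order": 3},
    {"area": 2.5, "width": 0.8, "order": 4},     # thin end region -> little finger
]
print(sorted(identify_fingers(regions).keys()))
```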
- The key setting unit 52 sets a key at the tip position of each finger.
- In the example shown in FIG. 10, an up key 302a is set at the tip of the middle finger, a left key 302b at the tip of the index finger, a right key 302c at the tip of the ring finger, and a down key 302d at the tip of the thumb.
- The key setting unit 52 delivers the position information of the direction keys 302 set at the fingertips to the application processing unit 54, which executes the application, as key setting information.
- The application processing unit 54 monitors input from the user in accordance with the key setting information.
- For example, an input to the application is made when the user lifts a finger off the terminal device 100.
- When the user lifts the middle finger off the terminal device 100, contact with the area where the up key 302a is set is lost, and the detection value of the touch sensor 120 at that position changes.
- When, while receiving the detection value data, the detection value of the touch sensor 120 in the area where the up key 302a is set changes from the on value indicating contact to the off value indicating non-contact, the application processing unit 54 detects that the up key 302a has been input and reflects this in the processing of the application.
- An input to the application may also be made when the user touches the terminal device 100 again after lifting the finger. For example, when the user lifts the middle finger off the terminal device 100, contact with the area where the up key 302a is set is lost, and the detection value of the touch sensor 120 at that position changes to the off value indicating non-contact. If the detection value then returns to the on value indicating contact within a predetermined time after the change, the application processing unit 54 may detect that the up key 302a has been input and reflect this in the processing of the application.
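- A small sketch of the key-input detection logic described above, covering both variants (input on release, and input on release followed by re-contact within a predetermined time); the timing constant and class structure are illustrative assumptions.

```python
# Sketch: detecting a key input from changes in the detection value of the
# touch sensors in a key's set region.

RETOUCH_WINDOW = 0.3   # seconds allowed between release and re-contact

class KeyMonitor:
    def __init__(self, require_retouch=False):
        self.require_retouch = require_retouch
        self.was_on = True          # the finger initially rests on the key region
        self.release_time = None

    def update(self, contact_on, now):
        """Return True when a key input is detected on this update."""
        fired = False
        if self.was_on and not contact_on:
            # ON -> OFF: the finger left the key region.
            if self.require_retouch:
                self.release_time = now
            else:
                fired = True
        elif (not self.was_on and contact_on and self.release_time is not None
              and now - self.release_time <= RETOUCH_WINDOW):
            # OFF -> ON again within the window: treat as a key press.
            fired = True
            self.release_time = None
        self.was_on = contact_on
        return fired

up_key = KeyMonitor(require_retouch=True)
for contact, t in [(True, 0.0), (False, 0.1), (True, 0.25)]:
    if up_key.update(contact, t):
        print(f"up key input detected at t={t}")
```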
- Furthermore, the application processing unit 54 may monitor contact and non-contact at the position opposite to the position where the re-contact occurred, that is, the position reached by passing from the re-contact position through the center of the sphere of the terminal device 100, and may detect that the up key 302a has been input when, simultaneously with the detection of re-contact, it detects that a larger pressure is applied to that opposite position. An increase in pressure is detected, for example, by an increase in the number of touch sensors 120 that output on values.
- The contact detection unit 122 may also be configured with pressure sensors.
- When it is configured with pressure sensors, an input to the application may be made when the user strongly presses the surface of the terminal device 100 with a finger.
- In this case, the application processing unit 54 monitors the detection values of the pressure sensors in the areas where the direction keys 302 are set in accordance with the key setting information, and when a detection value becomes larger than a predetermined value, it detects an input of the corresponding key and reflects it in the processing of the application.
- As described above, the key setting unit 52 dynamically sets the keys in accordance with the positions of the fingers when the user grips the terminal device 100, so that the user can use the terminal device 100 as an input device while holding it in a natural manner. Also, the type of key to be assigned can be changed dynamically according to the application to be executed, so that an unprecedented input device can be realized. In the example shown in FIG. 10, the up/down/left/right direction keys 302 are assigned to the tips of four fingers; for example, in an application in which a character performs a jump operation, a crouch operation, a forward operation, and a backward operation, operation keys designating these operations may be assigned to the tip regions of the respective fingers.
- The key setting unit 52 also performs a process of changing the key setting information for a finger. That is, when contact with the area specified by the original key setting information is lost and contact with a new area continues to be detected for a predetermined time, the key setting unit 52 regenerates the key setting information by changing the original area to the new area.
- Alternatively, the key setting unit 52 may execute the finger specifying process again to generate the key setting information.
- When the key setting unit 52 assigns keys, it holds the relative positional relationship of the assigned keys.
- That is, the key setting unit 52 may hold the key setting information as a relative positional relationship.
- The key setting unit 52 displays the images of the assigned keys on the display unit 112 based on the held positional relationship. In the example of FIG. 10, arrow marks may be displayed. Since the positional relationship held by the key setting unit 52 matches the size of the user's hand, the user can omit the key setting process by placing the fingers on the displayed key images.
- In this case, the key setting unit 52 assigns different areas each time keys are assigned, based on the held positional relationship. Since the user operates the keys while holding the terminal device 100, if the keys were always assigned to the same areas, the display panels 110, the touch sensors 120, and the like in those portions might deteriorate quickly. For this reason, when keys are assigned, the orientation of the assignment is set randomly with respect to the reference posture, so that deterioration is not concentrated at specific locations. For example, when the relative positional relationship of the assigned keys is held with reference to the up key 302a assigned to the middle finger, the assigned areas can be made different each time by randomly determining the area of the up key 302a on each assignment. When the key setting unit 52 holds the previous key setting information, the area of the up key 302a may be set arbitrarily, and the areas of the other direction keys may be obtained from the previous key setting information using that area as a reference.
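- The wear-spreading key assignment described above might look like the following sketch, where key positions are simplified to angles around the sphere and the reference region for the up key is chosen at random on each assignment; the representation is an illustrative assumption.

```python
# Sketch: re-assigning keys each time while preserving the stored relative
# positional relationship, so the same sensor/panel region is not always used.
import random

def assign_keys(relative_offsets, sphere_positions=360):
    """`relative_offsets` maps a key name to its stored offset (in degrees)
    from the reference key (here the up key). The reference region is chosen
    at random on every assignment to spread wear across the surface."""
    reference = random.randrange(sphere_positions)        # random up-key region
    return {key: (reference + offset) % sphere_positions
            for key, offset in relative_offsets.items()}

# Offsets captured from a previous key setting (hypothetical values).
relative_offsets = {"up": 0, "left": -30, "right": 30, "down": 180}
print(assign_keys(relative_offsets))   # a new layout with the same relative shape
```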
- When the terminal device 100 is used as a game controller, the terminal device 100 is assigned a controller number by the information processing device 10.
- At this time, the application processing unit 54 causes the display unit 112 to display an image indicating the assigned controller number for a predetermined period, so that the user can know the assigned controller number. Thereafter, the key assignment process described above may be performed.
- The application processing unit 54 generates a display image of an object and outputs it to the output device 20.
- FIG. 11(a) shows a character displayed on the screen of the output device 20. The user brings the terminal device 100 into contact with the area of the screen where the character is displayed.
- FIG. 11(b) shows a state in which the terminal device 100 is pressed against the character displayed on the screen.
- As described above, the position specifying unit 44 specifies the position on the screen of the output device 20 when the terminal device 100 contacts the screen of the output device 20.
- The application processing unit 54 knows the display position of the character and determines whether the character display position matches the position specified by the position specifying unit 44. If they match, the application processing unit 54 generates character image data for the terminal device 100 so as to produce an effect in which the character moves from the output device 20 to the terminal device 100.
- As described above, the control unit 50 holds a table in which the identification number of each touch sensor 120 in the terminal device 100 is associated with the position of that touch sensor 120 in the terminal device 100.
- The control unit 50 also holds a table in which the identification number of each display panel 110 in the terminal device 100 is associated with the position of that display panel 110 in the terminal device 100.
- The control unit 50 further holds a table for specifying the positional relationship between the touch sensors 120 and the display panels 110.
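- The tables described above might be represented as simple lookup structures like the following sketch, which maps identification numbers to positions and relates each touch sensor to its nearest display panel; the unit-vector positions and identifiers are illustrative values.

```python
# Sketch: lookup tables of the kind held by the control unit 50, mapping
# touch-sensor and display-panel identification numbers to positions on the
# terminal device and relating the two to each other.

# Identification number -> position on the (unit) sphere surface.
touch_sensor_positions = {0: (0.0, 0.0, 1.0), 1: (1.0, 0.0, 0.0), 2: (0.0, 1.0, 0.0)}
display_panel_positions = {"110a": (0.0, 0.1, 0.99), "110b": (0.99, 0.1, 0.0)}

def nearest_panel(sensor_id):
    """Display panel whose position is closest to the given touch sensor."""
    sx, sy, sz = touch_sensor_positions[sensor_id]
    def dist2(panel_id):
        px, py, pz = display_panel_positions[panel_id]
        return (sx - px) ** 2 + (sy - py) ** 2 + (sz - pz) ** 2
    return min(display_panel_positions, key=dist2)

print(nearest_panel(0))   # -> "110a", the panel associated with sensor 0
```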
- The application processing unit 54 identifies, from the detection values of the touch sensors 120, the touch sensors 120 that have output detection values indicating contact with the screen of the output device 20.
- Specifically, the application processing unit 54 specifies the touch sensors 120 that newly detect contact, in addition to those already detecting contact in the state in which the terminal device 100 is held by the user's fingers. This specification may also be performed by the position specifying unit 44, in which case the application processing unit 54 acquires the identification numbers of the touch sensors 120 that have touched the output device 20 from the position specifying unit 44.
- The application processing unit 54 uses the table specifying the positional relationship between the touch sensors 120 and the display panels 110 to specify the display panel 110 located opposite to the contact position.
- In FIG. 11(b), this display panel 110 is at the center position of the terminal device 100 as seen in the figure. The application processing unit 54 specifies the display panel 110 located at the center of the terminal device 100 in FIG. 11(b), and generates the character image data for the terminal device 100 so that the center of the character image comes to that display panel 110.
- On the display unit 112 of the terminal device 100, the character image is divided and displayed across the plurality of display panels 110.
- Therefore, the application processing unit 54 generates the image data of the entire character in association with the identification numbers of the display panels 110 and the image data to be displayed on each display panel 110.
- The transmission unit 32 transmits the generated image data to the terminal device 100, and the control unit 200 generates the image to be displayed on each display panel 110 based on the identification numbers of the specified display panels 110 and the image data.
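- As a simplified illustration of dividing the character image across display panels, the sketch below treats the relevant panels as a flat grid of equally sized tiles and crops the image so that its center lands on a chosen panel; the flat-grid approximation and the panel resolution are assumptions made for illustration, not details from this publication.

```python
# Sketch: dividing a character image across several display panels so that its
# center falls on a chosen panel, keyed by panel identification number.

PANEL_W, PANEL_H = 64, 64   # hypothetical panel resolution

def split_image(image, image_w, image_h, panel_grid, centre_panel):
    """`image[y][x]` holds pixel values; `panel_grid` maps a panel id to its
    (column, row) in the local neighbourhood; `centre_panel` is the panel the
    image center should land on. Returns panel id -> cropped pixel rows."""
    c_col, c_row = panel_grid[centre_panel]
    pieces = {}
    for panel_id, (col, row) in panel_grid.items():
        # Top-left of this panel in image coordinates, with the image centered
        # on the chosen panel.
        ox = image_w // 2 - PANEL_W // 2 + (col - c_col) * PANEL_W
        oy = image_h // 2 - PANEL_H // 2 + (row - c_row) * PANEL_H
        pieces[panel_id] = [
            [image[y][x] if 0 <= x < image_w and 0 <= y < image_h else 0
             for x in range(ox, ox + PANEL_W)]
            for y in range(oy, oy + PANEL_H)
        ]
    return pieces

image = [[1] * 128 for _ in range(128)]                 # a 128x128 character image
grid = {"110a": (0, 0), "110b": (1, 0), "110c": (0, 1), "110d": (1, 1)}
pieces = split_image(image, 128, 128, grid, centre_panel="110a")
print({pid: (len(p[0]), len(p)) for pid, p in pieces.items()})  # 64x64 per panel
```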
- FIG. 11(c) shows a state in which the character is displayed on the terminal device 100.
- At this time, the output device 20 ends the display of the character.
- When the terminal device 100 is pressed against the screen of the output device 20 again while the character is displayed as in FIG. 11(c), the application processing unit 54 may return the character to the position on the screen of the output device 20 against which the terminal device was pressed.
- The position on the screen of the output device 20 against which the terminal device 100 is pressed is specified by the position specifying unit 44.
- In this way, an application can be realized in which the user moves the character to the terminal device 100 and then returns it to the output device 20. Since the application processing unit 54 can stop displaying the character on the terminal device 100 and display it at an arbitrary position on the screen of the output device 20, it is possible, for example, to create a more interactive game in which the character recovers its strength while it is on the terminal device 100 and is then returned to the game screen.
- FIGS. 11(a) to 11(c) show an example in which the character display is switched from the screen of the output device 20 to the display unit 112 of the terminal device 100.
- Instead, a display object corresponding to the contact position on the screen may be displayed on the display unit 112; for example, a help screen related to the content displayed at the contact position on the output device 20 may be displayed on the display unit 112.
- As described above, the position specifying unit 44 can derive the distance between the terminal device 100 and the screen of the output device 20.
- Before the terminal device 100 is pressed against the screen of the output device 20 as in FIG. 11(b), the position specifying unit 44 derives the distance between the terminal device 100 and the screen of the output device 20 and transmits it to the application processing unit 54. When the terminal device 100 comes closer to the screen than a predetermined distance, the application processing unit 54 may notify the user of the character that is to be displayed on the terminal device 100.
- This notification may be performed, for example, by generating an image in which the character is being drawn into the display unit 112 of the terminal device 100, or by generating an image on the screen of the output device 20 in which the character is being drawn toward the terminal device 100. By such notification, the user can recognize the character to be displayed on the display unit 112 of the terminal device 100, and the complete character may then be displayed on the display unit 112 when the terminal device 100 is actually pressed against the screen.
- FIG. 12(a) shows a state in which the terminal device 100 is pinched by the user's fingertips.
- The application processing unit 54 specifies, from the detection values of the touch sensors 120, the state in which the terminal device 100 is held; specifically, it specifies whether the terminal device 100 is held by the right hand or the left hand, and estimates the position of each finger.
- The application processing unit 54 determines whether the holding hand is the right hand or the left hand from the detection values of the touch sensors 120.
- FIG. 12B is an explanatory diagram showing a contact state detected by the touch sensor 120 when the terminal device 100 is sandwiched between five fingertips.
- five contact regions 304a to 304e are detected.
- the contact region 304e, drawn with a dotted line, is located on the lower surface side of the terminal device 100 in the state shown in FIG. 12A.
- the application processing unit 54 can determine whether the holding hand is the right hand or the left hand by detecting the smallest contact region 304d.
- Specifically, the application processing unit 54 determines that the finger in the contact region 304d is the little finger, the finger in the contact region 304c is the ring finger, the finger in the contact region 304b is the middle finger, and the finger in the contact region 304a is the index finger. It can also determine that the contact region 304e corresponds to the thumb.
- Alternatively, the application processing unit 54 may identify the finger of each contact region by first identifying the contact region of the thumb. When the terminal device 100 is sandwiched between five fingers, the contact area of the thumb is the largest, so the application processing unit 54 determines that the finger in the contact region 304e is the thumb. Once the thumb is identified, the finger in the contact region 304a can be determined to be the index finger, since the index finger lies closest to the thumb, and the contact regions of the middle finger, the ring finger, and the little finger can then be determined in order.
- The application processing unit 54 may also specify the thumb and the little finger from the area of each contact region 304, using the fact that the contact area of the thumb is the largest and that of the little finger is the smallest.
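- The following sketch illustrates the area-based assignment described above (largest region = thumb, region nearest the thumb = index finger, remaining regions ordered from the index finger); the ContactRegion structure, coordinates, and area values are assumed purely for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class ContactRegion:
    name: str       # e.g. "304a"
    center: tuple   # (x, y) centroid on the device surface, arbitrary units
    area: float     # contact area reported by the touch sensor

def dist(a: ContactRegion, b: ContactRegion) -> float:
    return math.hypot(a.center[0] - b.center[0], a.center[1] - b.center[1])

def assign_fingers_by_area(regions: list[ContactRegion]) -> dict[str, str]:
    thumb = max(regions, key=lambda r: r.area)          # thumb has the largest contact area
    others = [r for r in regions if r is not thumb]
    index = min(others, key=lambda r: dist(r, thumb))   # index finger is closest to the thumb
    rest = sorted((r for r in others if r is not index),
                  key=lambda r: dist(r, index))         # middle, ring, little in order
    result = {thumb.name: "thumb", index.name: "index"}
    result.update({r.name: f for r, f in zip(rest, ["middle", "ring", "little"])})
    return result

# Example loosely mirroring FIG. 12B
regions = [ContactRegion("304a", (0.0, 0.0), 1.6), ContactRegion("304b", (1.0, 0.2), 1.5),
           ContactRegion("304c", (2.0, 0.3), 1.3), ContactRegion("304d", (3.0, 0.3), 0.9),
           ContactRegion("304e", (0.5, -4.0), 2.4)]
print(assign_fingers_by_area(regions))
```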
- The application processing unit 54 may also identify fingers from the spacing between the contact regions 304.
- When the terminal device 100 is held, the thumb and the four other fingers tend to be arranged as shown in FIG. 12B: the intervals between the four fingers other than the thumb are relatively narrow, while the thumb is placed relatively far from the other fingers.
- For each contact region 304, the application processing unit 54 calculates the intervals to the other four contact regions 304 and takes the smallest of the four calculated intervals. Through this process, a minimum interval is derived for each of the five fingers.
- the application processing unit 54 compares the minimum intervals in the contact areas 304 with each other.
- The ratio of the largest minimum interval to the second largest minimum interval is then derived. When (largest minimum interval) / (second largest minimum interval) is 2 or more, it is determined that the terminal device 100 is relatively large with respect to the size of the hand, and the following finger identification process is executed.
- The application processing unit 54 identifies the contact region 304 having the largest minimum interval as the contact region of the thumb; in the case of FIG. 12B, the contact region 304e is identified as the thumb. Once the thumb is identified, the index finger is identified next. When the terminal device 100 is relatively large with respect to the size of the hand, the distance between the thumb and the index finger is narrower than the distance between the thumb and the little finger, so the application processing unit 54 identifies the contact region 304a closest to the contact region 304e as the contact region of the index finger. Thereafter, the application processing unit 54 identifies the contact regions of the middle finger, the ring finger, and the little finger.
- Conversely, when the terminal device 100 is relatively small with respect to the size of the hand, that is, when (largest minimum interval) / (second largest minimum interval) is smaller than 2, the user holds the terminal device 100 with the fingertips rather than the pads of the fingers. In this case, the finger identification process using the area of each contact region 304, as described above, is effective.
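- A standalone sketch of the minimum-interval test is shown below, under the assumption that each contact region is summarized by a named centroid; the example coordinates loosely mirror FIG. 12B and are not taken from the embodiment.

```python
import math

def minimum_intervals(centers: dict[str, tuple[float, float]]) -> dict[str, float]:
    """centers: contact-region name -> (x, y) centroid. Returns the nearest-neighbour distance per region."""
    out = {}
    for name, (x, y) in centers.items():
        out[name] = min(math.hypot(x - ox, y - oy)
                        for oname, (ox, oy) in centers.items() if oname != name)
    return out

def choose_identification(centers: dict[str, tuple[float, float]]) -> str:
    ranked = sorted(minimum_intervals(centers).items(), key=lambda kv: kv[1], reverse=True)
    largest_name, largest = ranked[0]
    second = ranked[1][1]
    if largest / second >= 2.0:
        # device large relative to the hand: the most isolated region is the thumb
        return f"{largest_name} is the thumb (interval method)"
    # device small relative to the hand: fall back to the area-based method above
    return "use the area-based identification"

# Example: 304e (thumb) sits far from the tightly spaced 304a-304d
centers = {"304a": (0.0, 0.0), "304b": (1.0, 0.2), "304c": (2.0, 0.3),
           "304d": (3.0, 0.3), "304e": (0.5, -4.0)}
print(choose_identification(centers))
```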
- the application processing unit 54 estimates the palm position.
- The lengthwise ends of the contact regions 304a to 304d are connected by virtual lines that do not intersect each other (312a and 312b in the figure), and the direction of the palm is estimated from the degree to which these lines bend.
- Depending on the direction in which the virtual lines curve, the palm is determined to be positioned either on the direction 306 side or on the direction 308 side.
- In this way, the application processing unit 54 determines that the terminal device 100 is pinched with the fingers as shown in FIG. 12A, decides to display the character in the display area 310, and displays the character there.
- The application processing unit 54 monitors the arrangement of the contact regions 304 and sets the display area 310 in real time. Thereby, even when the user changes the position at which the terminal device 100 is held, the display area 310 can be set at an appropriate position and the character can be displayed there.
- The application processing unit 54 does not set the display area 310 within the contact regions 304, nor within the region lying in the direction in which the palm is estimated to exist. In this way, the application processing unit 54 avoids placing the display area 310 at a position the user cannot see: by identifying the finger of each contact region 304 and the direction in which the palm exists even when the user changes how the terminal device 100 is held, the display area 310 can be dynamically set at an appropriate position.
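- One possible (purely illustrative) way to pick a display area 310 that avoids both the contact regions and the estimated palm side is sketched below; the flattened 2-D coordinates, the clearance value, and the palm-side convention are all assumptions.

```python
import math

def choose_display_area(candidates, contact_centers, palm_direction_x, min_clearance=1.5):
    """candidates: (x, y) centres of possible display areas on a flattened surface map.
    contact_centers: (x, y) centroids of the contact regions 304.
    palm_direction_x: sign of the x direction in which the palm is estimated to lie."""
    def clear_of_fingers(c):
        return all(math.hypot(c[0] - p[0], c[1] - p[1]) >= min_clearance
                   for p in contact_centers)
    def away_from_palm(c):
        return (c[0] < 0) if palm_direction_x > 0 else (c[0] > 0)
    for c in candidates:
        if clear_of_fingers(c) and away_from_palm(c):
            return c
    return None   # no position visible to the user is available

# Example: contacts on the +x side, palm also estimated on the +x side
print(choose_display_area(candidates=[(2.0, 0.0), (-2.0, 0.0)],
                          contact_centers=[(2.2, 0.1), (2.5, 1.0)],
                          palm_direction_x=+1))
```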
- the application processing unit 54 detects a “tracing operation” from the detection value of each touch sensor 120.
- the “tracing operation” is detected when the contact area on the surface of the terminal device 100 moves in one direction.
- When the application processing unit 54 determines from the detection values of the touch sensors 120 that a contact region is moving, it generates image data that moves the character image in that direction, and the transmission unit 32 transmits the image data.
- the control unit 200 displays the character image on the display unit 112 based on the image data. Thereby, a character that moves in the traced direction is displayed on the display unit 112.
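- The sketch below illustrates one way a tracing operation could be detected from successive centroids of a contact region and turned into a character movement; the thresholds, speed, and trail representation are assumptions made for the example.

```python
import math

def tracing_direction(trail, min_travel=2.0):
    """trail: chronological list of (x, y) centroids of one contact region."""
    dx = trail[-1][0] - trail[0][0]
    dy = trail[-1][1] - trail[0][1]
    length = math.hypot(dx, dy)
    if length < min_travel:
        return None                      # too little movement: not a tracing operation
    return (dx / length, dy / length)    # unit vector of the traced direction

def move_character(position, trail, speed=3.0):
    direction = tracing_direction(trail)
    if direction is None:
        return position
    return (position[0] + speed * direction[0], position[1] + speed * direction[1])

# Example: a contact region drifting to the right moves the character to the right
trail = [(0.0, 0.0), (1.0, 0.1), (2.1, 0.2), (3.0, 0.2)]
print(move_character((5.0, 5.0), trail))
```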
- For example, when a help screen is displayed on the display unit 112 and the user traces the display panel 110 on which the help screen is shown, the next page of the help screen may be displayed. Further, when the user twists the terminal device 100 by a predetermined angle in the horizontal plane, the page may be turned. The twist angle of the terminal device 100 is derived from the detection value of the angular velocity sensor 164.
- As another example, the application processing unit 54 generates image data of the earth. Since the application processing unit 54 knows the positional relationship between each display panel 110 and each touch sensor 120, once the touch sensor 120 that outputs a detection value indicating contact is identified, the latitude and longitude on the virtual earth at the position of that touch sensor 120 can be specified. The application processing unit 54 thus specifies the latitude and longitude designated by the user by identifying the touch sensor 120 that has output a detection value indicating that a new contact has occurred. When the application processing unit 54 generates information related to the region located at that latitude and longitude and the transmission unit 32 transmits the information to the terminal device 100, the control unit 200 displays the information on the display unit 112.
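- As a hedged illustration of this touch-to-latitude/longitude mapping, the following sketch assumes the touch sensors can be indexed as a grid of rows (pole to pole) and columns (around the equator); the grid layout is an assumption, not the embodiment's actual sensor arrangement.

```python
def sensor_to_lat_lon(row: int, col: int, n_rows: int, n_cols: int) -> tuple[float, float]:
    """Map a touch-sensor grid index to latitude/longitude on the displayed virtual earth."""
    lat = 90.0 - 180.0 * (row + 0.5) / n_rows     # +90 at the north pole, -90 at the south
    lon = -180.0 + 360.0 * (col + 0.5) / n_cols   # -180..+180 around the globe
    return lat, lon

# Example: a newly contacted sensor near the middle of an assumed 32x64 grid
print(sensor_to_lat_lon(row=16, col=40, n_rows=32, n_cols=64))
```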
- When the user rotates the terminal device 100, the displayed earth rotates in accordance with that rotation.
- When the rotation stops, image processing is performed so that the earth continues to rotate by inertia, like a globe, in accordance with the preceding rotation.
- the application processing unit 54 derives the rotation speed of the terminal device 100 from the detection value of the angular velocity sensor 164.
- the rotation speed is derived at an arbitrary timing during the rotation.
- the application processing unit 54 holds the derived rotation speed.
- The application processing unit 54 controls the displayed earth image so that it continues rotating in the previous rotation direction while the rotation speed gradually decreases.
- That is, based on the rotation speed derived while the terminal device 100 was being rotated, the application processing unit 54 gradually reduces the rotation speed of the displayed earth image, as if a globe were spinning with inertia. The earth image may also continue to rotate by inertia when the user removes his or her hand from the terminal device 100 and places it on a desk.
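- The inertial behaviour can be pictured with the following sketch, in which the last measured rotation speed decays frame by frame until the earth image stops; the decay factor, frame rate, and stopping threshold are illustrative assumptions.

```python
def spin_down(initial_speed_deg_per_s, decay=0.98, dt=1.0 / 60.0, stop_below=0.5):
    """Yield the rotation angle of the earth image each frame until it comes to rest."""
    angle, speed = 0.0, initial_speed_deg_per_s
    while abs(speed) > stop_below:
        angle = (angle + speed * dt) % 360.0   # advance the earth image this frame
        speed *= decay                          # friction-like decay of the rotation speed
        yield angle

# Example: drive the earth image from the last rotation speed measured by the angular velocity sensor
frames = list(spin_down(initial_speed_deg_per_s=120.0))
print(len(frames), "frames until the globe stops")
```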
- Alternatively, the application processing unit 54 may derive the rotation speed and control the displayed image to rotate in the direction opposite to the rotation of the terminal device 100, so that the same image is seen from the same direction regardless of how the terminal device 100 is rotated.
- thumbnails of contents such as music files and movie files are arranged on the display unit 112.
- the content is held in the information processing apparatus 10, and when the user selects a thumbnail, the content is played on the output device 20.
- For example, the application processing unit 54 divides the longitude direction (vertical direction) by genre and the latitude direction (horizontal direction) by age, allowing the user to select content intuitively.
- The thumbnails may be displayed, for example, when the user shakes the terminal device 100.
- When the application processing unit 54 detects from the detection value of the acceleration sensor 162 that the terminal device 100 has been shaken, the thumbnail images of the contents are displayed.
- Such information presentation can also be used to display, for example, the access ranking of articles on the Internet.
- In that case, the longitude direction is divided by genre and the latitude direction by rank, so that the user can grasp the ranking intuitively.
- The application processing unit 54 detects the tapped position from the detection value of the touch sensor 120 and identifies the thumbnail displayed at that position; when it then detects that the terminal device 100 has been shaken, it reads the corresponding content from the storage device and plays it.
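- The sketch below illustrates, with assumed genres, decades, and handler names, how thumbnails could be laid out by genre (longitude direction) and age (latitude direction), and how a tap followed by a shake could trigger playback on the output device.

```python
GENRES = ["rock", "jazz", "pop", "classical"]   # divides the longitude direction
DECADES = [1970, 1980, 1990, 2000, 2010]        # divides the latitude direction

def thumbnail_position(genre: str, decade: int) -> tuple[float, float]:
    """Place a thumbnail at a latitude/longitude cell on the spherical display."""
    lon = -180.0 + 360.0 * (GENRES.index(genre) + 0.5) / len(GENRES)
    lat = 90.0 - 180.0 * (DECADES.index(decade) + 0.5) / len(DECADES)
    return lat, lon

selected = {"content": None}

def on_tap(content_id: str) -> None:
    # the tapped position has already been resolved to the thumbnail shown there
    selected["content"] = content_id

def on_shake() -> None:
    # shake detected from the acceleration sensor: play the remembered selection
    if selected["content"]:
        print("playing", selected["content"], "on the output device")

print(thumbnail_position("jazz", 1990))
on_tap("movie_042"); on_shake()
```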
- an image different from the game image displayed on the screen of the output device 20 is displayed on the display unit 112 of the terminal device 100 during the execution of the game.
- For example, while the game image on the output device 20 is a video from the character's viewpoint, an image from another viewpoint is displayed on the display unit 112.
- The application processing unit 54 detects the movement of the terminal device 100 and, taking the character viewpoint as a reference, displays on the display unit 112 an image from a viewpoint above the character when the terminal device 100 is lifted upward, and an image from a viewpoint to the left of the character when the terminal device 100 is moved to the left.
- Various images can be displayed on the display unit 112 depending on the game, and the user may acquire information by moving the terminal device 100, for example to search for items or to look down on the game space.
- FIG. 13A shows a game image displayed on the screen of the output device 20.
- This game image is a video from the viewpoint of the character.
- The application processing unit 54 places a virtual camera behind the character in the three-dimensional game space, generates a game image based on operation data from the user, and outputs the game image from the output device 20.
- the application processing unit 54 generates image data that matches the configuration of the display unit 112 of the terminal device 100, and the transmission unit 32 transmits the image data to the terminal device 100.
- the display unit 112 displays a game image. Thereby, the same game image as that of the output device 20 is also displayed on the display unit 112 of the terminal device 100.
- FIG. 13B shows a game image displayed on the terminal device 100 when the user moves the terminal device 100 to the left.
- the posture specifying unit 42 specifies the moving direction and moving amount of the terminal device 100 from the detection value of the motion sensor 160.
- When the application processing unit 54 receives the movement direction and movement amount of the terminal device 100 from the posture specifying unit 42, it converts them into the movement direction and movement amount of the virtual camera in the three-dimensional game space. Note that the character remains within the angle of view of the virtual camera.
- the application processing unit 54 generates a game image shot by the moved virtual camera, and generates image data that matches the display unit 112 of the terminal device 100.
- the transmission unit 32 transmits the image data to the terminal device 100, and the control unit 200 receives the image data in the terminal device 100 and causes the display unit 112 to display the image data.
- In this way, the user can see, on the terminal device 100, a game image different from the image displayed on the output device 20; in the example of FIG. 13B, the user can discover that a dog is behind the tree.
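- A minimal sketch of converting the terminal's measured movement into a movement of the virtual camera while keeping the character within the angle of view is shown below; the scale factor and the look-at step are assumptions for illustration, not values from the embodiment.

```python
import numpy as np

MOVE_SCALE = 5.0   # assumed game-space units per unit of real-world device movement

def move_virtual_camera(camera_pos, device_delta, character_pos):
    """camera_pos, character_pos: 3-D points; device_delta: measured terminal movement (dx, dy, dz)."""
    new_pos = np.asarray(camera_pos, float) + MOVE_SCALE * np.asarray(device_delta, float)
    look_dir = np.asarray(character_pos, float) - new_pos
    look_dir = look_dir / np.linalg.norm(look_dir)   # keep the character in the angle of view
    return new_pos, look_dir

# Example: the user moves the terminal device slightly to the left
pos, look = move_virtual_camera(camera_pos=[0.0, 2.0, -5.0],
                                device_delta=[-0.1, 0.0, 0.0],
                                character_pos=[0.0, 1.0, 0.0])
print(pos, look)
```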
- When the user presses the terminal device 100 against the screen of the output device 20, information that is not displayed in the game image, such as a hint for completing the game, may be displayed on the display unit 112.
- Alternatively, rather than a specific hint, the entire display unit 112 may light up to indicate to the user that this is a moment to pay attention to.
- The hints may also be presented in accordance with the progress of the game; for example, when an opportunity arrives, a hint or a light emission calling for attention may be displayed on the display unit 112. This presentation may also be performed by outputting sound from a speaker.
- a replay image of the game may be displayed on the display unit 112. When displaying the replay image, the viewpoint may be changed by changing the attitude of the terminal device 100, for example.
- The fact that a plurality of terminal devices 100 are brought into contact with each other may be reflected in the processing of the application. In a game, for example, processing that gives meaning to the contact between the terminal devices 100 may be performed, such as changing a character's characteristics or merging characters.
- Contact between the terminal devices 100 is determined from the normal vectors of the contact surfaces being opposite to each other.
- Each normal vector is determined from the attitude of the terminal device 100 and the position of the contact region.
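- The opposite-normal test can be sketched as follows, assuming each device reports the outward normal of its contact surface in a common coordinate frame; the angular tolerance is an assumption made for the example.

```python
import numpy as np

def surfaces_facing(normal_a, normal_b, tolerance_deg=15.0) -> bool:
    """True when the two outward contact-surface normals point in (nearly) opposite directions."""
    a = np.asarray(normal_a, float); a = a / np.linalg.norm(a)
    b = np.asarray(normal_b, float); b = b / np.linalg.norm(b)
    cos_angle = float(np.dot(a, -b))   # angle between one normal and the reverse of the other
    return cos_angle >= np.cos(np.radians(tolerance_deg))

print(surfaces_facing([0, 0, 1], [0, 0.05, -1]))   # True: normals roughly opposite
```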
- The terminal device 100 may be configured to start when it receives a start signal from the information processing apparatus 10 and to sleep when it receives an end signal. In this case, the terminal device 100 operates on the condition that the information processing apparatus 10 is operating; therefore, it may also be configured to sleep automatically when, for example, no signal can be received from the information processing apparatus 10 for a predetermined time. Note that the terminal device 100 may be configured to monitor the detection values of the contact detection unit 122 or the motion sensor 160 while in the sleep state and to start autonomously when it enters a predetermined state.
- The predetermined state may be, for example, a state in which almost the entire surface of the terminal device 100 is covered, or a state in which the terminal device 100 is moved at a very high speed.
- Conversely, the terminal device 100 may be configured to sleep when it enters a predetermined operation state during operation.
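- The start/sleep behaviour described above can be pictured as a small state machine; the timeout, the coverage and speed thresholds, and the class interface below are assumptions for illustration only.

```python
import time

SIGNAL_TIMEOUT_S = 30.0   # assumed "predetermined time" without a signal from the host

class TerminalPower:
    def __init__(self):
        self.state = "sleep"
        self.last_signal = time.monotonic()

    def on_signal(self, kind: str) -> None:
        """Start/end signals received from the information processing apparatus 10."""
        self.last_signal = time.monotonic()
        if kind == "start":
            self.state = "active"
        elif kind == "end":
            self.state = "sleep"

    def on_sensor(self, covered_ratio: float, speed: float) -> None:
        """Autonomous start while asleep, from the contact detection unit / motion sensor."""
        if self.state == "sleep" and (covered_ratio > 0.9 or speed > 5.0):
            self.state = "active"

    def tick(self) -> None:
        """Sleep automatically when no signal has been received for the predetermined time."""
        if self.state == "active" and time.monotonic() - self.last_signal > SIGNAL_TIMEOUT_S:
            self.state = "sleep"

power = TerminalPower()
power.on_sensor(covered_ratio=0.95, speed=0.0)   # almost the whole surface covered: wake up
print(power.state)
```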
- the application processing unit 54 may specify the orientation of the palm from the attitude of the terminal device 100. By specifying the orientation of the palm, it is possible to determine whether the terminal device 100 is gripped with the palm facing upward or with the palm facing downward.
- The contact region specifying process and the like described above are performed by the key setting unit 52; here, the application processing unit 54 performs the same processing.
- The posture specifying unit 42 identifies the current posture from the difference between the detection value of the motion sensor 160 received from the terminal device 100 and a reference value, and the application processing unit 54 specifies, from the current posture identified by the posture specifying unit 42 and the position of the touch sensor 120 that detects the contact of the palm, which direction the palm is facing with respect to the direction of gravity.
- The application processing unit 54 may use the palm direction as input data, reflecting it in the processing of the application.
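- As a hedged sketch, the palm orientation could be estimated by transforming the outward normal of the palm's contact region into world coordinates using the device attitude and comparing it with the gravity direction; the rotation-matrix representation and the thresholds below are assumptions for illustration.

```python
import numpy as np

def palm_direction(device_to_world: np.ndarray, contact_normal_device) -> str:
    """contact_normal_device: outward surface normal at the palm's contact region (device frame)."""
    normal_world = device_to_world @ np.asarray(contact_normal_device, float)
    palm_facing = -normal_world / np.linalg.norm(normal_world)  # the palm faces into the surface
    up = np.array([0.0, 0.0, 1.0])                              # opposite to gravity in world frame
    c = float(np.dot(palm_facing, up))
    if c > 0.5:
        return "palm facing up"
    if c < -0.5:
        return "palm facing down"
    return "palm facing sideways"

# Example: device upright, palm touching the bottom (-z) face of the device
print(palm_direction(np.eye(3), [0.0, 0.0, -1.0]))   # -> "palm facing up"
```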
- In the embodiment described above, the application processing unit 54 in the information processing apparatus 10 generates the image data to be displayed on the display unit 112; alternatively, the control unit 200 in the terminal device 100 may have the same function as the application processing unit 54 and generate the image data itself.
- DESCRIPTION OF REFERENCE NUMERALS: 1 information processing system, 10 information processing apparatus, 20 output device, 30 communication unit, 32 transmission unit, 34 reception unit, 40 terminal information processing unit, 42 posture specifying unit, 44 position specifying unit, 50 control unit, 52 key setting unit, 54 application processing unit, 100 terminal device, 110 display panel, 112 display unit, 120 touch sensor, 122 contact detection unit, 130 protective layer, 132 support member, 134 inner shell, 140 inner core, 142 projector, 150 communication unit, 152 transmission unit, 154 reception unit, 160 motion sensor, 162 acceleration sensor, 164 angular velocity sensor, 166 geomagnetic sensor, 200 control unit.
- the present invention can be applied to the field of user interface.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010-237899 | 2010-10-22 | ||
| JP2010237899A JP5769947B2 (ja) | 2010-10-22 | 2010-10-22 | 端末装置および情報処理システム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012053141A1 true WO2012053141A1 (ja) | 2012-04-26 |
Family
ID=45974873
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2011/004538 WO2012053141A1 (ja) | 2010-10-22 | 2011-08-10 | 端末装置および情報処理システム |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP5769947B2 (ja) |
| WO (1) | WO2012053141A1 (ja) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014026318A (ja) * | 2012-07-24 | 2014-02-06 | Ricoh Co Ltd | 電力管理装置、電力管理システム、電力管理方法およびプログラム |
| JP6417673B2 (ja) * | 2013-05-08 | 2018-11-07 | 株式会社デンソー | 車両用操作検出システム、車両用操作検出ユニット、及び車両用操作検出装置 |
| JP6715562B2 (ja) * | 2014-03-27 | 2020-07-01 | 任天堂株式会社 | 情報処理システム、情報処理プログラム、情報処理方法、情報処理端末 |
| JP6402348B2 (ja) * | 2015-03-30 | 2018-10-10 | 株式会社コナミデジタルエンタテインメント | ゲーム装置、ゲーム制御方法及びプログラム |
| JP2017116893A (ja) * | 2015-12-26 | 2017-06-29 | 株式会社村田製作所 | 立体型画像表示装置 |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2009094091A1 (en) * | 2008-01-25 | 2009-07-30 | Microsoft Corporation | Projection of graphical objects on interactive irregular displays |
| WO2010067537A1 (ja) * | 2008-12-08 | 2010-06-17 | シャープ株式会社 | 操作受付装置及びコンピュータプログラム |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001154592A (ja) * | 1999-09-13 | 2001-06-08 | Minolta Co Ltd | 表示装置 |
| US9218116B2 (en) * | 2008-07-25 | 2015-12-22 | Hrvoje Benko | Touch interaction with a curved display |
| JP2010244772A (ja) * | 2009-04-03 | 2010-10-28 | Sony Corp | 静電容量式タッチ部材及びその製造方法、並びに静電容量式タッチ検出装置 |
- 2010-10-22: JP application JP2010237899A (patent JP5769947B2), status: Active
- 2011-08-10: WO application PCT/JP2011/004538 (WO2012053141A1), status: Application Filing
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2013232044A (ja) * | 2012-04-27 | 2013-11-14 | Toshiba Corp | 電子機器、制御方法およびプログラム |
| US9001063B2 (en) | 2012-04-27 | 2015-04-07 | Kabushiki Kaisha Toshiba | Electronic apparatus, touch input control method, and storage medium |
| WO2014199154A1 (en) * | 2013-06-11 | 2014-12-18 | Sony Computer Entertainment Europe Limited | Head-mountable apparatus and systems |
| US10198866B2 (en) | 2013-06-11 | 2019-02-05 | Sony Interactive Entertainment Europe Limited | Head-mountable apparatus and systems |
| IT202200014668A1 (it) * | 2022-07-12 | 2022-10-12 | Pietro Battistoni | Metodo per l'interazione uomo-computer basato sul tatto ed interfacce utente tangibili. |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5769947B2 (ja) | 2015-08-26 |
| JP2012093800A (ja) | 2012-05-17 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11833988; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 11833988; Country of ref document: EP; Kind code of ref document: A1 |