WO2012053141A1 - Terminal device and information processing system - Google Patents

Terminal device and information processing system

Info

Publication number
WO2012053141A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
unit
image
contact
information processing
Prior art date
Application number
PCT/JP2011/004538
Other languages
French (fr)
Japanese (ja)
Inventor
清人 渋谷
賢次 松岡
明俊 山口
森田 章義
武志 巻島
久生 和田
友恵 落合
Original Assignee
Sony Computer Entertainment Inc. (株式会社ソニー・コンピュータエンタテインメント)
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc. (株式会社ソニー・コンピュータエンタテインメント)
Publication of WO2012053141A1 publication Critical patent/WO2012053141A1/en


Classifications

    • A63F13/214: Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/26: Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • G06F3/0346: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0412: Digitisers, e.g. for touch screens or touch pads, structurally integrated in a display
    • A63F13/235: Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console, using a wireless connection, e.g. infrared or piconet
    • A63F2300/1043: Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals, being characterized by constructional details
    • A63F2300/1075: Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals, being specially adapted to detect the point of contact of the player on a surface using a touch screen
    • A63F2300/301: Features of games using an electronically generated display having two or more dimensions, characterized by output arrangements for receiving control signals generated by the game device, using an additional display connected to the game console, e.g. on the controller

Definitions

  • the present invention relates to a terminal device that operates as an input device or an output device, and an information processing system using the terminal device.
  • a conventional game controller has an input interface such as a cross key and a button, and is connected to the game device by wire to transmit operation data by the user to the game device.
  • the user holds the left and right grips of the controller with both hands, and operates the cross keys and buttons with fingers.
  • controllers that can wirelessly transmit and receive data to and from game devices are becoming popular due to advances in wireless technology. Since the wireless controller does not require wiring with the game device, the user can play the game at a free position.
  • the wireless controller disclosed in Patent Document 1 transmits and receives data wirelessly, it does not change an input interface such as a cross key or a button. Therefore, there is no difference in operability between the conventional wired controller and the wireless controller disclosed in Patent Document 1.
  • By providing the controller with a new input interface, the range of user operations can be expanded, and the variety of game applications that take advantage of that operability can be increased.
  • the conventional controller is used as an input device that accepts an operation by the user.
  • By further providing the controller with a function as an output device that outputs image data received from the game device, the development of unprecedented game applications can be expected.
  • A new input interface likewise facilitates the development of unprecedented applications not only in the game field but also for processing devices in other fields.
  • an object of the present invention is to provide a terminal device that realizes new operability and can be used not only as an input device but also as an output device.
  • This terminal device may be used as an input device or an output device.
  • Another object of the present invention is to provide an information processing system using this terminal device.
  • A terminal device according to the present invention is a portable terminal device having a curved surface shape or a substantially curved surface shape, and includes: a detection unit that detects contact with the surface of the terminal device; a display unit that displays an image; a transmission unit that transmits a detection value from the detection unit; a reception unit that receives image data for generating an image; and a control unit that generates an image to be displayed on the display unit using the received image data.
  • An information processing system according to the present invention includes a terminal device and an information processing device. The terminal device includes: a detection unit that detects contact with the surface of the terminal device; a display unit that displays an image; a transmission unit that transmits a detection value from the detection unit to the information processing device; a reception unit that receives image data for generating an image from the information processing device; and a control unit that generates an image to be displayed on the display unit using the received image data. The information processing device includes: a receiving unit that receives the detection value from the terminal device; an application processing unit that generates image data based on the detection value; and a transmission unit that transmits the image data to the terminal device.
  • a terminal device that realizes new operability can be provided.
  • FIG. 1 is a diagram showing the use environment of an information processing system according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of the external configuration of the terminal device.
  • FIG. 3 is a diagram showing a cross section of the terminal device shown in FIG. 2.
  • FIG. 4 is a diagram showing the structure inside the inner shell.
  • FIG. 5 is a diagram showing another example of the external configuration of the terminal device.
  • FIG. 6 is a diagram showing a cross section of the terminal device shown in FIG. 5.
  • FIG. 7 is a diagram showing the functional blocks of the terminal device.
  • FIG. 8 is a diagram showing the functional blocks of the information processing device.
  • FIG. 9 is a diagram for explaining the process of specifying the relative position of the terminal device with respect to the output device.
  • FIG. 10 is a diagram showing a state in which the terminal device is held by the user.
  • FIG. 11A shows a character displayed on the screen of the output device, FIG. 11B shows the terminal device being pressed against the character displayed on the screen, and FIG. 11C shows the character displayed on the terminal device.
  • FIG. 12A shows the terminal device pinched between the user's fingers, and FIG. 12B is an explanatory diagram showing the contact state detected by the touch sensors when the terminal device is pinched with five fingers.
  • FIG. 13A shows a game image displayed on the screen of the output device, and FIG. 13B shows a game image displayed on the terminal device.
  • FIG. 1 shows a use environment of an information processing system 1 according to an embodiment of the present invention.
  • the information processing system 1 includes an information processing device 10, an output device 20, and a terminal device 100.
  • the output device 20 is a television, for example, and outputs an image and sound.
  • the information processing apparatus 10 is connected to the output device 20 and generates an image and sound to be output from the output device 20.
  • the information processing apparatus 10 and the output apparatus 20 may be connected by wire or may be connected wirelessly.
  • the terminal device 100 has a wireless communication function and can transmit and receive data by establishing a wireless link with the information processing device 10.
  • the information processing system 1 may be a game system that provides a user with an environment for playing a game.
  • the information processing device 10 is a game device that executes game software
  • the terminal device 100 is a controller device for a user to input to the game.
  • The information processing device 10 executes the game software based on operation information input by the user through the terminal device 100, and outputs image data and audio data representing the execution result to the output device 20.
  • the technique shown in the present embodiment can be used not only in a game but also in an information processing system including a processing device that executes other types of applications.
  • the portable terminal device 100 has a curved surface shape or a substantially curved surface shape.
  • the substantially curved surface means a surface approximated to a curved surface by arranging a plurality of planes side by side.
  • the terminal device 100 may have a substantially spherical shape as shown in FIG. 1 or may have an oval shape.
  • The terminal device 100 only needs to have a curved or substantially curved surface shape in a part thereof, and the entire shape does not have to be a curved surface or a substantially curved surface; however, it is preferable that at least half of the entire outer shape has a curved or substantially curved shape.
  • the center of gravity is preferably set so as to maintain the placed posture.
  • For example, when the terminal device 100 has a spherical shape, the center of gravity of the terminal device 100 is set to be positioned at the center of the sphere, so that the terminal device 100 can maintain its posture without rolling when it is placed.
  • When the terminal device 100 has a shape other than a sphere, for example an oval shape, it is preferable that the weight balance is set so that the center of gravity is located toward the bottom rather than the top, so that the posture is maintained when the bottom is placed on a flat surface.
  • A center-of-gravity adjusting mechanism may also dynamically set the center of gravity so that the posture is maintained when the device is placed.
  • the surface of the terminal device 100 constitutes an operation surface operated by the user. Therefore, the operation surface is preferably formed with a curved surface or a substantially curved surface in order to provide a smooth operation feeling. Further, as shown in FIG. 1, the terminal device 100 preferably has a size that can be held by a user with one hand. Thereby, the user can operate the operation surface while holding the terminal device 100 with one hand, and can also freely operate the operation surface with the other hand while holding the terminal device 100 with one hand.
  • the terminal device 100 operates as an input device that is operated by a user and transmits operation input data to the information processing device 10.
  • The terminal device 100 includes a contact sensor that detects contact with its surface. When the surface of the terminal device 100 comes into contact with any object, such as a user's finger or the display (screen) of the output device 20, the contact sensor detects the contact position. At this time, not only the position but also the pressure may be detected.
  • The contact sensor thus detects contact with the surface of the terminal device 100 when the user touches the surface while changing the contact position or strength, or when the terminal device 100 is pressed against the screen of the output device 20, and the terminal device 100 transmits the detection value to the information processing device 10.
  • the contact sensor is continuously provided on the curved surface portion of the terminal device 100, and the user can obtain a smooth operation feeling by moving a finger on the curved surface.
  • the contact sensor is preferably provided so as to be able to detect contact on the entire surface of the terminal device 100, whereby the user can perform an input operation using the entire surface of the terminal device 100.
  • the terminal device 100 transmits input data generated based on the contact operation from the user to the information processing device 10.
  • The contact sensor need only be provided on the operation surface.
  • The operation surface is preferably formed as a continuous surface of the terminal device 100 so that the user can operate it easily.
  • the terminal device 100 also has a motion sensor that generates a sensor value for detecting the posture.
  • the motion sensor has a triaxial acceleration sensor and a triaxial angular velocity sensor.
  • the motion sensor may further include a triaxial geomagnetic sensor.
  • the information processing apparatus 10 can process the attitude change of the terminal device 100 as input data to the application being executed. Therefore, the terminal device 100 periodically transmits the detection value of the motion sensor to the information processing device 10.
  • the terminal device 100 includes a display unit and operates as an output device that displays an image.
  • the terminal device 100 receives image data for generating an image from the information processing device 10, and displays the image on the display unit based on the image data.
  • the terminal device 100 may generate an image from information stored in its own storage device. For example, a game character or the like is displayed on the display unit.
  • a display object such as a game character is drawn based on detection values of a contact sensor, a motion sensor, or the like.
  • This display control may be executed by the information processing apparatus 10 or may be executed by a control unit mounted on the terminal device 100.
  • the display unit is provided on the curved surface portion of the terminal device 100.
  • the display unit is formed of an EL (Electroluminescence) panel that can be flexibly bent, and the display panel is attached to the curved surface portion.
  • the display unit is configured by combining a plurality of display panels, but may be configured by a single display panel.
  • the display unit may be configured by bonding a plurality of liquid crystal panels to each surface of a substantially curved polyhedron.
  • the substantially curved polyhedron refers to a polyhedron having 20 or more planes.
  • By continuously disposing the plurality of display panels so as to cover the entire surface of the terminal device 100 without gaps, the terminal device 100 can display an image from its entire surface.
  • the display object can be interlocked with the state change of the terminal device 100 detected by the contact sensor or the motion sensor.
  • the terminal device 100 operates as an input / output device that acquires an operation by the user as a sensor value and displays an image reflecting the sensor value. Since the terminal device 100 includes the sensor value acquisition function and the image display function, an intuitive operational feeling can be provided to the user.
  • the terminal device 100 has a touch panel function by overlapping the display unit and the sensor.
  • the display unit may be a screen in which a projector is arranged inside the terminal device 100 and the light from the projector is displayed so that the user can visually recognize it from the outside.
  • the interior of the terminal device 100 is formed hollow, and the display unit is formed of a transparent or translucent material that projects light from the projector to the outside.
  • a core on which a processor is mounted is provided in the center of the space within the terminal device 100.
  • a plurality of projectors are provided in the core.
  • FIG. 2 shows an example of the external configuration of the terminal device 100.
  • the terminal device 100 includes a display unit 112 that displays an image on the surface.
  • the display unit 112 is configured by combining a plurality of display panels 110a to 110v (hereinafter referred to as “display panel 110” if not distinguished).
  • A plurality of display panels 110 are also provided on the back surface of the terminal device 100 shown in FIG. 2.
  • the terminal device 100 can display the image from the entire surface by covering the entire outer surface with the plurality of display panels 110 without gaps.
  • the terminal device 100 includes a control unit that generates image data to be displayed on each display panel 110.
  • the display panel 110 is formed by an EL panel, a liquid crystal panel, or the like.
  • a panel that can have a curvature, such as an EL panel, is formed in a curved surface and is directly attached to the surface of a sphere.
  • For example, the plurality of display panels 110 may be arranged on the surfaces of a truncated icosahedron, in which 20 regular hexagons and 12 regular pentagons are combined as in a soccer ball.
  • the surface of the terminal device 100 is protected by a transparent material such as resin, and forms a curved surface shape or a substantially curved surface shape.
  • When the display panels 110 are liquid crystal panels attached to the surfaces of a truncated icosahedron, the surface of the terminal device 100 is covered with a resin so as to have a curved shape, so that the user can obtain a smooth operation feeling when moving a finger over the surface.
  • FIG. 3 shows a cross section of the terminal device 100 shown in FIG. 2.
  • An inner core 140 is disposed in the center of the inner space of the terminal device 100.
  • the internal core 140 is provided with a control unit that controls processing of the terminal device 100, a communication unit that communicates with an external device, a sensor for detecting the movement and posture of the terminal device 100, and the like.
  • A transparent protective layer 130 covering the entire terminal device 100 forms an outer shell, and a display unit 112 having a plurality of display panels 110 arranged without gaps on the inner surface of the protective layer 130 is provided inside the protective layer 130.
  • Inside the display unit 112, a contact detection unit 122 that detects contact with the surface of the terminal device 100 is provided.
  • the contact detection unit 122 includes a plurality of touch sensors 120.
  • the plurality of touch sensors 120 are preferably supported by an inner shell 134 that is a hollow sphere, and are provided so as to be able to detect contact on the entire surface of the terminal device 100.
  • the plurality of touch sensors 120 are arranged on the inner shell 134 without a gap.
  • the contact detection unit 122 may include another sensor, such as a pressure sensor.
  • the inner shell 134 is preferably formed of a flexible resin material.
  • the inner shell 134 is supported by a rod-shaped support member 132 extending from the inner core 140. Thereby, the inner side of the inner shell 134 is made into a space, and the terminal device 100 can be reduced in weight.
  • the support member 132 is preferably formed of an elastic material such as a spring so that the user can crush or deform the terminal device 100 to some extent. Accordingly, the user can recognize that the input operation has been performed because the surface of the spherical terminal device 100 is recessed, and an unprecedented input interface can be realized.
  • the internal core 140 and the display unit 112 are electrically connected by, for example, wiring provided inside the support member 132.
  • the wiring between the inner core 140 and the contact detection unit 122 may be similarly provided. Since the terminal device 100 shown in FIG. 3 has a hollow structure, wiring can be provided in this space.
  • FIG. 4 shows the inner structure of the inner shell 134.
  • six support members 132 are provided from the inner core 140 toward the inner shell 134, and are configured to contract in the length direction when pressed.
  • The number of support members 132 may be more than six; they need only be configured so that the spherical shape of the terminal device 100 is maintained, the pressed portion deforms when pressed by a user's finger or the like, and the original spherical shape is restored when the finger is released.
  • the terminal device 100 may have a solid structure filled with an elastic material.
  • the hollow structure is superior in terms of weight reduction, but there is an advantage that the shape can be stably maintained by adopting the solid structure.
  • FIG. 5 shows another example of the external configuration of the terminal device 100.
  • the terminal device 100 has a projector inside, and projects an image on the display unit 112 from the inside.
  • the display unit 112 is formed of a material that reflects light from the projector and allows an external user to see the projected image.
  • the terminal device 100 of this example needs to take a hollow structure.
  • FIG. 6 shows a cross section of the terminal device 100 shown in FIG. 5.
  • An inner core 140 is disposed in the center of the inner space of the terminal device 100.
  • the internal core 140 includes a control unit, a communication unit, various sensors, and the like.
  • a plurality of projectors 142 are provided on the surface of the inner core 140.
  • A display unit 112 serving as a transparent or translucent projection surface is provided on the inner side of the transparent protective layer 130. Further, as in the configuration shown in FIG. 3, a contact detection unit 122 supported by the inner shell 134 is formed inside the display unit 112.
  • the inner shell 134 and the contact detection unit 122 are formed of a transparent material, and image light passes through the inner shell 134 and the contact detection unit 122.
  • a plurality of projectors 142 are provided on the inner core 140, and image light reaches the entire display unit 112 by combining light output from the projectors 142.
  • FIG. 7 shows functional blocks of the terminal device 100.
  • the terminal device 100 includes a display unit 112 and a contact detection unit 122 on the surface thereof, and includes a control unit 200, a communication unit 150, and a motion sensor 160 in the internal core 140.
  • Each component in the terminal device 100 is supplied with power by a battery (not shown).
  • the battery is charged by wireless power feeding.
  • the communication unit 150 includes a transmission unit 152 and a reception unit 154, and transmits / receives data to / from the information processing apparatus 10 using a predetermined communication protocol such as IEEE 802.11 or IEEE 802.15.1.
  • the motion sensor 160 is a detection unit that detects data for detecting the movement and posture of the terminal device 100, and includes a triaxial acceleration sensor 162, a triaxial angular velocity sensor 164, and a triaxial geomagnetic sensor 166.
  • the control unit 200 receives each detection value from the contact detection unit 122 and the motion sensor 160 and supplies it to the communication unit 150.
  • the plurality of touch sensors 120 each supply a detection value to the control unit 200. At this time, each touch sensor 120 sends a detection value to the control unit 200 together with an identification number for identifying itself. As a result, the identification number and the detected value are associated with each other and sent to the control unit 200.
  • The control unit 200 may instead identify each touch sensor 120 by the port to which its detection value is input, and associate the identification number with the detection value. In this way, the control unit 200 associates the identification number of each touch sensor 120 with its detection value and causes the transmission unit 152 to transmit the associated data to the information processing device 10.
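  • As a rough illustration of this association and the periodic transmission, a minimal Python sketch is shown below. It is not the patent's implementation: the sensor count, the binary message layout, the 10 millisecond period (reused from the reception cycle mentioned below), and the hypothetical read_touch_sensors() and send() callbacks are all assumptions.

```python
import struct
import time

NUM_TOUCH_SENSORS = 32          # assumed sensor count; the patent does not fix a number
TRANSMISSION_PERIOD_S = 0.010   # 10 ms cycle, matching the cycle mentioned in the text

def read_touch_sensors():
    """Placeholder for reading all touch sensors; returns {identification number: detection value}."""
    return {sensor_id: 0 for sensor_id in range(NUM_TOUCH_SENSORS)}

def pack_detection_data(values):
    """Pack (identification number, detection value) pairs into one binary message."""
    payload = b"".join(struct.pack("<BB", sensor_id, value)
                       for sensor_id, value in sorted(values.items()))
    return struct.pack("<H", len(values)) + payload

def transmission_loop(send, stop):
    """Periodically transmit the associated IDs and values, like the transmission unit 152."""
    while not stop():
        send(pack_detection_data(read_touch_sensors()))
        time.sleep(TRANSMISSION_PERIOD_S)

if __name__ == "__main__":
    sent = []
    transmission_loop(send=sent.append, stop=lambda: len(sent) >= 3)
    print(f"sent {len(sent)} frames, {len(sent[0])} bytes each")
```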
  • the receiving unit 154 receives image data from the information processing apparatus 10 at a predetermined cycle.
  • the reception cycle is set to 10 milliseconds as with the transmission cycle.
  • When the control unit 200 receives the image data from the reception unit 154, it generates an image to be displayed on the display unit 112 and displays the image on each display panel 110.
  • Image drawing data to be displayed on the display panel 110 may be generated by the control unit 200 or may be generated by the information processing apparatus 10.
  • In the configuration of FIG. 6, in which an image is projected from the projectors 142, the control unit 200 provides the image data to the projectors 142.
  • The terminal device 100 may include not only the display unit 112 but also a speaker as an output device.
  • the receiving unit 154 receives audio data from the information processing apparatus 10, and the speaker outputs audio.
  • the audio data is compressed together with the image data and transmitted, and the control unit 200 outputs the drawing data to the display unit 112 and the audio data to the speaker.
  • the reception unit 154 may receive a vibrator drive signal from the information processing apparatus 10 and the control unit 200 may drive the vibrator.
  • FIG. 8 shows functional blocks of the information processing apparatus 10.
  • the information processing apparatus 10 includes a communication unit 30, a terminal information processing unit 40, and a control unit 50.
  • the communication unit 30 includes a transmission unit 32 and a reception unit 34 and transmits / receives data to / from the terminal device 100 using a predetermined communication protocol such as IEEE802.11 or IEEE802.15.1.
  • the terminal information processing unit 40 includes a posture specifying unit 42 and a position specifying unit 44.
  • The posture specifying unit 42 specifies the posture and movement of the terminal device 100.
  • The position specifying unit 44 specifies the position of the terminal device 100 in space. Specifically, the position specifying unit 44 specifies the position on the screen of the output device 20 when the terminal device 100 touches the display screen of the output device 20.
  • the control unit 50 includes a key setting unit 52 and an application processing unit 54.
  • the key setting unit 52 has a function of setting an operation key at an arbitrary position on the operation surface of the terminal device 100 when the terminal device 100 is used as a game controller.
  • the application processing unit 54 has a function of acquiring sensor information and terminal information from the terminal device 100 and the terminal information processing unit 40 and reflecting the sensor information and terminal information in application processing.
  • Each element described here as a functional block for performing various kinds of processing can be implemented in hardware by a CPU (Central Processing Unit), memory, and other LSIs, and in software by a program loaded into memory. It is therefore understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, by software only, or by a combination thereof, and are not limited to any one of these.
  • the receiving unit 34 receives the detection value of the motion sensor 160 from the terminal device 100 and passes it to the posture specifying unit 42.
  • the posture identifying unit 42 holds in advance the detection value of the triaxial acceleration sensor 162 in the terminal device 100 at rest as a reference value in the reference posture of the terminal device 100. If the terminal device 100 is stationary, the posture specifying unit 42 specifies the current posture from the difference between the received detection value and the reference value. When the terminal device 100 is moving, the posture specifying unit 42 specifies the current posture from the detection value of the triaxial acceleration sensor 162 and the detection value of the triaxial angular velocity sensor 164.
  • the posture specifying unit 42 also specifies the movement of the terminal device 100 from the detection value of the triaxial acceleration sensor 162 and the detection value of the triaxial angular velocity sensor 164.
  • the specified movement of the terminal device 100 includes the moving direction and the moving amount of the terminal device 100.
  • the posture specifying unit 42 may specify the posture and movement of the terminal device 100 in consideration of the detection value of the geomagnetic sensor 166.
  • Since the terminal device 100 shown in the embodiment has a spherically symmetric shape, the user cannot tell the current posture of the terminal device 100 even after rotating or tilting it. It is therefore important for the posture specifying unit 42 to grasp the posture of the terminal device 100 in real time using the detection values of the motion sensor 160 when executing an application described later.
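  • The posture determination described above (comparing the acceleration detection value with the stored reference value at rest, and using the angular velocity while moving) can be sketched roughly as follows. This is only an illustrative approximation with assumed data formats, not the actual processing of the posture specifying unit 42.

```python
import math

GRAVITY = 9.8  # m/s^2, used to judge whether the device is roughly at rest

def is_stationary(accel, tolerance=0.5):
    """Treat the device as stationary if the acceleration magnitude is close to gravity."""
    magnitude = math.sqrt(sum(a * a for a in accel))
    return abs(magnitude - GRAVITY) < tolerance

def tilt_from_reference(accel, reference):
    """Angle (radians) between the current and reference acceleration vectors (rest case)."""
    dot = sum(a * r for a, r in zip(accel, reference))
    norm = math.sqrt(sum(a * a for a in accel)) * math.sqrt(sum(r * r for r in reference))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def integrate_gyro(orientation, gyro, dt):
    """Very rough orientation update from 3-axis angular velocity (rad/s) while moving."""
    return tuple(angle + rate * dt for angle, rate in zip(orientation, gyro))

# Example: reference posture stored in advance (device resting with +Z up).
reference_accel = (0.0, 0.0, GRAVITY)
sample_accel = (0.0, math.sin(0.1) * GRAVITY, math.cos(0.1) * GRAVITY)
if is_stationary(sample_accel):
    print("tilt vs reference [rad]:", round(tilt_from_reference(sample_accel, reference_accel), 3))
orientation = integrate_gyro((0.0, 0.0, 0.0), gyro=(0.0, 0.2, 0.0), dt=0.01)
print("integrated orientation [rad]:", orientation)
```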
  • the position specifying unit 44 specifies the position of the terminal device 100.
  • the position specifying unit 44 only needs to be able to specify the relative positional relationship between the terminal device 100 and the output device 20.
  • In the present embodiment, it is only necessary that the position on the screen of the output device 20 can be specified when the terminal device 100 touches the screen of the output device 20. Further, it is preferable that the position specifying unit 44 can also specify the distance between the terminal device 100 and the screen of the output device 20.
  • a camera that captures the screen of the output device 20 may be arranged so that the position specifying unit 44 can specify the position when the terminal device 100 touches the screen.
  • the position specifying unit 44 specifies the contact position on the screen of the output device 20 at that time.
  • For example, the position specifying unit 44 may determine whether or not the terminal device 100 has touched some object based on the detection value of the contact detection unit 122 and, when contact is detected by this determination, specify the contact position on the screen of the output device 20 from the image captured by the camera.
  • FIG. 9 is a diagram for explaining processing in which the position specifying unit specifies the relative position of the terminal device 100 with respect to the output device 20 from the captured image of the camera.
  • FIG. 9A shows a state in which the camera 60 is arranged at the upper right corner of the screen of the output device 20.
  • The position specifying unit 44 virtually sets an X axis in the horizontal direction of the screen and a Y axis in the vertical direction of the screen, with the coordinates of the upper right corner of the screen set to (0, 0), the upper left corner to (Xmax, 0), the lower left corner to (Xmax, Ymax), and the lower right corner to (0, Ymax).
  • The arrangement position of the camera 60 is not limited to the upper right corner of the screen of the output device 20; it may be any place from which the entire screen of the output device 20 falls within the angle of view of the camera 60 so that the camera 60 can capture the entire screen.
  • the camera 60 may be installed at the upper center position of the screen of the output device 20.
  • FIG. 9A shows a state in which the user presses the terminal device 100 against the screen of the output device 20.
  • FIG. 9B shows a captured image of the camera 60.
  • the position specifying unit 44 specifies the coordinates (x, y) in the XY space where the terminal device 100 exists from the captured image.
  • When the position specifying unit 44 detects contact of the terminal device 100 based on the detection value of the contact detection unit 122, it specifies the contact point between the terminal device 100 and the output device 20 in the captured image, thereby obtaining the coordinates (x, y) as the contact position on the screen of the output device 20.
  • the position specifying unit 44 can also specify the relative position of the terminal device 100 with respect to the output device 20 from only the captured image of the camera 60 without using the result of the contact determination.
  • the position specifying unit 44 may detect whether the terminal device 100 is in contact with the screen of the output device 20 by analyzing the captured image.
  • the position specifying unit 44 may be configured to derive the distance between the terminal device 100 and the screen of the output device 20 by image analysis.
  • Alternatively, a camera having an optical axis parallel to the screen may be separately installed, and the position specifying unit 44 may determine contact, or derive the separation distance, from the image captured by that camera.
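  • As a simplified illustration of the camera-based approach, the sketch below maps the terminal device's pixel position in the captured image to the screen coordinate system of FIG. 9 (origin at the upper right corner). Reducing the mapping to linear interpolation between the screen corners assumes the camera sees the screen as an undistorted rectangle; a real implementation would calibrate the camera or use a homography, and the example numbers are assumptions.

```python
def pixel_to_screen(px, py, corners, xmax, ymax):
    """
    Map a pixel position (px, py) of the terminal device in the captured image to
    screen coordinates (x, y), where the upper right corner of the screen is (0, 0),
    the upper left corner is (Xmax, 0) and the lower right corner is (0, Ymax).
    `corners` gives the pixel positions of the screen's upper-right, upper-left and
    lower-right corners in the captured image (assumed roughly rectangular).
    """
    (urx, ury), (ulx, uly), (lrx, lry) = corners
    # Fraction of the way from the upper-right corner toward the upper-left corner.
    fx = (px - urx) / (ulx - urx)
    # Fraction of the way from the upper-right corner toward the lower-right corner.
    fy = (py - ury) / (lry - ury)
    return fx * xmax, fy * ymax

# Example with assumed values: a 1920x1080 screen seen in a 640x480 camera image.
corners = ((600, 40), (40, 40), (600, 360))   # upper-right, upper-left, lower-right in pixels
x, y = pixel_to_screen(320, 200, corners, xmax=1920, ymax=1080)
print(f"contact position on screen: ({x:.0f}, {y:.0f})")
```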
  • the position when the terminal device 100 touches the screen may be specified by configuring the screen of the output device 20 with a touch panel.
  • The position specifying unit 44 determines whether or not the terminal device 100 has touched some object based on the detection value of the contact detection unit 122. If the timing at which contact is determined from the detection value of the contact detection unit 122 coincides with the timing at which the touch panel of the output device 20 is touched, the position specifying unit 44 determines that the terminal device 100 has touched the position on the touch panel of the output device 20 where that touch was made. In addition to the timing match, contact area information may also be used to determine that the terminal device 100 and the screen of the output device 20 are in contact.
  • the terminal device 100 periodically transmits detection value data in which the identification number of the touch sensor 120 and the detection value are associated with each other to the information processing device 10.
  • the information processing apparatus 10 holds a table in which the identification number of the touch sensor 120 is associated with the position of the touch sensor 120 in the terminal device 100. This table is held by the control unit 50 and is used by the control unit 50 to generate data reflected in the processing of the application, but may be held by the position specifying unit 44.
  • the position specifying unit 44 can specify the position of the touch sensor 120 that has detected contact in the terminal device 100 by holding this table.
  • The position specifying unit 44 derives the contact area on the terminal device 100 from the plurality of touch sensors 120 that have newly detected contact. Since the terminal device 100 is held by the user's fingers, touch sensors 120 that detect contact with the fingers already exist before the terminal device 100 touches the output device 20; the position specifying unit 44 therefore specifies the touch sensors 120 that have newly detected contact, and thereby specifies the contact area on the terminal device 100 at the moment the terminal device 100 touches the output device 20. The position specifying unit 44 then compares the area touched on the touch panel of the output device 20 with the area newly contacted on the terminal device 100, and determines contact with the screen of the output device 20. By using the touch panel, the contact position on the screen can be easily specified.
  • the position specifying unit 44 specifies the position on the screen when the terminal device 100 touches the screen of the output device 20.
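  • The contact determination that matches the terminal device's newly detected contact against the touch panel of the output device, by timing and optionally by contact area, could be structured roughly as below; the time window, the area tolerance, and the event formats are assumptions.

```python
def is_terminal_contact(terminal_events, panel_events,
                        max_time_gap=0.05, max_area_ratio=2.0):
    """
    terminal_events: list of (timestamp, contact_area) newly detected on the terminal device.
    panel_events:    list of (timestamp, contact_area, (x, y)) detected on the output device's touch panel.
    Returns the screen position (x, y) if one terminal event and one panel event coincide
    in time and have comparable contact areas, otherwise None.
    """
    for t_time, t_area in terminal_events:
        for p_time, p_area, position in panel_events:
            timing_matches = abs(t_time - p_time) <= max_time_gap
            ratio = max(t_area, p_area) / max(min(t_area, p_area), 1e-6)
            area_matches = ratio <= max_area_ratio
            if timing_matches and area_matches:
                return position
    return None

# Example with assumed readings (times in seconds, areas in cm^2).
terminal_events = [(10.003, 4.0)]
panel_events = [(10.010, 5.0, (960, 540)), (12.500, 5.0, (100, 100))]
print(is_terminal_contact(terminal_events, panel_events))  # -> (960, 540)
```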
  • usage examples of the terminal device 100 in the information processing system 1 will be described. Note that the processing and functions of the information processing apparatus 10 described in one usage example can be applied to another usage example.
  • FIG. 10 shows a state where the terminal device 100 is held by the user.
  • the palm and fingers come into contact with the surface of the terminal device 100.
  • the touch sensor 120 located below the contact area supplies a detection value indicating that there is a contact to the control unit 200.
  • the control unit 200 associates the identification number of each touch sensor 120 with the detection value, generates detection value data, and the transmission unit 152 transmits the detection value data to the information processing apparatus 10.
  • the key setting unit 52 specifies the type of the finger from the detection values obtained by the touch sensors 120.
  • the control unit 50 holds a table in which the identification number of the touch sensor 120 is associated with the position of the touch sensor 120 in the terminal device 100. Accordingly, when receiving the detection value data, the key setting unit 52 can reproduce the contact state of the terminal device 100 by developing the included identification number and detection value on the virtual sphere. The key setting unit 52 detects the palm from the reproduced contact state.
  • When the key setting unit 52 reproduces the contact state on the virtual sphere from the detection value data, five continuous elongated contact areas and a large continuous contact area located near the ends of the five contact areas are detected. First, the key setting unit 52 identifies the large contact area as the palm. Subsequently, among the five elongated contact areas extending from the palm, the two contact areas extending from both ends of the palm are specified. The key setting unit 52 compares these two contact areas and, based on their widths in the short direction, specifies the thicker one as the thumb and the thinner one as the little finger.
  • Alternatively, the key setting unit 52 may compare the lengths in the longitudinal direction of the contact areas at both ends extending from the palm and specify, for example, the shorter one as the thumb and the longer one as the little finger. The key setting unit 52 may also compare the areas of the contact areas at both ends extending from the palm and specify, for example, the one with the larger area as the thumb and the one with the smaller area as the little finger. By specifying the thumb and the little finger in this way, the fingers corresponding to the three contact areas between them can be specified, and the tip position of each finger can also be specified.
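  • The palm, thumb, and little-finger identification performed by the key setting unit 52 can be illustrated with a crude sketch that operates on already-segmented contact areas; the data structure, the ordering assumption, and the example measurements are not from the patent, and a real implementation would segment the areas from the touch-sensor values developed on the virtual sphere.

```python
def identify_fingers(contact_areas):
    """
    contact_areas: dicts with 'area' (cm^2), 'width' (cm) and 'tip' (fingertip position),
    listed in order around the palm (thumb side at one end, little-finger side at the other).
    The largest area is taken as the palm; of the remaining five, the two end areas are the
    candidates, the wider one being the thumb and the narrower one the little finger.
    Returns a mapping from finger name to fingertip position.
    """
    palm = max(contact_areas, key=lambda c: c["area"])
    fingers = [c for c in contact_areas if c is not palm]
    first, last = fingers[0], fingers[-1]
    thumb, little = (first, last) if first["width"] > last["width"] else (last, first)
    # Name the three middle areas depending on which end the thumb is at.
    names = ["index", "middle", "ring"] if thumb is first else ["ring", "middle", "index"]
    result = {"thumb": thumb["tip"], "little": little["tip"]}
    result.update({name: f["tip"] for name, f in zip(names, fingers[1:-1])})
    return result

# Example with assumed measurements.
contacts = [
    {"area": 30.0, "width": 6.0, "tip": (0, -3)},   # palm
    {"area": 5.0, "width": 1.8, "tip": (-4, 2)},    # thumb (widest finger area)
    {"area": 4.0, "width": 1.2, "tip": (-2, 4)},
    {"area": 4.0, "width": 1.2, "tip": (0, 5)},
    {"area": 3.5, "width": 1.1, "tip": (2, 4)},
    {"area": 2.5, "width": 0.9, "tip": (4, 2)},     # little finger (narrowest)
]
print(identify_fingers(contacts))
```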
  • the key setting unit 52 sets a key at the tip position of the finger.
  • an up key 302a is set at the tip of the middle finger
  • a left key 302b is set at the tip of the index finger
  • a right key 302c is set at the tip of the ring finger
  • a down key 302d is set at the tip of the thumb.
  • the key setting unit 52 delivers the position information of the direction key 302 set at the tip of each finger as key setting information to the application processing unit 54 that executes the application.
  • the application processing unit 54 monitors the input from the user according to the key setting information.
  • For example, an input to the application is performed when the user lifts a finger away from the terminal device 100.
  • When the user lifts the middle finger, contact with the area where the up key 302a is set is lost, and the detection value of the touch sensor 120 at that position changes.
  • When, while receiving the detection value data, the detection value of the touch sensor 120 in the area where the up key 302a is set changes from the on value indicating contact to the off value indicating non-contact, the application processing unit 54 detects that the up key 302a has been input and reflects this in the processing of the application.
  • Alternatively, an input to the application may be performed by re-contact. For example, when the user lifts the middle finger away from the terminal device 100, contact with the area where the up key 302a is set is lost, and the detection value of the touch sensor 120 at that position changes to the off value indicating non-contact. If the detection value then returns to the on value indicating contact within a predetermined time after this change, the application processing unit 54 may detect that the up key 302a has been input and reflect this in the processing of the application.
  • Further, the application processing unit 54 may monitor contact and non-contact at the position opposite to the re-contact position, that is, the position on the terminal device 100 reached by passing from the re-contact position through the center of the sphere. When, simultaneously with the detection of re-contact, a larger pressure is detected at the opposite position, it may be determined that the up key 302a has been input. An increase in pressure is detected, for example, as an increase in the number of touch sensors 120 that output the on value.
  • The contact detection unit 122 may be configured with pressure sensors.
  • When it is configured with pressure sensors, an input to the application may be performed when the user strongly presses the surface of the terminal device 100 with a finger.
  • In that case, the application processing unit 54 monitors the detection value of the pressure sensor in the area where each direction key 302 is set according to the key setting information, and when the detection value exceeds a predetermined value, detects an input of that key and reflects it in the application processing.
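  • Detection of a key input from the contact transitions described above (a finger lifted from the assigned area, or lifted and re-contacted within a predetermined time) might be organized as in the sketch below; the key setting information format, the time window, and the class and method names are assumptions.

```python
class KeyInputDetector:
    """Detects key inputs from per-area contact values, as monitored by the application processing unit 54."""

    def __init__(self, key_setting_info, retouch_window=0.3):
        # key_setting_info: {key name: touch-sensor identification number at the fingertip}
        self.key_setting_info = key_setting_info
        self.retouch_window = retouch_window
        self.last_state = {key: True for key in key_setting_info}   # assume fingers start in contact
        self.release_time = {}

    def update(self, detection_values, now):
        """detection_values: {sensor id: True (contact) / False (no contact)}. Returns triggered key names."""
        triggered = []
        for key, sensor_id in self.key_setting_info.items():
            contact = detection_values.get(sensor_id, False)
            if self.last_state[key] and not contact:
                # on -> off: the finger was lifted; treat this as an input (the simple variant).
                self.release_time[key] = now
                triggered.append(key)
            elif not self.last_state[key] and contact:
                # off -> on within the window: the re-contact variant of the input.
                if now - self.release_time.get(key, -1e9) <= self.retouch_window:
                    triggered.append(key + " (re-contact)")
            self.last_state[key] = contact
        return triggered

detector = KeyInputDetector({"up": 7, "left": 12, "right": 21, "down": 30})
print(detector.update({7: False, 12: True, 21: True, 30: True}, now=0.00))  # -> ['up']
print(detector.update({7: True, 12: True, 21: True, 30: True}, now=0.10))   # -> ['up (re-contact)']
```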
  • Since the key setting unit 52 dynamically sets the keys in accordance with the positions of the fingers when the user grips the terminal device 100, the user can use the terminal device 100 as an input device while holding it in a natural manner. The type of key to be assigned can also be changed dynamically according to the application to be executed, realizing an unprecedented input device. In the example shown in FIG. 10, the up, down, left, and right direction keys 302 are assigned to the tips of four fingers; in an application in which a character performs jump, squat, forward, and backward operations, operation keys designating those operations may be assigned to the tip regions of the respective fingers.
  • The key setting unit 52 also performs a process of changing the key setting information to follow the fingers. That is, when contact with an area specified by the original key setting information is lost and contact with a new area continues to be detected for a predetermined time, the key setting unit 52 regenerates the key setting information by changing the original area to the new area.
  • the key setting unit 52 executes the finger specifying process again to generate key setting information.
  • When the key setting unit 52 assigns keys, it holds the relative positional relationship of the assigned keys.
  • the key setting unit 52 may hold the key setting information as a relative positional relationship.
  • The key setting unit 52 may display the assigned key images on the display unit 112 based on the held positional relationship; in the example of FIG. 10, arrow marks may be displayed. Since the positional relationship held by the key setting unit 52 reflects the size of the user's hand, the user can omit the key setting process by placing a finger on each displayed key image.
  • The key setting unit 52 may also assign different areas each time keys are assigned, based on the held positional relationship. Since the user operates the keys while holding the terminal device 100, always assigning the keys to the same areas could cause the display panels 110, the touch sensors 120, and the like in those portions to deteriorate quickly. For this reason, the posture used for key assignment is set randomly with respect to the reference posture each time keys are assigned, so that deterioration is not concentrated at specific locations. For example, when the relative positional relationship of the assigned keys is held with the up key 302a assigned to the middle finger as the reference, the assigned regions can be made different each time by randomly determining the region of the up key 302a. When the key setting unit 52 holds the previous key setting information, it may arbitrarily set the region of the up key 302a and obtain the regions of the other direction keys from the previous key setting information using that region as a reference.
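  • The idea of assigning the keys to different areas each time while keeping the stored relative positional relationship can be illustrated by representing the fingertip areas as direction vectors from the sphere's center and rotating the whole layout by a random rotation relative to the reference posture. The vector representation and the Rodrigues rotation are assumptions made for this sketch only.

```python
import math
import random

def rotate(vector, axis, angle):
    """Rotate a 3D vector around a unit axis by `angle` radians (Rodrigues' formula)."""
    ax, ay, az = axis
    vx, vy, vz = vector
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    dot = ax * vx + ay * vy + az * vz
    cross = (ay * vz - az * vy, az * vx - ax * vz, ax * vy - ay * vx)
    return tuple(v * cos_a + c * sin_a + a * dot * (1 - cos_a)
                 for v, c, a in zip(vector, cross, (ax, ay, az)))

def reassign_keys(relative_layout):
    """
    relative_layout: {key name: unit vector of its area in the reference posture}.
    Returns the same layout rotated by a random rotation about a random axis, so the
    assigned areas differ each time while the keys keep their relative positions.
    """
    axis = [random.gauss(0, 1) for _ in range(3)]
    norm = math.sqrt(sum(a * a for a in axis)) or 1.0
    axis = tuple(a / norm for a in axis)
    angle = random.uniform(0, 2 * math.pi)
    return {key: rotate(vec, axis, angle) for key, vec in relative_layout.items()}

reference_layout = {"up": (0, 0, 1), "left": (-1, 0, 0), "right": (1, 0, 0), "down": (0, 0, -1)}
print(reassign_keys(reference_layout))
```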
  • When the terminal device 100 is used as a game controller, it is assigned a controller number by the information processing device 10.
  • the application processing unit 54 causes the display unit 112 to display an image indicating the assigned controller number for a predetermined period. Thereby, the user can know the assigned controller number. Thereafter, the key assignment process described above may be performed.
  • the application processing unit 54 generates a display image of the object and outputs it to the output device 20.
  • FIG. 11A shows how a character is displayed on the screen of the output device 20. The user brings the terminal device 100 into contact with an area on the screen where the character is displayed.
  • FIG. 11B shows a state where the terminal device 100 is pressed against the character displayed on the screen.
  • the position specifying unit 44 specifies the position on the screen of the output device 20 when the terminal device 100 contacts the screen of the output device 20.
  • the application processing unit 54 grasps the display position of the character, and determines whether the character display position matches the position specified by the position specifying unit 44. If they match, the application processing unit 54 generates character image data for the terminal device 100 so as to produce an effect that the character moves from the output device 20 to the terminal device 100.
  • the control unit 50 holds a table in which the identification number of the touch sensor 120 in the terminal device 100 is associated with the position of the touch sensor 120 in the terminal device 100.
  • the control unit 50 also holds a table in which the identification number of the display panel 110 in the terminal device 100 is associated with the position of the display panel 110 in the terminal device 100.
  • the control unit 50 also holds a table for specifying the positional relationship between the touch sensor 120 and the display panel 110.
  • the application processing unit 54 identifies the touch sensor 120 that has output a detection value indicating that the screen of the output device 20 has been touched from the detection value of each touch sensor 120.
  • the application processing unit 54 specifies the touch sensor 120 that newly detects contact from the state in which the terminal device 100 is held by the user's finger. This specification may be performed by the position specifying unit 44, and the application processing unit 54 may acquire the identification number of the touch sensor 120 that has touched the output device 20 from the position specifying unit 44.
  • the application processing unit 54 uses the table that specifies the positional relationship between the touch sensor 120 and the display panel 110 to specify the display panel 110 located opposite to the contact position.
  • In FIG. 11B, this display panel 110 is located at the center of the terminal device 100 as seen in the figure.
  • The application processing unit 54 specifies the display panel 110 located at the center of the terminal device 100 in FIG. 11B, and generates the character image data for the terminal device 100 such that the center of the character image comes to that display panel 110.
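  • Selecting the display panel located opposite the contact position, using the positional tables held by the control unit 50, can be sketched as below; representing touch sensors and display panels by unit direction vectors from the sphere's center is an assumption of this sketch, not the table format used by the patent.

```python
import math

def antipodal_panel(contact_sensor_id, sensor_positions, panel_positions):
    """
    sensor_positions: {touch sensor id: unit vector from the sphere's center}
    panel_positions:  {display panel id: unit vector from the sphere's center}
    Returns the id of the display panel closest to the point opposite the contact position.
    """
    cx, cy, cz = sensor_positions[contact_sensor_id]
    opposite = (-cx, -cy, -cz)   # pass through the center of the sphere to the far side
    def angle_to(panel_dir):
        dot = sum(o * p for o, p in zip(opposite, panel_dir))
        return math.acos(max(-1.0, min(1.0, dot)))
    return min(panel_positions, key=lambda pid: angle_to(panel_positions[pid]))

# Example with assumed tables: the sensor that touched the screen points along -Y,
# so the character should be centered on the panel facing +Y (toward the user).
sensor_positions = {3: (0.0, -1.0, 0.0)}
panel_positions = {"a": (0.0, 1.0, 0.0), "b": (1.0, 0.0, 0.0), "c": (0.0, 0.0, 1.0)}
print(antipodal_panel(3, sensor_positions, panel_positions))  # -> 'a'
```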
  • the display unit 112 of the terminal device 100 the character image is divided and displayed on the plurality of display panels 110.
  • The application processing unit 54 generates the image data of the entire character by associating the identification number of each display panel 110 with the image data to be displayed on that display panel 110.
  • the transmission unit 32 transmits the generated image data to the terminal device 100, and the control unit 200 generates an image to be displayed on each display panel 110 based on the identification number of the specified display panel 110 and the image data.
  • FIG. 11C shows a state in which the character is displayed on the terminal device 100.
  • the output device 20 ends the character display.
  • When the terminal device 100 is pressed against the screen of the output device 20 again while the character is displayed as in FIG. 11C, the application processing unit 54 may return the character to the pressed position on the screen of the output device 20.
  • the position on the screen of the output device 20 where the terminal device 100 is pressed is specified by the position specifying unit 44.
  • In this way, an application in which the user moves the character to the terminal device 100 and returns it to the output device 20 can be realized. Since the application processing unit 54 can stop displaying the character on the terminal device 100 and display it at an arbitrary position on the screen of the output device 20, it is possible, for example, to create a more interactive game in which the character recovers its strength while it is on the terminal device 100 and is then returned to the game screen.
  • FIGS. 11A to 11C show an example in which the character display is switched from the screen of the output device 20 to the display unit 112 of the terminal device 100.
  • Instead, a display object corresponding to the contact position on the screen may be displayed on the display unit 112.
  • For example, a help screen related to the content displayed at the contact position on the output device 20 may be displayed on the display unit 112.
  • the position specifying unit 44 can derive the distance between the terminal device 100 and the screen of the output device 20.
  • When the terminal device 100 is brought toward the screen of the output device 20 as shown in FIG. 11B, the position specifying unit 44 derives the distance between the terminal device 100 and the screen of the output device 20 and passes it to the application processing unit 54.
  • When the terminal device 100 comes closer than a predetermined distance, the application processing unit 54 may notify the user of the character to be displayed on the terminal device 100.
  • This notification may be performed, for example, by generating an image in which the character is sucked in the display unit 112 of the terminal device 100, and the character is sucked into the terminal device 100 on the screen of the output device 20. This may be done by generating a simple image. By such notification, the user can recognize the character to be displayed on the display unit 112 of the terminal device 100, and the complete character may be displayed on the display unit 112 when pressed.
FIG. 12A shows the terminal device 100 held between the fingertips of the user. The application processing unit 54 identifies how the terminal device 100 is being held from the detection values of the touch sensors 120; specifically, it determines whether the holding hand is the right hand or the left hand and estimates the position of the palm.
FIG. 12B is an explanatory diagram showing the contact state detected by the touch sensors 120 when the terminal device 100 is held between five fingertips. Five contact areas 304a to 304e are detected; the contact area 304e, drawn with a dotted line, is located on the lower surface side of the terminal device 100 in the state shown in FIG. 12A. The application processing unit 54 determines whether the hand is the right hand or the left hand by detecting the smallest contact area 304d. Since the little finger produces the smallest contact area, the application processing unit 54 determines that the finger in the contact area 304d is the little finger, the finger in the adjacent contact area 304c is the ring finger, the finger in the contact area 304b is the middle finger, and the finger in the contact area 304a is the index finger. It can also determine that the contact area 304e belongs to the thumb.
Alternatively, the application processing unit 54 may identify the finger of each contact area by first identifying the contact area of the thumb. When the terminal device 100 is held between five fingers, the contact area of the thumb is the largest, so the application processing unit 54 determines that the finger in the contact area 304e is the thumb. Once the thumb is identified, the index finger lies at the position closest to the thumb, so the finger in the contact area 304a can be determined to be the index finger, and the contact areas of the middle finger, the ring finger, and the little finger can then be determined in order. The application processing unit 54 may also identify the thumb and the little finger directly from the sizes of the contact areas 304, using the fact that the thumb produces the largest contact area and the little finger the smallest. A minimal sketch of this area-based identification follows.
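A rough illustration of the area-based finger identification, assuming each cluster of touch-sensor hits has already been grouped into a contact region with an area and a representative position. The ContactRegion structure, the planar distance metric, and the field names are assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class ContactRegion:
    region_id: str      # e.g. "304a"
    area: float         # contact area reported by the touch sensors
    center: tuple       # (x, y) representative position on the device surface

def identify_fingers_by_area(regions):
    """Label five contact regions as thumb/index/middle/ring/little.

    Assumes the thumb produces the largest contact area and the little finger
    the smallest, and that the remaining fingers are ordered by their distance
    from the thumb (index closest, then middle, then ring).
    """
    assert len(regions) == 5, "expects one region per finger"
    thumb = max(regions, key=lambda r: r.area)
    little = min(regions, key=lambda r: r.area)
    others = [r for r in regions if r not in (thumb, little)]

    def dist(a, b):
        return ((a.center[0] - b.center[0]) ** 2 +
                (a.center[1] - b.center[1]) ** 2) ** 0.5

    # The index finger is the remaining region closest to the thumb;
    # the middle and ring fingers follow in order of increasing distance.
    others.sort(key=lambda r: dist(r, thumb))
    index, middle, ring = others
    return {"thumb": thumb, "index": index, "middle": middle,
            "ring": ring, "little": little}
```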
The application processing unit 54 may also identify fingers from the spacing of the contact areas 304. When the user holds the terminal device 100, the thumb and the four other fingers tend to be arranged as shown in FIG. 12B: the intervals between the four fingers other than the thumb are relatively narrow, while the thumb is placed relatively far from the other fingers. The application processing unit 54 therefore calculates, for each contact region 304, the intervals to the other four contact regions 304 and takes the smallest of those four values, so that a minimum interval is derived for each of the five fingers. The application processing unit 54 then compares these minimum intervals with one another. The ratio of the largest minimum interval to the second largest minimum interval is derived, and when (largest minimum interval) / (second largest minimum interval) is 2 or more, it is determined that the terminal device 100 is relatively large with respect to the size of the hand, and the following finger identification process is executed. The application processing unit 54 identifies the contact area 304 having the largest minimum interval as the contact area of the thumb; in the case of FIG. 12B, the contact area 304e is identified as the thumb. Once the thumb is identified, the index finger is identified next. When the terminal device 100 is relatively large with respect to the size of the hand, the distance between the thumb and the index finger is narrower than the distance between the thumb and the little finger, so the application processing unit 54 identifies the contact area 304a closest to the contact area 304e as the contact area of the index finger, and then identifies the contact areas of the middle finger, the ring finger, and the little finger in turn. Conversely, when the terminal device 100 is relatively small with respect to the size of the hand, that is, when (largest minimum interval) / (second largest minimum interval) is smaller than 2, the user is holding the terminal device 100 with the fingertips rather than the pads of the fingers, and in this case the finger identification process using the areas of the contact regions 304 described above is effective.
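The interval-based branch described above could look roughly like the following, reusing the ContactRegion structure and the identify_fingers_by_area helper from the previous sketch. The factor of 2 follows the text; everything else is illustrative.

```python
def identify_fingers_by_spacing(regions):
    """Interval-based identification sketch.

    For each of the five contact regions, the smallest distance to any other
    region is computed. If the largest of these minimum intervals is at least
    twice the second largest, the isolated region is taken to be the thumb and
    the remaining fingers are ordered by distance from it; otherwise the device
    is judged small relative to the hand and the area-based method is used.
    """
    def dist(a, b):
        return ((a.center[0] - b.center[0]) ** 2 +
                (a.center[1] - b.center[1]) ** 2) ** 0.5

    min_interval = {r.region_id: min(dist(r, o) for o in regions if o is not r)
                    for r in regions}
    ranked = sorted(regions, key=lambda r: min_interval[r.region_id], reverse=True)
    largest = min_interval[ranked[0].region_id]
    second = min_interval[ranked[1].region_id]

    if largest / second < 2.0:
        # Held with the fingertips only: spacing is unreliable, fall back to areas.
        return identify_fingers_by_area(regions)

    thumb = ranked[0]
    others = sorted((r for r in regions if r is not thumb),
                    key=lambda r: dist(r, thumb))
    index, middle, ring, little = others
    return {"thumb": thumb, "index": index, "middle": middle,
            "ring": ring, "little": little}
```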
Next, the application processing unit 54 estimates the position of the palm. The lengthwise ends of the contact areas 304a to 304d are connected, virtual lines that do not intersect each other are drawn (312a and 312b in the figure), and the direction of the palm is estimated from the degree of bending of those lines: depending on which way the connected lines bow, the palm is estimated to lie on the direction 306 side or on the direction 308 side. Having determined that the terminal device 100 is pinched between the fingers as shown in FIG. 12A, the application processing unit 54 decides to display the character in the display area 310 and displays it there. The application processing unit 54 monitors the arrangement of the contact areas 304 and sets the display area 310 in real time, so that even when the user changes the position at which the terminal device 100 is held, the display area 310 can be set at an appropriate position and the character can still be displayed. The application processing unit 54 does not set the display area 310 within any contact area 304, nor in the region lying in the direction in which the palm is estimated to exist. Because the display area 310 is never set at a position the user cannot see, the application processing unit 54 identifies the finger of each contact area 304 and the direction of the palm even when the user re-grips the terminal device 100, and can thus dynamically set the display area 310 at an appropriate position.
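A simplified sketch of choosing the display area 310 so that it avoids both the contact areas and the palm side. Panel positions are treated as planar offsets from the device center, the palm direction as a unit vector toward the estimated palm, and the 20 mm clearance is an assumed value; none of these specifics come from the text.

```python
def choose_display_area(candidate_panels, contact_regions, palm_direction):
    """Pick a panel position for the character (display area 310).

    candidate_panels : list of (panel_id, (x, y)) offsets from the device center
    contact_regions  : ContactRegion objects currently covered by fingers
    palm_direction   : unit vector pointing toward the estimated palm position

    Panels under a finger or facing the palm are excluded; among the rest the
    panel with the largest clearance from any finger is chosen so the character
    stays where the user can see it.
    """
    def dot(u, v):
        return u[0] * v[0] + u[1] * v[1]

    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    visible = [p for p in candidate_panels
               if dot(p[1], palm_direction) <= 0                 # not on the palm side
               and all(dist(p[1], r.center) > 20.0               # assumed margin (mm)
                       for r in contact_regions)]
    if not visible:
        return None
    return max(visible,
               key=lambda p: min(dist(p[1], r.center) for r in contact_regions))
```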
The application processing unit 54 also detects a "tracing operation" from the detection values of the touch sensors 120. A tracing operation is detected when a contact area on the surface of the terminal device 100 moves in one direction. When the application processing unit 54 determines from the detection values of the touch sensors 120 that the contact area is moving, it generates image data that moves the character image in that direction, and the transmission unit 32 transmits the image data. The control unit 200 displays the character image on the display unit 112 based on the image data, so that a character moving in the traced direction appears on the display unit 112. In the example described above, a help screen is displayed on the display unit 112; when the user traces the display panel 110 on which the help screen is displayed, the next page of the help screen may be shown. Furthermore, when the user twists the terminal device 100 by a predetermined angle in the horizontal plane, the page may be turned; the twist angle of the terminal device 100 is derived from the detection value of the angular velocity sensor 164.
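A minimal tracing-operation detector over two successive snapshots of the contacting sensors. Representing each contact by an (x, y) position and the travel threshold value are assumptions for illustration.

```python
def detect_trace(previous_hits, current_hits, min_travel=5.0):
    """Detect a tracing operation from two successive touch-sensor snapshots.

    previous_hits / current_hits : dicts mapping sensor id -> (x, y) position
    of sensors reporting contact. If the centroid of the contact moves farther
    than min_travel, the normalized direction of motion is returned (the
    direction in which to move the character); otherwise None.
    """
    if not previous_hits or not current_hits:
        return None

    def centroid(hits):
        xs = [p[0] for p in hits.values()]
        ys = [p[1] for p in hits.values()]
        return sum(xs) / len(xs), sum(ys) / len(ys)

    (x0, y0), (x1, y1) = centroid(previous_hits), centroid(current_hits)
    dx, dy = x1 - x0, y1 - y0
    travel = (dx * dx + dy * dy) ** 0.5
    if travel < min_travel:
        return None
    return (dx / travel, dy / travel)
```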
As another example, the application processing unit 54 generates image data of the earth. Since the application processing unit 54 knows the positional relationship between the display panels 110 and the touch sensors 120, when the touch sensor 120 that outputs a detection value indicating contact is identified, the latitude and longitude on the virtual earth at the position of that touch sensor 120 can be identified. The application processing unit 54 thus identifies the latitude and longitude designated by the user by identifying the touch sensor 120 that has output a detection value indicating a new contact. The application processing unit 54 generates information related to the region located at that latitude and longitude, the transmission unit 32 transmits the information to the terminal device 100, and the control unit 200 displays it on the display unit 112.
When the user rotates the terminal device 100, the displayed earth rotates with it, and image processing may be performed so that, like a globe, it continues to rotate by inertia in accordance with the preceding rotation. The application processing unit 54 derives the rotation speed of the terminal device 100 from the detection value of the angular velocity sensor 164; the rotation speed is derived at an arbitrary timing during the rotation, and the application processing unit 54 holds the derived value. When the user stops rotating the terminal device 100, the application processing unit 54 controls the displayed earth image so that it keeps rotating in the same direction while its rotation speed gradually decreases. In other words, based on the rotation speed derived while the terminal device 100 was being rotated, the application processing unit 54 gradually reduces the rotation speed of the displayed earth image as if a globe were coasting under its own inertia. The image may also continue rotating by inertia when the user removes his or her hand from the terminal device 100 and places it on a desk. Conversely, the application processing unit 54 may derive the rotation speed and control the display image to rotate in the direction opposite to the rotation of the terminal device 100, so that the same image continues to be seen from the same direction regardless of how the terminal device 100 is rotated.
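The inertial rotation and the counter-rotation described above can be sketched as follows. The exponential decay model, its constants, and the single-axis simplification are illustrative assumptions, not the patent's algorithm.

```python
import math

class GlobeSpin:
    """Inertial rotation of the displayed globe (a sketch).

    The angular velocity sensed when the user releases the device becomes the
    initial spin; each frame the spin decays exponentially, imitating a globe
    coasting to a stop.
    """
    def __init__(self, decay_per_second=0.8, stop_threshold=0.01):
        self.decay = decay_per_second
        self.stop_threshold = stop_threshold   # rad/s below which rotation stops
        self.angular_velocity = 0.0            # rad/s about the globe axis
        self.angle = 0.0                       # current rotation angle (rad)

    def release(self, sensed_angular_velocity):
        # Rotation speed derived from the angular velocity sensor at release.
        self.angular_velocity = sensed_angular_velocity

    def step(self, dt):
        if abs(self.angular_velocity) < self.stop_threshold:
            self.angular_velocity = 0.0
            return self.angle
        self.angle += self.angular_velocity * dt
        self.angular_velocity *= math.exp(-self.decay * dt)
        return self.angle

def counter_rotation(device_rotation_angle):
    """Rotate the displayed image opposite to the device so the same face of
    the globe stays toward the viewer regardless of how the device is turned."""
    return -device_rotation_angle
```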
In another application, thumbnails of contents such as music files and movie files are arranged on the display unit 112. The contents themselves are held in the information processing apparatus 10, and when the user selects a thumbnail the corresponding content is played on the output device 20. The application processing unit 54 divides the longitude direction (vertical direction) by genre and the latitude direction (horizontal direction) by age, allowing the user to select content intuitively. The trigger for displaying the thumbnails is, for example, the user shaking the terminal device 100: when the application processing unit 54 detects from the detection value of the acceleration sensor 162 that the terminal device 100 has been shaken, the thumbnail images of the contents are displayed. This kind of information presentation can also be used to display, for example, the access ranking of articles on the Internet, with the longitude divided by genre and the latitude by rank so that the user can grasp the ranking intuitively. When the user taps a thumbnail, the application processing unit 54 detects the tapped position from the detection values of the touch sensors 120 and identifies the thumbnail displayed at that position; when it then detects that the terminal device 100 has been shaken, it reads the corresponding content from the storage device and plays it.
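A rough sketch of the shake trigger and of mapping a tapped latitude/longitude to a (genre, age) cell of the thumbnail layout. The acceleration threshold, peak count, and grid interpretation are assumptions made for illustration.

```python
def is_shake(acceleration_samples, threshold=25.0, min_peaks=3):
    """Crude shake detector over a short window of 3-axis accelerometer samples.

    A shake is reported when the acceleration magnitude exceeds `threshold`
    (m/s^2, an assumed value) at least `min_peaks` times in the window.
    """
    peaks = 0
    for ax, ay, az in acceleration_samples:
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        if magnitude > threshold:
            peaks += 1
    return peaks >= min_peaks

def thumbnail_at(lat, lon, genres, ages):
    """Map a tapped latitude/longitude to a (genre, age) cell of the layout.

    Longitude (vertical bands) selects the genre and latitude (horizontal
    bands) selects the age range, following the division described above.
    """
    genre_index = int((lon % 360.0) / (360.0 / len(genres)))
    age_index = min(int((lat + 90.0) / (180.0 / len(ages))), len(ages) - 1)
    return genres[genre_index], ages[age_index]
```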
During execution of a game, an image different from the game image displayed on the screen of the output device 20 can be shown on the display unit 112 of the terminal device 100. For example, when the game image is a video from the character's viewpoint, an image from another viewpoint can be displayed on the display unit 112. The application processing unit 54 detects the movement of the terminal device 100 and, taking the character's viewpoint as the reference, displays on the display unit 112 an image from a viewpoint above the character when the terminal device 100 is lifted up, and an image from a viewpoint to the left of the character when the terminal device 100 is moved to the left. Various images can be displayed on the display unit 112 depending on the game, and the user may move the terminal device 100 to search for items, look down on the game space, and otherwise acquire information. FIG. 13A shows a game image displayed on the screen of the output device 20; this game image is a video from the viewpoint of the character. The application processing unit 54 places a virtual camera behind the character in the three-dimensional game space, generates the game image based on operation data from the user, and outputs it from the output device 20. The application processing unit 54 also generates image data matching the configuration of the display unit 112 of the terminal device 100, and the transmission unit 32 transmits the image data to the terminal device 100, whose display unit 112 displays the game image. As a result, the same game image as that of the output device 20 is also displayed on the display unit 112 of the terminal device 100.
FIG. 13B shows the game image displayed on the terminal device 100 when the user moves the terminal device 100 to the left. The posture specifying unit 42 specifies the moving direction and the moving amount of the terminal device 100 from the detection values of the motion sensor 160. When the application processing unit 54 receives the moving direction and moving amount of the terminal device 100 from the posture specifying unit 42, it converts them into a moving direction and moving amount of the virtual camera in the three-dimensional game space, keeping the character within the angle of view of the virtual camera. The application processing unit 54 generates the game image shot by the moved virtual camera and produces image data matching the display unit 112 of the terminal device 100; the transmission unit 32 transmits the image data to the terminal device 100, and the control unit 200 receives it and causes the display unit 112 to display it. In this way the user can see on the terminal device 100 a game image different from the display image of the output device 20, and in the example of FIG. 13B can discover that a dog is behind the tree.
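A minimal sketch of converting the terminal device's movement into a virtual-camera move while keeping the character in view. The scale factor converting device movement into game-world units is an assumed constant.

```python
def move_virtual_camera(camera_pos, character_pos, device_delta, scale=0.05):
    """Translate the virtual camera by the terminal device's movement.

    camera_pos    : current (x, y, z) camera position in the game space
    character_pos : character position, used as the look-at target so the
                    character stays within the camera's angle of view
    device_delta  : (dx, dy, dz) movement of the terminal device reported by
                    the posture specifying unit
    scale         : assumed conversion from device movement to world units
    """
    new_pos = tuple(c + d * scale for c, d in zip(camera_pos, device_delta))
    look_at = character_pos
    return new_pos, look_at
```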
Further, when the user presses the terminal device 100 against the screen of the output device 20, information that is not displayed in the game image, such as a hint for clearing the game, may be displayed on the display unit 112. Rather than a specific hint, the entire display unit 112 may simply light up to tell the user that this is a moment to pay attention. Hints may also be presented in accordance with the progress of the game; for example, when a chance arrives, a hint or an attention-calling flash may be shown on the display unit 112, and such a hint presentation may also be performed by outputting sound from a speaker. A replay image of the game may likewise be displayed on the display unit 112, and when the replay image is displayed, the viewpoint may be changed by changing the attitude of the terminal device 100, for example.
The fact that a plurality of terminal devices 100 have been brought into contact with each other can also be reflected in the processing of an application. In a game, for example, processing that makes it appear that the terminal devices 100 have touched may be performed, such as a character's attributes changing or characters merging. Contact between terminal devices 100 is determined from the normal vectors at the contact surfaces pointing in opposite directions, and each normal vector is determined from the attitude of the terminal device 100 and its contact area.
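One way to express the normal-vector test, assuming each device's contact-surface normal has already been derived from its attitude and contacted touch sensors and expressed in a common world frame; the angular tolerance is an assumption.

```python
import math

def devices_in_contact(normal_a, normal_b, tolerance_deg=15.0):
    """Judge whether two terminal devices are touching each other.

    normal_a, normal_b : unit normal vectors at each device's contact area,
    in a shared world frame. Contact is assumed when the normals point in
    (approximately) opposite directions.
    """
    dot = sum(a * b for a, b in zip(normal_a, normal_b))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle > 180.0 - tolerance_deg
```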
The terminal device 100 may be configured to start when it receives a start signal from the information processing device 10 and to sleep when it receives an end signal. In this case the terminal device 100 operates on the condition that the information processing device 10 is operating, so it may also be configured, for example, to sleep automatically when no signal has been received from the information processing device 10 for a predetermined time. The terminal device 100 may further be configured to monitor the detection values of the contact detection unit 122 or the motion sensor 160 while in the sleep state and to start autonomously when it enters a predetermined state. The predetermined state may be, for example, a state in which almost the entire surface of the terminal device 100 is covered, or a state in which the terminal device 100 is moved at a very high speed. Likewise, the terminal device 100 may be configured to sleep when it enters a predetermined operation state during operation.
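An illustrative start/sleep state machine along the lines described above. The timeout, coverage ratio, and speed threshold are placeholder values, not figures from the text.

```python
import time

class PowerStateMachine:
    """Start/sleep behaviour of the terminal device (illustrative sketch).

    Wakes on a start signal or on an autonomous trigger (surface almost fully
    covered, or very fast movement); sleeps on an end signal or when nothing
    has been heard from the information processing device for `timeout_s`.
    """
    def __init__(self, timeout_s=300.0, cover_ratio=0.9, speed_limit=15.0):
        self.state = "sleep"
        self.timeout_s = timeout_s
        self.cover_ratio = cover_ratio
        self.speed_limit = speed_limit
        self.last_signal = time.monotonic()

    def on_signal(self, kind):
        self.last_signal = time.monotonic()
        if kind == "start":
            self.state = "awake"
        elif kind == "end":
            self.state = "sleep"

    def on_sensors(self, covered_fraction, speed):
        if self.state == "sleep":
            # Autonomous start when a predetermined state is detected.
            if covered_fraction >= self.cover_ratio or speed >= self.speed_limit:
                self.state = "awake"
        elif time.monotonic() - self.last_signal > self.timeout_s:
            # No signal from the information processing device: sleep automatically.
            self.state = "sleep"
```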
The application processing unit 54 may also specify the orientation of the palm from the attitude of the terminal device 100. By specifying the orientation of the palm, it is possible to determine whether the terminal device 100 is gripped with the palm facing upward or with the palm facing downward. The contact area specifying process and similar processing have been described as being performed by the key setting unit 52; the application processing unit 54 performs the same processing. The posture specifying unit 42 specifies the current posture from the difference between the detection values of the motion sensor 160 received from the terminal device 100 and the reference values, and the application processing unit 54 specifies, from the current posture specified by the posture specifying unit 42 and the position of the touch sensor 120 that detects the contact of the palm, which direction the palm is facing with respect to the direction of gravity. The application processing unit 54 may then use the palm direction as input data, reflecting it in the processing of the application.
In the embodiment described above, the application processing unit 54 in the information processing apparatus 10 generates the image data to be displayed on the display unit 112, but the control unit 200 in the terminal device 100 may have the same function as the application processing unit 54 and generate the image data itself.
DESCRIPTION OF REFERENCE NUMERALS: 1 information processing system, 10 information processing apparatus, 20 output device, 30 communication unit, 32 transmission unit, 34 reception unit, 40 terminal information processing unit, 42 posture specifying unit, 44 position specifying unit, 50 control unit, 52 key setting unit, 54 application processing unit, 100 terminal device, 110 display panel, 112 display unit, 120 touch sensor, 122 contact detection unit, 130 protective layer, 132 support member, 134 inner shell, 140 inner core, 142 projector, 150 communication unit, 152 transmission unit, 154 reception unit, 160 motion sensor, 162 acceleration sensor, 164 angular velocity sensor, 166 geomagnetic sensor, 200 control unit.
The present invention can be applied to the field of user interfaces.

Abstract

A terminal device (100) has a curved or substantially curved surface shape. The terminal device (100) is provided with: a contact detecting unit (122) which detects contact with the surface of the terminal device; a display unit (112) which displays an image; a transmitting unit (152) which transmits detection values obtained from the contact detecting unit (122); a receiving unit (154) which receives image data for generating the image; and a control unit (200) which uses the received image data to generate the image to be displayed on the display unit (112).

Description

Terminal device and information processing system
The present invention relates to a terminal device that operates as an input device or an output device, and to an information processing system using the terminal device.
A conventional game controller has an input interface such as a directional pad and buttons, is connected to a game device by wire, and transmits the user's operation data to the game device. The user holds the left and right grips of the controller with both hands and operates the directional pad and buttons with the fingers. In recent years, with advances in wireless technology, controllers that can exchange data with a game device wirelessly have come into widespread use. Because a wireless controller needs no wiring to the game device, the user can play from any position.
US Published Patent Application No. 2007-218994 (Patent Document 1)
However, although the wireless controller disclosed in Patent Document 1 transmits and receives data wirelessly, it does not change the input interface of directional pad and buttons, so there is no difference in operability between a conventional wired controller and the wireless controller disclosed in Patent Document 1.
Giving the controller a new input interface broadens the range of operations available to the user and increases the variety of game applications that can exploit that operability. A conventional controller is used as an input device that accepts operations by the user, but by also giving the controller a function as an output device that outputs image data from the game device, the development of unprecedented game applications can be expected. A new input interface promotes the development of unprecedented applications not only in the game field but also in processing devices in other fields.
Accordingly, an object of the present invention is to provide a terminal device that realizes new operability and can be used not only as an input device but also as an output device; the terminal device may be used as an input device or an output device. Another object of the present invention is to provide an information processing system using this terminal device.
To solve the above problems, a terminal device according to one aspect of the present invention is a portable terminal device having a curved or substantially curved shape, and includes a detection unit that detects contact with the surface of the terminal device, a display unit that displays an image, a transmission unit that transmits detection values from the detection unit, a reception unit that receives image data for generating an image, and a control unit that generates the image to be displayed on the display unit using the received image data.
Another aspect of the present invention is an information processing system including a terminal device and an information processing device. The terminal device has a detection unit that detects contact with the surface of the terminal device, a display unit that displays an image, a transmission unit that transmits detection values from the detection unit to the information processing device, a reception unit that receives image data for generating an image from the information processing device, and a control unit that generates the image to be displayed on the display unit using the received image data. The information processing device has a reception unit that receives the detection values from the terminal device, an application processing unit that generates image data based on the detection values, and a transmission unit that transmits the image data to the terminal device.
Any combination of the above components, and conversions of the expression of the present invention between a method, an apparatus, a system, a recording medium, a computer program, and the like, are also effective as aspects of the present invention.
According to the present invention, a terminal device that realizes new operability can be provided.
FIG. 1 shows the use environment of an information processing system according to an embodiment of the present invention. FIG. 2 shows an example of the external configuration of the terminal device. FIG. 3 shows a cross section of the terminal device shown in FIG. 2. FIG. 4 shows the structure inside the inner shell. FIG. 5 shows another example of the external configuration of the terminal device. FIG. 6 shows a cross section of the terminal device shown in FIG. 5. FIG. 7 shows functional blocks of the terminal device. FIG. 8 shows functional blocks of the information processing device. FIG. 9 is a diagram for explaining the process of specifying the relative position of the terminal device with respect to the output device. FIG. 10 shows a state in which the terminal device is gripped by the user. FIGS. 11(a) to 11(c) show, respectively, a character displayed on the screen of the output device, the terminal device being pressed against the character displayed on the screen, and the character displayed on the terminal device. FIGS. 12(a) and 12(b) show, respectively, the terminal device held between the user's fingers and the contact state detected by the touch sensors when the terminal device is held between five fingers. FIGS. 13(a) and 13(b) show, respectively, a game image displayed on the screen of the output device and a game image displayed on the terminal device.
FIG. 1 shows the use environment of an information processing system 1 according to an embodiment of the present invention. The information processing system 1 includes an information processing device 10, an output device 20, and a terminal device 100. The output device 20 is, for example, a television, and outputs images and sound. The information processing device 10 is connected to the output device 20 and generates the images and sound to be output by the output device 20. The information processing device 10 and the output device 20 may be connected by wire or wirelessly. The terminal device 100 has a wireless communication function and can transmit and receive data by establishing a wireless link with the information processing device 10.
The information processing system 1 may be a game system that provides the user with an environment for playing a game. In this case, the information processing device 10 is a game device that executes game software, and the terminal device 100 is a controller device with which the user provides input to the game. The information processing device 10 executes the game software based on information representing the user's operation of the terminal device 100, and outputs image data and audio data representing the execution result to the output device 20. The technique shown in this embodiment can be used not only for games but also in information processing systems that include processing devices executing other kinds of applications.
The portable terminal device 100 has a curved or substantially curved shape. In this embodiment, a substantially curved surface means a surface that approximates a curved surface by arranging a plurality of flat faces side by side. The terminal device 100 may have a substantially spherical shape as shown in FIG. 1, or an egg-like shape. The terminal device 100 only needs to have a curved or substantially curved shape in part, and the whole shape need not be curved or substantially curved, but it is preferable that at least half of the entire outer surface be curved or substantially curved.
The center of gravity of the terminal device 100 is preferably set so that the device maintains its posture when placed on a flat surface such as a table. When the terminal device 100 is spherical, its center of gravity is set at the center of the sphere, so that the device does not roll when placed and keeps the posture in which it was placed. When the terminal device 100 has a shape other than a sphere, for example an egg shape, the weight balance is preferably set so that the center of gravity is located at the bottom rather than the top, so that the device keeps its posture when the bottom is placed on a flat surface. When a mechanism for adjusting the position of the center of gravity is provided inside the terminal device 100, the center-of-gravity adjusting mechanism may set the center of gravity dynamically so that the device keeps its posture when placed.
The surface of the terminal device 100 constitutes an operation surface operated by the user. The operation surface is therefore preferably formed as a curved or substantially curved surface in order to provide a smooth feeling of operation. As shown in FIG. 1, the terminal device 100 preferably has a size that the user can hold with one hand, so that the user can operate the operation surface while holding the terminal device 100 with one hand, or hold it with one hand and freely operate the operation surface with the other hand.
In one mode of use, the terminal device 100 is operated by the user and acts as an input device that transmits operation input data to the information processing device 10. The terminal device 100 has contact sensors that detect contact with its surface; when the surface of the terminal device 100 touches some object, for example the user's finger or the display (screen) of the output device 20, the contact position is detected. Not only the position but also the pressure may be detected. As the user changes the position or strength with which the terminal device 100 is held, or presses the terminal device 100 against the screen of the output device 20, the contact sensors detect contact with the surface of the terminal device 100, and the terminal device 100 transmits the detected values to the information processing device 10.
The contact sensors are provided continuously over the curved portion of the terminal device 100, so that the user obtains a smooth feeling of operation when moving a finger over the curved surface. The contact sensors are preferably provided so that contact can be detected over the entire surface of the terminal device 100, allowing the user to perform input operations using the whole surface. The terminal device 100 transmits input data generated from the user's contact operations to the information processing device 10. When only a part of the surface of the terminal device 100 constitutes the operation surface, the contact sensors need only be provided on that operation surface, which is formed as a continuous surface on the terminal device 100 so that it is easy for the user to operate.
The terminal device 100 also has a motion sensor that generates sensor values for detecting its posture. The motion sensor includes a 3-axis acceleration sensor and a 3-axis angular velocity sensor, and may further include a 3-axis geomagnetic sensor. In the information processing system 1, the information processing device 10 can process changes in the posture of the terminal device 100 as input data to the application being executed, so the terminal device 100 periodically transmits the detection values of the motion sensor to the information processing device 10.
In another mode of use, the terminal device 100 has a display unit and acts as an output device that displays images. The terminal device 100 receives image data for generating an image from the information processing device 10 and displays the image on the display unit based on that image data. The terminal device 100 may also generate an image from information held in its own storage device. A game character, for example, is shown on the display unit. Display objects such as game characters are rendered based on the detection values of the contact sensors and the motion sensor. This display control may be executed by the information processing device 10 or by a control unit mounted in the terminal device 100.
The display unit is provided on the curved portion of the terminal device 100. The display unit may be formed of flexible EL (electroluminescence) panels or the like, with the display panels attached to the curved portion. The display unit is configured by combining a plurality of display panels, but may also be configured by a single display panel. The display unit may also be configured by attaching a plurality of liquid crystal panels to the faces of a substantially curved polyhedron, where a substantially curved polyhedron means a polyhedron having 20 or more faces. Whichever type of display panel is employed, by arranging a plurality of display panels continuously so as to cover the entire surface of the terminal device 100 without gaps, the terminal device 100 becomes able to display images over its entire surface.
On the terminal device 100, the display object can be linked to changes in the state of the terminal device 100 detected by the contact sensors and the motion sensor. In this case the terminal device 100 operates as an input/output device that acquires the user's operations as sensor values and displays images reflecting those sensor values. Because the terminal device 100 has both this sensor-value acquisition function and an image display function, it can provide the user with an intuitive feeling of operation.
By overlaying the display unit and the sensors, the terminal device 100 obtains the function of a touch panel. The display unit may also be a screen onto which projectors arranged inside the terminal device 100 project light so that the user can view the image from outside. In this case the interior of the terminal device 100 is hollow, and the display unit is formed of a transparent or translucent material that shows the light from the projectors to the outside. A core carrying a processor is provided at the center of the space inside the terminal device 100, and a plurality of projectors are provided on the core in order to project images over the entire surface of the terminal device 100.
FIG. 2 shows an example of the external configuration of the terminal device 100. The terminal device 100 has on its surface a display unit 112 that displays images. The display unit 112 is configured by combining a plurality of display panels 110a to 110v (hereinafter referred to as "display panels 110" when not distinguished). A plurality of display panels 110 are likewise provided on the back side of the terminal device 100 shown in FIG. 2. In this way, it is preferable that the terminal device 100 covers its entire outer surface with a plurality of display panels 110 without gaps and can display images over the whole surface. The terminal device 100 includes a control unit that generates the image data to be displayed on each display panel 110.
The display panels 110 are formed of EL panels, liquid crystal panels, or the like. Panels that can take on a curvature, such as EL panels, are formed into curved shapes and attached directly to the surface of the sphere. The plurality of display panels 110 may be arranged not only as shown in FIG. 2 but also, for example, as a truncated icosahedron combining 20 regular hexagons and 12 regular pentagons, like a soccer ball. The surface of the terminal device 100 is protected by a transparent material such as resin and forms a curved or substantially curved shape. For example, when the display panels 110 are liquid crystal panels attached to the faces of a truncated icosahedron, covering the surface of the terminal device 100 with resin to give it a curved shape allows the user to obtain a smooth feeling of operation when moving the fingers over the surface of the terminal device 100.
FIG. 3 shows a cross section of the terminal device 100 shown in FIG. 2. An inner core 140 is arranged at the center of the internal space of the terminal device 100. The inner core 140 is provided with a control unit that controls the processing of the terminal device 100, a communication unit that communicates with external devices, sensors for detecting the movement and posture of the terminal device 100, and the like. A transparent protective layer 130 covering the entire terminal device 100 forms the outer shell, and inside the protective layer 130 is a display unit 112 having a plurality of display panels 110 provided without gaps on the inner surface of the protective layer 130.
Inside the display unit 112, a contact detection unit 122 that detects contact with the surface of the terminal device 100 is provided. The contact detection unit 122 has a plurality of touch sensors 120. The touch sensors 120 are supported by an inner shell 134, a hollow sphere, and are preferably provided so that contact can be detected over the entire surface of the terminal device 100; for this purpose the touch sensors 120 are arranged on the inner shell 134 without gaps. As described later, this allows the user to provide input to the information processing device 10 by touching the surface of the terminal device regardless of its posture or orientation. The arrangement density of the touch sensors 120 is higher than that of the display panels 110, and two or more touch sensors 120 are provided under each display panel 110. The contact detection unit 122 may include other sensors, for example pressure sensors. The inner shell 134 is preferably formed of a flexible resin material.
The inner shell 134 is supported by rod-shaped support members 132 extending from the inner core 140. This makes the inside of the inner shell 134 a hollow space and reduces the weight of the terminal device 100. The support members 132 are preferably formed of an elastic, extendable material such as springs so that the user can squeeze or deform the terminal device 100 to some extent. The user can then recognize that an input operation has been performed by the denting of the surface of the spherical terminal device 100, realizing an unprecedented input interface. The inner core 140 and the display unit 112 are electrically connected, for example, by wiring provided inside the support members 132, and wiring between the inner core 140 and the contact detection unit 122 may be provided in the same way. Since the terminal device 100 shown in FIG. 3 has a hollow structure, wiring can also be routed through this space.
FIG. 4 shows the structure inside the inner shell 134. In this example, six support members 132 are provided from the inner core 140 toward the inner shell 134, and each is configured to contract in its length direction when pressed. There may be more than six support members 132; it is sufficient that, while the spherical shape of the terminal device 100 is maintained, a pressed portion deforms when pushed by the user's finger or the like and the device returns to its original spherical shape when the finger is released.
Although FIGS. 3 and 4 show a hollow terminal device 100, the terminal device 100 may have a solid structure filled with an elastic material. The hollow structure is superior in terms of weight reduction, but the solid structure has the advantage that the shape can be maintained stably.
FIG. 5 shows another example of the external configuration of the terminal device 100. This terminal device 100 has projectors inside and projects images onto the display unit 112 from within. The display unit 112 is formed of a material that shows the light from the projectors so that an external user can see the projected image. In order to project the light from the projectors onto the display unit 112, the terminal device 100 of this example must have a hollow structure.
FIG. 6 shows a cross section of the terminal device 100 shown in FIG. 5. An inner core 140 is arranged at the center of the internal space of the terminal device 100 and, as described above, carries the control unit, the communication unit, the various sensors, and the like. A plurality of projectors 142 are provided on the surface of the inner core 140. Inside the transparent protective layer 130 is a display unit 112 serving as a transparent or translucent projection surface, and inside the display unit 112, as shown in FIG. 3, a contact detection unit 122 supported by the inner shell 134 is formed.
In the terminal device 100 shown in FIG. 6, so that the projectors 142 can project images onto the display unit 112, the inner shell 134 and the contact detection unit 122 are formed of transparent materials, and the image light reaches the display unit 112 through the inner shell 134 and the contact detection unit 122. A plurality of projectors 142 are provided on the inner core 140, and by combining the light output from each of them, the image light reaches the whole of the display unit 112.
FIG. 7 shows functional blocks of the terminal device 100. The terminal device 100 has the display unit 112 and the contact detection unit 122 at its surface, and has the control unit 200, the communication unit 150, and the motion sensor 160 in the inner core 140. Each component of the terminal device 100 is supplied with power by a battery (not shown). The battery is charged by wireless power feeding.
The communication unit 150 has a transmission unit 152 and a reception unit 154, and transmits and receives data to and from the information processing device 10 using a predetermined communication protocol such as IEEE 802.11 or IEEE 802.15.1. The motion sensor 160 is a detection unit that produces data for detecting the movement and posture of the terminal device 100, and has a 3-axis acceleration sensor 162, a 3-axis angular velocity sensor 164, and a 3-axis geomagnetic sensor 166.
The control unit 200 receives the detection values from the contact detection unit 122 and the motion sensor 160 and supplies them to the communication unit 150. The transmission unit 152 transmits the detection values to the information processing device 10 at a predetermined cycle. This transmission cycle is determined according to the frame rate of the output device 20 and is preferably set shorter than 16.6 ms (= 1/60 s); it is, for example, 10 ms.
Each of the touch sensors 120 supplies its detection value to the control unit 200 together with an identification number identifying the sensor, so that the identification number and the detection value are associated when sent to the control unit 200. Alternatively, the control unit 200 may identify each touch sensor 120 by the port at which its detection value is received and associate the identification number with the detection value. In either case, the control unit 200 associates the identification number of each touch sensor 120 with its detection value and has the transmission unit 152 transmit them to the information processing device 10.
The reception unit 154 receives image data from the information processing device 10 at a predetermined cycle. The reception cycle is set to 10 ms, the same as the transmission cycle. When the control unit 200 receives image data from the reception unit 154, it generates the image to be displayed on the display unit 112 and causes each display panel 110 to display it. The drawing data for the image displayed on the display panels 110 may be generated by the control unit 200 or by the information processing device 10; in the following, an example in which the information processing device 10 generates the drawing data is described. When an image is projected from the projectors 142 as shown in FIG. 6, the control unit 200 provides the image data to the projectors 142.
The terminal device 100 may operate as an output device equipped not only with the display unit 112 but also with a speaker. In that case the reception unit 154 receives audio data from the information processing device 10 and the speaker outputs the sound. The audio data is compressed and transmitted together with the image data, and the control unit 200 outputs the drawing data to the display unit 112 and the audio data to the speaker. When the terminal device 100 has a vibrator such as a motor, the reception unit 154 may receive a drive signal for the vibrator from the information processing device 10 and the control unit 200 may drive the vibrator.
FIG. 8 shows functional blocks of the information processing device 10. The information processing device 10 includes a communication unit 30, a terminal information processing unit 40, and a control unit 50. The communication unit 30 has a transmission unit 32 and a reception unit 34 and transmits and receives data to and from the terminal device 100 using a predetermined communication protocol such as IEEE 802.11 or IEEE 802.15.1. The terminal information processing unit 40 has a posture specifying unit 42 and a position specifying unit 44. The posture specifying unit 42 specifies the posture and movement of the terminal device 100, and the position specifying unit 44 specifies the position of the terminal device 100 in space; specifically, the position specifying unit 44 specifies the position on the screen of the output device 20 at which the terminal device 100 touches the display screen of the output device 20. The control unit 50 has a key setting unit 52 and an application processing unit 54. The key setting unit 52 has the function of setting operation keys at arbitrary positions on the operation surface of the terminal device 100 when the terminal device 100 is used as a game controller. The application processing unit 54 has the function of acquiring sensor information and terminal information from the terminal device 100 and the terminal information processing unit 40 and reflecting them in the processing of the application.
In FIG. 8, each element described as a functional block performing various processes can be implemented in hardware by a CPU (Central Processing Unit), memory, and other LSIs, and in software by programs loaded into memory and the like. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone, by software alone, or by a combination of the two, and they are not limited to any one of these.
 まず端末情報処理部40の動作について説明する。受信部34が端末装置100からモーションセンサ160の検出値を受信し、姿勢特定部42に渡す。姿勢特定部42は、静止時の端末装置100における3軸の加速度センサ162の検出値を、端末装置100の基準姿勢における基準値として予め保持している。姿勢特定部42は、端末装置100が静止していれば、受け取った検出値と基準値との差分から、現在の姿勢を特定する。また端末装置100が動いている場合には、姿勢特定部42は、3軸の加速度センサ162の検出値と、3軸の角速度センサ164の検出値とから、現在の姿勢を特定する。また姿勢特定部42は、3軸の加速度センサ162の検出値と、3軸の角速度センサ164の検出値とから、端末装置100の動きも特定する。特定される端末装置100の動きは、端末装置100の移動方向および移動量を含む。なお姿勢特定部42は、地磁気センサ166の検出値も加味して、端末装置100の姿勢および動きを特定してもよい。 First, the operation of the terminal information processing unit 40 will be described. The receiving unit 34 receives the detection value of the motion sensor 160 from the terminal device 100 and passes it to the posture specifying unit 42. The posture identifying unit 42 holds in advance the detection value of the triaxial acceleration sensor 162 in the terminal device 100 at rest as a reference value in the reference posture of the terminal device 100. If the terminal device 100 is stationary, the posture specifying unit 42 specifies the current posture from the difference between the received detection value and the reference value. When the terminal device 100 is moving, the posture specifying unit 42 specifies the current posture from the detection value of the triaxial acceleration sensor 162 and the detection value of the triaxial angular velocity sensor 164. The posture specifying unit 42 also specifies the movement of the terminal device 100 from the detection value of the triaxial acceleration sensor 162 and the detection value of the triaxial angular velocity sensor 164. The specified movement of the terminal device 100 includes the moving direction and the moving amount of the terminal device 100. The posture specifying unit 42 may specify the posture and movement of the terminal device 100 in consideration of the detection value of the geomagnetic sensor 166.
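For illustration only, the following is a minimal sketch of how a posture-specifying step of this kind could combine accelerometer and gyroscope readings. It is not part of the disclosed embodiment; the class name, the complementary-filter weight, and the roll/pitch/yaw representation are assumptions introduced for the example.

```python
import numpy as np

class PostureEstimator:
    """Illustrative posture estimator for a spherical terminal (assumed design).

    reference_accel: accelerometer reading captured while the device was at
    rest in its reference posture (the gravity vector in device coordinates).
    """

    def __init__(self, reference_accel):
        self.reference_accel = np.asarray(reference_accel, dtype=float)
        self.orientation = np.zeros(3)  # roll, pitch, yaw estimate [rad]

    def update(self, accel, gyro, dt, stationary):
        accel = np.asarray(accel, dtype=float)
        if stationary:
            # At rest: tilt follows directly from how gravity has rotated
            # relative to the stored reference direction.
            self.orientation[0] = np.arctan2(accel[1], accel[2])                 # roll
            self.orientation[1] = np.arctan2(-accel[0],
                                             np.hypot(accel[1], accel[2]))       # pitch
        else:
            # In motion: integrate the angular velocity over dt and lightly
            # correct roll/pitch with the accelerometer (complementary filter).
            self.orientation += np.asarray(gyro, dtype=float) * dt
            alpha = 0.98  # assumed filter weight
            roll_acc = np.arctan2(accel[1], accel[2])
            pitch_acc = np.arctan2(-accel[0], np.hypot(accel[1], accel[2]))
            self.orientation[0] = alpha * self.orientation[0] + (1 - alpha) * roll_acc
            self.orientation[1] = alpha * self.orientation[1] + (1 - alpha) * pitch_acc
        return self.orientation
```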
Because the terminal device 100 of the embodiment has a spherically symmetric shape, the user is not conscious of its current posture even when rotating or tilting it. It is therefore important, for the applications described later, that the posture specifying unit 42 keeps track of the posture of the terminal device 100 in real time using the detection values of the motion sensor 160.
The position specifying unit 44 specifies the position of the terminal device 100. In the information processing system 1 of this embodiment, the position specifying unit 44 only needs to be able to specify the relative positional relationship between the terminal device 100 and the output device 20; for example, it is sufficient if it can specify the position on the screen of the output device 20 at which the terminal device 100 has touched the screen. Preferably, the position specifying unit 44 can also specify the distance between the terminal device 100 and the screen of the output device 20.
In the information processing system 1, a camera that captures the screen of the output device 20 may be arranged so that the position specifying unit 44 can specify the position at which the terminal device 100 touches the screen. When the position specifying unit 44 determines from the captured camera image that the terminal device 100 has touched the screen of the output device 20, it specifies the contact position on the screen at that time. The position specifying unit 44 may instead determine whether the terminal device 100 has touched some object based on the detection values of the contact detection unit 122, and specify the contact position on the screen of the output device 20 from the camera image captured when that contact determination detects a contact.
FIG. 9 is a diagram for explaining how the position specifying unit specifies the position of the terminal device 100 relative to the output device 20 from the captured camera image. FIG. 9(a) shows a state in which the camera 60 is arranged at the upper right corner of the screen of the output device 20. With the camera 60 installed there, the position specifying unit 44 virtually sets an X axis in the horizontal direction of the screen and a Y axis in the vertical direction, and sets the coordinates of the upper right corner of the screen to (0, 0), the upper left corner to (Xmax, 0), the lower left corner to (Xmax, Ymax), and the lower right corner to (0, Ymax). The camera 60 is not limited to the upper right corner of the screen; it may be placed anywhere that the entire screen of the output device 20 falls within its angle of view so that the camera 60 can capture the whole screen. If the camera 60 has a fisheye lens with an angle of view of 180 degrees, it may be installed at the top center of the screen of the output device 20. FIG. 9(a) shows the user pressing the terminal device 100 against the screen of the output device 20.
FIG. 9(b) shows the image captured by the camera 60. For convenience of explanation, the user's hand is omitted in FIG. 9(b). From this captured image, the position specifying unit 44 specifies the coordinates (x, y) in the XY space where the terminal device 100 is located. When the position specifying unit 44 detects contact of the terminal device 100 based on the detection values of the contact detection unit 122, it derives the coordinates (x, y) by specifying the contact point between the terminal device 100 and the output device 20 in the captured image, and thereby obtains the contact position on the screen of the output device 20.
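As a rough sketch only, one way to map the contact point seen in the camera image onto the virtual screen coordinate system defined above is a perspective transform from the screen's corners as they appear in the image. The function below is not from the embodiment; the assumption that the four corner pixel positions are known (for example from a one-time calibration) is introduced for illustration.

```python
import numpy as np
import cv2

def screen_contact_position(corners_px, device_px, xmax, ymax):
    """Map the terminal's pixel position in the camera image to screen coordinates.

    corners_px: pixel coordinates of the screen's upper-right, upper-left,
                lower-left, and lower-right corners as seen by camera 60
                (assumed known from calibration).
    device_px:  pixel coordinates of the contact point of terminal device 100.
    Returns (x, y) in the virtual coordinate system where the upper-right
    corner is (0, 0) and the lower-left corner is (Xmax, Ymax).
    """
    src = np.asarray(corners_px, dtype=np.float32)
    dst = np.asarray([[0, 0], [xmax, 0], [xmax, ymax], [0, ymax]], dtype=np.float32)
    homography = cv2.getPerspectiveTransform(src, dst)
    pt = np.asarray([[device_px]], dtype=np.float32)      # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(pt, homography)[0, 0]
    return float(mapped[0]), float(mapped[1])
```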
The position specifying unit 44 can also specify the relative position of the terminal device 100 with respect to the output device 20 from the captured image of the camera 60 alone, without using the result of the contact determination. By analyzing the captured image, the position specifying unit 44 may detect whether the terminal device 100 and the screen of the output device 20 are in contact, and it may derive the distance between the terminal device 100 and the screen of the output device 20 by image analysis. Alternatively, to obtain that distance, a separate camera whose optical axis is parallel to the screen may be installed, and the position specifying unit 44 may perform the contact determination or derive the separation distance from that camera's captured image.
The position at which the terminal device 100 touches the screen may also be specified by configuring the screen of the output device 20 as a touch panel. In that case, the position specifying unit 44 determines from the detection values of the contact detection unit 122 whether the terminal device 100 has touched some object. If the timing at which the detection values of the contact detection unit 122 indicate that the terminal device 100 has made contact coincides with the timing at which the touch panel of the output device 20 detects a touch, the position specifying unit 44 determines that the terminal device 100 has touched the position on the touch panel of the output device 20 where the touch occurred. The contact between the terminal device 100 and the screen of the output device 20 may be determined not only from the coincidence of timing but also from contact area information.
The terminal device 100 periodically transmits detection value data, in which the identification number of each touch sensor 120 is associated with its detection value, to the information processing apparatus 10. The information processing apparatus 10 holds a table that associates the identification number of each touch sensor 120 with the position of that touch sensor 120 on the terminal device 100. This table is held by the control unit 50 and used to generate data that the control unit 50 reflects in application processing, but it may also be held by the position specifying unit 44. By holding this table, the position specifying unit 44 can specify the position on the terminal device 100 of a touch sensor 120 that has detected contact.
The position specifying unit 44 then derives the contact area on the terminal device 100 from the touch sensors 120 that have newly detected contact. Because the terminal device 100 is held in the user's fingers, some touch sensors 120 were already detecting finger contact before the terminal device 100 touched the output device 20; by identifying the touch sensors 120 that have newly detected contact, the position specifying unit 44 specifies the contact area on the terminal device 100 produced when the terminal device 100 touched the output device 20. The position specifying unit 44 compares the area touched on the touch panel of the output device 20 with the area newly touched on the terminal device 100, and if the two roughly match, it determines that the terminal device 100 and the screen of the output device 20 are in contact. Using a touch panel makes it easy to specify the contact position on the screen.
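As a simple illustration of the timing-and-area comparison described above (not part of the embodiment; the tolerances are assumptions), the check could look like this:

```python
def matches_screen_contact(terminal_new_area, panel_area, terminal_time,
                           panel_time, time_tol=0.05, area_tol=0.3):
    """Illustrative check that the terminal really touched the output device's
    touch panel: the newly contacted area on the terminal and the touched area
    on the panel must appear at (nearly) the same time and be roughly equal.

    terminal_new_area: area newly contacted on terminal device 100
    panel_area:        area touched on the output device 20's touch panel
    terminal_time, panel_time: timestamps of the two detections [s]
    time_tol, area_tol: assumed tolerances for "same timing" and "roughly equal"
    """
    if panel_area <= 0 or terminal_new_area <= 0:
        return False
    same_timing = abs(terminal_time - panel_time) <= time_tol
    similar_area = abs(terminal_new_area - panel_area) / panel_area <= area_tol
    return same_timing and similar_area
```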
As described above, the position specifying unit 44 specifies the position on the screen at which the terminal device 100 has touched the screen of the output device 20.
Usage examples of the terminal device 100 in the information processing system 1 are described below. The processing and functions of the information processing apparatus 10 described in one usage example are also applicable to the other usage examples.
(Usage example 1)
In usage example 1, keys are assigned to the surface of the terminal device 100 and it is used as an input device that can be operated with keys. First, an example is shown in which keys are assigned dynamically according to how the user is holding the terminal device 100.
FIG. 10 shows the terminal device 100 being gripped by the user. When the terminal device 100 is gripped as shown in FIG. 10, the palm and fingers touch its surface. In the terminal device 100, the touch sensors 120 located under the contact regions supply detection values indicating contact to the control unit 200. The control unit 200 associates the identification number of each touch sensor 120 with its detection value to generate detection value data, and the transmitting unit 152 transmits the data to the information processing apparatus 10.
In the information processing apparatus 10, when the receiving unit 34 receives the detection value data, the key setting unit 52 identifies the type of each finger from the detection values of the touch sensors 120. The control unit 50 holds a table that associates the identification number of each touch sensor 120 with its position on the terminal device 100. When the key setting unit 52 receives the detection value data, it can therefore map the included identification numbers and detection values onto a virtual sphere and reproduce the contact state of the terminal device 100. From the reproduced contact state, the key setting unit 52 detects the palm.
When the key setting unit 52 reproduces the contact state on the virtual sphere from the detection value data, five continuous elongated contact regions and one large continuous contact region located near the ends of those five regions are detected. The key setting unit 52 first identifies the large contact region as the palm. Next, among the five elongated contact regions extending from the palm, it identifies the two regions extending from the two ends of the palm. The key setting unit 52 compares these two end regions and identifies the finger whose contact region is wider in the short direction as the thumb and the narrower one as the little finger. Alternatively, the key setting unit 52 may compare the lengths of the two end contact regions in the longitudinal direction and identify, for example, the shorter finger as the thumb and the longer one as the little finger, or it may compare the areas of the two end contact regions and identify, for example, the finger with the larger area as the thumb and the one with the smaller area as the little finger. Once the thumb and the little finger are identified in this way, the fingers of the three contact regions between them can also be identified, and the tip position of each finger can be specified.
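Purely as an illustrative sketch of this grip analysis (not the embodiment itself; the region representation with 'order', 'width', and 'tip' fields is an assumption), the finger labeling could proceed as follows:

```python
def identify_fingers(contact_regions):
    """Illustrative finger identification for a gripped spherical terminal.

    contact_regions: the five elongated finger regions, after the palm has
    already been identified as the largest contiguous region and removed.
    Each region is a dict with:
      'order' - position along the palm edge, 0..4 (adjacent fingers have
                adjacent order values)
      'width' - short-axis extent of the region
      'tip'   - coordinates of the region's far end (the fingertip)
    Returns a dict mapping finger names to fingertip coordinates.
    """
    by_order = sorted(contact_regions, key=lambda r: r['order'])
    end_a, end_b = by_order[0], by_order[-1]

    # The wider of the two end regions is taken to be the thumb,
    # the narrower one the little finger.
    if end_a['width'] >= end_b['width']:
        names = ['thumb', 'index', 'middle', 'ring', 'little']
    else:
        names = ['little', 'ring', 'middle', 'index', 'thumb']

    return {name: region['tip'] for name, region in zip(names, by_order)}
```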
The key setting unit 52 sets a key at each fingertip position. In the example of FIG. 10, an up key 302a is set at the tip of the middle finger, a left key 302b at the tip of the index finger, a right key 302c at the tip of the ring finger, and a direction key 302d at the tip of the thumb. The key setting unit 52 passes the position information of the direction keys 302 set at the fingertips, as key setting information, to the application processing unit 54 that executes the application.
The application processing unit 54 monitors input from the user in accordance with the key setting information. Here, input to the application is assumed to be made by the user lifting a finger off the terminal device 100. When the user lifts, for example, the middle finger off the terminal device 100, contact with the region assigned to the up key 302a is lost and the detection value of the touch sensor 120 at that position changes. While receiving the detection value data, the application processing unit 54 detects that the up key 302a has been input when the detection value of the touch sensor 120 in the region where the up key 302a is set changes from the on value indicating contact to the off value indicating non-contact, and reflects this in the processing of the application.
Alternatively, input to the application may be made when the user lifts a finger off the terminal device 100 and then touches it again. When the user lifts, for example, the middle finger off the terminal device 100, contact with the region assigned to the up key 302a is lost and the detection value of the touch sensor 120 at that position changes to the off value indicating non-contact. If the detection value returns to the on value indicating contact within a predetermined time after this change, the application processing unit 54 may detect that the up key 302a has been input and reflect this in the processing of the application. At this time, the application processing unit 54 may also monitor contact at the position opposite the re-contact position, that is, the position on the terminal device 100 reached by passing from the re-contact position through the center of the sphere, and detect input of the up key 302a when, simultaneously with the re-contact, it detects that a larger pressure has been applied to that opposite position. An increase in pressure can be detected, for example, by an increase in the number of touch sensors 120 outputting the on value.
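The release-and-re-touch variant above can be illustrated with the following sketch (not the embodiment; the key-region mapping, the timeout, and the class design are assumptions):

```python
import time

class KeyInputDetector:
    """Illustrative release / re-touch key detection.

    key_regions: dict mapping a key name (e.g. 'up') to the set of touch sensor
    identification numbers covered by that key, taken from the key setting
    information. retouch_window is an assumed timeout in seconds.
    """

    def __init__(self, key_regions, retouch_window=0.3):
        self.key_regions = key_regions
        self.retouch_window = retouch_window
        self.release_time = {}  # key name -> time the finger was lifted

    def update(self, sensor_values, now=None):
        """sensor_values: dict of sensor id -> True (contact) / False (no contact).
        Returns the list of keys whose input was detected on this update."""
        now = time.monotonic() if now is None else now
        pressed = []
        for key, sensors in self.key_regions.items():
            touched = any(sensor_values.get(s, False) for s in sensors)
            if not touched and key not in self.release_time:
                # Finger lifted: remember when, then wait for the re-touch.
                self.release_time[key] = now
            elif touched and key in self.release_time:
                if now - self.release_time.pop(key) <= self.retouch_window:
                    pressed.append(key)  # released and re-touched in time
        return pressed
```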
The above is an example in which the contact detection unit 122 is composed of touch sensors 120, but the contact detection unit 122 may instead be composed of pressure sensors. In that case, input to the application may be made by the user pressing the surface of the terminal device 100 firmly with a finger. The application processing unit 54 then monitors, in accordance with the key setting information, the detection values of the pressure sensors in the regions where the direction keys 302 are set, and when a detection value exceeds a predetermined value, it detects that the corresponding direction key 302 has been input and reflects this in the processing of the application.
By having the key setting unit 52 dynamically set keys to match the positions of the fingers when the user grips the terminal device 100, the user can use the terminal device 100 as an input device while holding it naturally. The types of keys assigned can also be changed dynamically according to the application being executed, realizing an input device unlike any conventional one. In the example of FIG. 10, up, down, left, and right direction keys 302 are assigned to the tips of four fingers; in an application in which a character jumps, crouches, moves forward, and moves backward, operation keys designating those actions could instead be assigned to the fingertip regions.
If, after the key setting information has been generated, one finger is moved and placed at a different position, the key setting unit 52 updates the key setting information for that finger. That is, when contact with the region specified by the original key setting information is lost and contact with a new region continues to be detected for a predetermined time, the key setting unit 52 regenerates the key setting information with the original region replaced by the new region.
If the user changes how the terminal device 100 is held, the contact regions that had been detected on the terminal device 100 change significantly. In such a case, the key setting unit 52 executes the finger identification process again and generates new key setting information.
Once the key setting unit 52 has assigned keys, it retains the relative positional relationship of the assigned keys; the key setting information may be held in the form of this relative positional relationship. The next time the same keys (the direction keys in the example of FIG. 10) are to be set on the terminal device 100, the key setting unit 52 displays images of the assigned keys on the display unit 112 based on the retained positional relationship. In the example of FIG. 10, arrow marks may be displayed. Since the positional relationship retained by the key setting unit 52 matches the size of the user's hand, the user can skip the key setting process by placing the fingers on the displayed key images.
When assigning keys based on the retained positional relationship, the key setting unit 52 changes the assigned regions each time. Because the user operates keys while gripping the terminal device 100, always assigning keys to the same regions could accelerate wear of the display panels 110, touch sensors 120, and other components in those regions. By randomly setting, based on the reference posture, the orientation used for key assignment each time keys are assigned, degradation can be kept from concentrating on particular locations. For example, if the relative positional relationship of the assigned keys is retained with the up key 302a assigned to the middle finger as the reference, the assigned regions can be varied each time by choosing the region of the up key 302a at random each time. If the key setting unit 52 holds the previous key setting information, it may set the region of the up key 302a arbitrarily and compute the regions of the other direction keys from the previous key setting information using that region as the reference.
When the terminal device 100 is used as a game controller, the terminal device 100 is assigned a controller number by the information processing apparatus 10. The application processing unit 54 causes the display unit 112 to display an image indicating the assigned controller number for a predetermined period, allowing the user to see which controller number has been assigned. The key assignment process described above may then be performed.
(Usage example 2)
In usage example 2, an image is displayed on the display unit 112 of the terminal device 100; specifically, image processing is performed so that an object such as a character displayed on the output device 20 appears to move to the terminal device 100. Concretely, while the object is displayed on the output device 20, the display processing of the object on the output device 20 is stopped when a predetermined condition is satisfied, and display processing of the object on the terminal device 100 is performed instead, giving the user the sensation that the object has moved from the output device 20 to the terminal device 100. This is described with reference to FIGS. 11(a) to 11(c).
The application processing unit 54 generates a display image of the object and outputs it to the output device 20. FIG. 11(a) shows a character displayed on the screen of the output device 20. The user brings the terminal device 100 into contact with the region of the screen where this character is displayed. FIG. 11(b) shows the terminal device 100 being pressed against the character displayed on the screen.
As described above, the position specifying unit 44 specifies the position on the screen of the output device 20 at which the terminal device 100 has touched the screen. The application processing unit 54 knows the display position of the character and determines whether the character display position matches the position specified by the position specifying unit 44. If they match, the application processing unit 54 generates character image data for the terminal device 100 so as to produce the effect of the character moving from the output device 20 to the terminal device 100.
The control unit 50 holds a table that associates the identification number of each touch sensor 120 on the terminal device 100 with its position on the terminal device 100. The control unit 50 also holds a table that associates the identification number of each display panel 110 on the terminal device 100 with its position on the terminal device 100, and a table that specifies the positional relationship between the touch sensors 120 and the display panels 110.
From the detection values of the touch sensors 120, the application processing unit 54 identifies the touch sensors 120 that have output detection values indicating contact with the screen of the output device 20, that is, the touch sensors 120 that have newly detected contact beyond the state in which the terminal device 100 is held in the user's fingers. This identification may instead be performed by the position specifying unit 44, in which case the application processing unit 54 acquires from the position specifying unit 44 the identification numbers of the touch sensors 120 that touched the output device 20.
Using the table that specifies the positional relationship between the touch sensors 120 and the display panels 110, the application processing unit 54 identifies the display panel 110 located opposite the contact position. In FIG. 11(b), this display panel 110 is at the center position of the terminal device 100. Having identified it, the application processing unit 54 generates character image data for the terminal device 100 such that the center of the character image falls on this display panel 110. On the display unit 112 of the terminal device 100, the character image is divided across multiple display panels 110; the application processing unit 54 generates the image data for the whole character by associating the identification number of each display panel 110 with the image data to be displayed on that panel. The transmitting unit 32 transmits the generated image data to the terminal device 100, and the control unit 200 generates the image to be displayed on each display panel 110 based on the specified display panel identification numbers and the image data. FIG. 11(c) shows the character displayed on the terminal device 100. At the same time as the terminal device 100 displays the character, the output device 20 ends its display of the character.
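As an illustration only (not the embodiment), finding the panel diametrically opposite the contact point could be sketched as below; the unit-vector position tables and the use of the contact centroid are assumptions for the example.

```python
import numpy as np

def panel_opposite_contact(contact_sensor_ids, sensor_positions, panel_positions):
    """Illustrative lookup of the display panel opposite the screen contact.

    contact_sensor_ids: ids of touch sensors that newly detected contact with
                        the output device's screen
    sensor_positions:   dict id -> unit vector from the sphere center to the sensor
    panel_positions:    dict id -> unit vector from the sphere center to the panel
    Returns the id of the panel closest to the point diametrically opposite the
    centroid of the contact region; the character image is then centered on it.
    """
    contact_dir = np.mean([sensor_positions[i] for i in contact_sensor_ids], axis=0)
    contact_dir /= np.linalg.norm(contact_dir)
    opposite = -contact_dir
    return max(panel_positions,
               key=lambda pid: float(np.dot(panel_positions[pid], opposite)))
```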
If, with the character displayed as in FIG. 11(c), the terminal device 100 is pressed against the screen of the output device 20 again, the application processing unit 54 may return the character to the screen position against which the device was pressed. The position on the screen of the output device 20 where the terminal device 100 is pressed is specified by the position specifying unit 44. This enables an application in which the user moves the character to the terminal device 100 and then returns it to the output device 20. Since the application processing unit 54 can stop displaying the character on the terminal device 100 and display it at an arbitrary position on the screen of the output device 20, it is possible to create more interactive games in which, for example, the character recovers stamina while it is on the terminal device 100 and is then returned to the game screen.
FIGS. 11(a) to 11(c) show an example in which the display of the character is switched from the screen of the output device 20 to the display unit 112 of the terminal device 100, but other display objects corresponding to the contact position on the screen may be displayed on the display unit 112. For example, if the terminal device 100 is brought into contact with the screen of the output device 20 while an application is running, a help screen related to the content displayed at the contact position on the output device 20 may be displayed on the display unit 112.
As already described, the position specifying unit 44 can derive the distance between the terminal device 100 and the screen of the output device 20. FIG. 11(b) shows the terminal device 100 being pressed against the screen of the output device 20; in the process of bringing the terminal device 100 toward the screen, the position specifying unit 44 may derive the distance between the terminal device 100 and the screen and report it to the application processing unit 54, and when the device comes closer than a predetermined distance, the application processing unit 54 may notify the user that the character to be displayed on the terminal device 100 has been selected. This notification may be made, for example, by generating an image on the display unit 112 of the terminal device 100 in which the character appears to be sucked in, or by generating an image on the screen of the output device 20 in which the character appears to be sucked into the terminal device 100. Such a notification lets the user recognize which character will be displayed on the display unit 112 of the terminal device 100, and the complete character may then be displayed on the display unit 112 at the moment the device is pressed against the screen.
(Usage example 3)
In usage example 3, an object displayed on the terminal device 100 is controlled in accordance with the user's operations. The following shows an example of controlling the movement of the character shown in FIG. 11(c) and described in usage example 2.
FIG. 12(a) shows the terminal device 100 pinched between the user's fingers. When the device is held this way, a character displayed on the part of the display unit 112 facing the palm cannot be seen by the user. The application processing unit 54 therefore identifies from the detection values of the touch sensors 120 how the terminal device 100 is being pinched, specifically whether it is held in the right hand or the left hand, and estimates the position of the palm.
As shown in FIG. 10, when the terminal device 100 is gripped, the palm also touches its surface, so the position of the palm can be specified from the detection values of the touch sensors 120. In contrast, when the terminal device 100 is pinched between fingers as shown in FIG. 12(a), the palm does not touch the surface, so the palm position cannot be specified directly from the detection values of the touch sensors 120. The application processing unit 54 therefore determines from the detection values of the touch sensors 120 whether the hand pinching the device is the right hand or the left hand.
FIG. 12(b) is an explanatory diagram showing the contact state detected by the touch sensors 120 when the terminal device 100 is pinched with five fingertips. In this case, five contact regions 304a to 304e are detected. The contact region 304e drawn with a dotted line is located on the underside of the terminal device 100 in the state shown in FIG. 12(b). The application processing unit 54 determines whether the hand is the right or left hand by detecting the smallest contact region 304d. When the terminal device 100 is pinched with five fingers, the contact region of the little finger is the smallest, so the application processing unit 54 determines that the finger of contact region 304d is the little finger. Once the little finger is identified, the ring finger is the finger closest to it, so it can be determined that the finger of contact region 304c is the ring finger, the finger of contact region 304b is the middle finger, the finger of contact region 304a is the index finger, and the finger of contact region 304e is the thumb.
The application processing unit 54 may instead identify the fingers of the contact regions by first identifying the thumb's contact region. When the terminal device 100 is pinched with five fingers, the contact region of the thumb is the largest, so the application processing unit 54 determines that the finger of contact region 304e is the thumb. Once the thumb is identified, the index finger is the finger closest to it, so the finger of contact region 304a can be determined to be the index finger, and the contact regions of the middle finger, ring finger, and little finger can then be determined in order.
The application processing unit 54 may also identify the thumb and the little finger from the areas of the contact regions 304, using the fact that the thumb's contact region is the largest and the little finger's is the smallest.
The application processing unit 54 may also identify the fingers from the spacing between the contact regions 304. When the terminal device 100 is relatively large compared to the hand, the thumb and the other four fingers tend to be arranged as shown in FIG. 12(b): the four fingers other than the thumb are relatively close together, while the thumb is positioned relatively far from the others. For each contact region 304, the application processing unit 54 computes its distances to the other four contact regions 304 and takes the smallest of these four distances; this yields a minimum spacing for each of the five fingers. The application processing unit 54 then compares these minimum spacings, derives the ratio of the largest minimum spacing to the second largest, and when (largest minimum spacing) / (second largest minimum spacing) is 2 or more, determines that the terminal device 100 is relatively large compared to the hand and executes the following finger identification process.
The application processing unit 54 identifies the contact region 304 with the largest minimum spacing as the thumb's contact region; in FIG. 12(b), contact region 304e is identified as the thumb. Once the thumb is identified, the index finger is identified next. When the terminal device 100 is relatively large compared to the hand, the spacing between the thumb and the index finger is narrower than the spacing between the thumb and the little finger, so the application processing unit 54 identifies the contact region 304a closest to contact region 304e as the index finger's contact region. The application processing unit 54 then identifies the contact regions of the middle finger, ring finger, and little finger.
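The spacing-based identification above could be sketched as follows, purely for illustration; the region data structure, the threshold of 2, and the simplifying assumption that the remaining fingers are ordered by distance from the thumb are not taken from the embodiment.

```python
import numpy as np
from itertools import combinations

def identify_fingers_by_spacing(regions):
    """Illustrative spacing-based finger identification.

    regions: dict mapping region id (e.g. '304a') to {'center': coords, 'area': float}
    Returns a dict mapping finger names to region ids, or None when the spacing
    test fails and the area-based method should be used instead.
    """
    ids = list(regions)
    dist = {frozenset((a, b)): float(np.linalg.norm(
                np.subtract(regions[a]['center'], regions[b]['center'])))
            for a, b in combinations(ids, 2)}

    def min_spacing(rid):
        return min(dist[frozenset((rid, other))] for other in ids if other != rid)

    spacings = sorted(((min_spacing(r), r) for r in ids), reverse=True)
    if spacings[0][0] / spacings[1][0] < 2:
        return None  # device is relatively small compared to the hand; use areas

    thumb = spacings[0][1]
    # The region nearest the thumb is taken as the index finger; the rest are
    # ordered by distance from the thumb as a simplification.
    others = sorted((r for r in ids if r != thumb),
                    key=lambda r: dist[frozenset((thumb, r))])
    return dict(zip(['thumb', 'index', 'middle', 'ring', 'little'],
                    [thumb] + others))
```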
If the terminal device 100 is relatively small compared to the hand, that is, if (largest minimum spacing) / (second largest minimum spacing) is less than 2, the user holds the terminal device 100 with the fingertips rather than the pads of the fingers. In that case, the finger identification process using the areas of the contact regions 304 described above is effective.
Having identified the fingers of the contact regions 304 as above, the application processing unit 54 estimates the position of the palm. For example, it connects the lengthwise tips of contact regions 304a to 304d, draws virtual lines that do not intersect each other (312a and 312b in the figure), and estimates the direction of the palm from how these lines curve. If an arc can be drawn as seen from the direction 306 side, the palm is located on the direction 306 side; if an arc can be drawn as seen from the direction 308 side, the palm is located on the direction 308 side. In this example, an arc as seen from the direction 306 side can be drawn, so the right palm is found to be located on the direction 306 side. The application processing unit 54 therefore determines that the terminal device 100 is pinched between fingers as shown in FIG. 12(a), decides to display the character in the display region 310, and displays the character there.
The application processing unit 54 monitors the arrangement of the contact regions 304 and sets the display region 310 in real time. Even if the user changes where the terminal device 100 is held, the display region 310 can thus be set at an appropriate position and the character displayed there.
The application processing unit 54 does not set the display region 310 within the contact regions 304, nor does it set the display region 310 in the direction in which the palm is estimated to be. In this way, the application processing unit 54 avoids placing the display region 310 where the user cannot see it; even when the user changes how the terminal device 100 is held, it can dynamically set the display region 310 at an appropriate position by identifying the fingers of the contact regions 304 and the direction in which the palm lies.
When the user traces the character on the display panels 110 with a finger, the character is displayed so as to move in the traced direction. The application processing unit 54 detects a "tracing operation" from the detection values of the touch sensors 120; a tracing operation is detected when a contact region on the surface of the terminal device 100 moves in one direction. When the application processing unit 54 determines from the detection values of the touch sensors 120 that a contact region is moving, it generates image data that moves the character image in that direction, and the transmitting unit 32 transmits it. In the terminal device 100, the control unit 200 displays the character image on the display unit 112 based on the image data, so the display unit 112 shows the character moving in the traced direction.
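The tracing-operation detection above might be sketched as follows; this is an assumption-laden illustration (the thresholds and the idea of tracking the contact-region centroid are not specified in the embodiment).

```python
import numpy as np

class TraceDetector:
    """Illustrative 'tracing operation' detector: a trace is reported when the
    centroid of a contact region keeps moving in roughly one direction for at
    least a minimum distance. Thresholds are assumptions."""

    def __init__(self, min_distance=0.02, min_frames=4):
        self.min_distance = min_distance  # required travel along the surface
        self.min_frames = min_frames
        self.history = []                 # recent centroid positions

    def update(self, contact_centroid):
        """contact_centroid: coordinates of the tracked contact region, or None
        when the finger is lifted. Returns a unit direction vector when a trace
        is detected, otherwise None."""
        if contact_centroid is None:
            self.history.clear()
            return None
        self.history.append(np.asarray(contact_centroid, dtype=float))
        if len(self.history) < self.min_frames:
            return None
        displacement = self.history[-1] - self.history[0]
        distance = np.linalg.norm(displacement)
        if distance < self.min_distance:
            return None
        self.history = self.history[-1:]   # restart tracking after a report
        return displacement / distance      # direction the character moves in
```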
Usage example 2 showed an example in which a help screen is displayed on the display unit 112. When the user traces a finger over the display panels 110 on which the help screen is displayed, the next page of the help screen may be displayed. The page may also be turned when the user twists the terminal device 100 by a predetermined angle in the horizontal plane; the twist angle of the terminal device 100 is derived from the detection values of the angular velocity sensor 164.
(Usage example 4)
In usage example 4, new information or objects are provided to the user in response to the user's input on the terminal device 100. When the user presses the surface of the terminal device 100, information corresponding to that surface position is displayed on the display unit 112. For example, when the earth is displayed on the terminal device 100 and the user presses its surface, news, World Heritage information, or the like corresponding to that position is displayed.
The application processing unit 54 generates the image data of the earth. Because the application processing unit 54 knows the positional relationship between the display panels 110 and the touch sensors 120, when it identifies a touch sensor 120 that has output a detection value indicating contact, it can specify the latitude and longitude on the virtual earth at that touch sensor's position. By identifying the touch sensor 120 that has output a detection value indicating a new contact, the application processing unit 54 specifies the latitude and longitude designated by the user. The application processing unit 54 generates information related to the region at that latitude and longitude, the transmitting unit 32 transmits it to the terminal device 100, and the control unit 200 displays the information on the display unit 112.
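As a rough sketch of how a pressed sensor position could be converted into a latitude and longitude on the displayed globe (the rotation-matrix representation of the globe's current orientation and the axis conventions are assumptions, not taken from the embodiment):

```python
import numpy as np

def sensor_to_lat_lon(sensor_dir, globe_rotation):
    """Illustrative conversion from a pressed touch sensor to latitude/longitude.

    sensor_dir:     unit vector from the sphere center to the touch sensor,
                    in the terminal device's body coordinates
    globe_rotation: assumed 3x3 rotation matrix giving the current orientation
                    of the displayed earth relative to the device body
    Returns (latitude, longitude) in degrees.
    """
    # Express the pressed direction in the globe's own coordinate frame.
    v = globe_rotation.T @ np.asarray(sensor_dir, dtype=float)
    v /= np.linalg.norm(v)
    lat = np.degrees(np.arcsin(v[2]))           # z axis assumed to be the north pole
    lon = np.degrees(np.arctan2(v[1], v[0]))    # x axis assumed to be the prime meridian
    return lat, lon
```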
When the user rotates the terminal device 100, the displayed earth rotates along with it; when the rotation stops, image processing is performed so that the earth keeps turning by inertia, like a globe, according to the preceding rotation.
The application processing unit 54 derives the rotation speed of the terminal device 100 from the detection values of the angular velocity sensor 164; the rotation speed is derived at arbitrary timings during the rotation, and the application processing unit 54 retains the derived value. When the application processing unit 54 detects from the detection values of the acceleration sensor 162 that the terminal device 100 has stopped rotating, it rotates the displayed earth image in the previous direction of rotation while gradually reducing the rotation speed. Based on the rotation speed derived while the terminal device 100 was rotating, the application processing unit 54 gradually slows the rotation of the displayed earth image, as if a globe were spinning down under inertia. The earth may also keep spinning by inertia when, for example, the user lets go of the terminal device 100 and places it on a desk.
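A minimal sketch of such an inertial spin-down, for illustration only (the friction factor, frame interval, and cutoff are assumptions):

```python
def inertial_spin_frames(initial_speed, friction=0.98, dt=1 / 60, cutoff=1.0):
    """Illustrative inertial spin-down of the displayed earth image.

    initial_speed: rotation speed [deg/s] retained while the device was rotating
    friction:      assumed per-frame decay factor
    dt:            assumed display frame interval [s]
    cutoff:        assumed speed [deg/s] below which the globe is considered stopped
    Yields the incremental rotation angle [deg] to apply on each display frame.
    """
    speed = initial_speed
    while abs(speed) > cutoff:
        yield speed * dt
        speed *= friction
```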
Conversely, the application processing unit 54 may derive the rotation speed when the terminal device 100 is rotated and control the displayed image so that it rotates in the direction opposite to the terminal device 100, so that the same image is seen from the same direction regardless of the rotation of the terminal device 100.
(Usage example 5)
In usage example 5, thumbnails of content such as music files and movie files are arranged on the display unit 112. The content is held in the information processing apparatus 10, and when the user selects a thumbnail, the content is played back on the output device 20. For example, when the user is holding the terminal device 100 as shown in FIG. 12(a), the application processing unit 54 may divide the display by genre along the longitude (vertical direction) and by era along the latitude (horizontal direction) so that the user can select content intuitively. The thumbnails may be displayed, for example, when the user shakes the terminal device 100: when the application processing unit 54 detects from the detection values of the acceleration sensor 162 that the terminal device 100 has been shaken, it generates image data in which the thumbnail images of the content are classified by genre and era. This kind of presentation can also be used to display, for example, access rankings of articles on the Internet; in that case, longitude is divided by genre and latitude by rank so that the user can grasp the ranking intuitively.
When the user selects a displayed thumbnail and decides to play it, the content is played back on the output device 20. The selection operation is, for example, tapping the thumbnail, and the decision operation is performed by shaking the terminal device 100. The application processing unit 54 detects the tapped position from the detection values of the touch sensors 120, identifies the thumbnail displayed at that position, and when it detects that the terminal device 100 has been shaken, reads the content from the storage device and plays it back.
(Usage example 6)
In usage example 6, while a game is being executed, an image different from the game image displayed on the screen of the output device 20 is displayed on the display unit 112 of the terminal device 100. When the game image is a view from the character's viewpoint, the display unit 112 shows an image from a different viewpoint. For example, the application processing unit 54 detects the movement of the terminal device 100 and, taking the character's viewpoint as the reference, displays on the display unit 112 an image from a viewpoint above the character when the terminal device 100 is lifted upward, and an image from a viewpoint to the left of the character when it is moved to the left. Various images can be presented on the display unit 112 depending on the game, and by moving the terminal device 100 the user may be able to obtain information about the game world that is not shown on the screen of the output device 20, such as searching for items or getting a bird's-eye view of the game space.
FIG. 13(a) shows a game image displayed on the screen of the output device 20; this game image is a view from the character's viewpoint. The application processing unit 54 places a virtual camera behind the character in the three-dimensional game space, generates the game image based on operation data from the user, and outputs it from the output device 20. The application processing unit 54 also generates image data matched to the configuration of the display unit 112 of the terminal device 100, and the transmitting unit 32 transmits it to the terminal device 100. In the terminal device 100, when the control unit 200 receives the image data, it displays the game image on the display unit 112, so the display unit 112 of the terminal device 100 shows the same game image as the output device 20.
 FIG. 13B shows the game image displayed on the terminal device 100 when the user moves the terminal device 100 to the left. In the information processing apparatus 10, the posture specifying unit 42 specifies the movement direction and movement amount of the terminal device 100 from the detection values of the motion sensor 160. When the application processing unit 54 receives the movement direction and movement amount of the terminal device 100 from the posture specifying unit 42, it converts them into a movement direction and movement amount of the virtual camera in the three-dimensional game space. The angle of view of the virtual camera is set so that the character remains in the frame. The application processing unit 54 generates the game image captured by the moved virtual camera and produces image data matched to the display unit 112 of the terminal device 100. The transmission unit 32 transmits the image data to the terminal device 100, where the control unit 200 receives it and displays it on the display unit 112. In this way the user can view a game image on the terminal device 100 that differs from the image displayed on the output device 20; in the example of FIG. 13B, the user can discover that a dog is hiding behind the tree.
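One plausible way to turn the detected movement of the terminal device 100 into a virtual-camera offset is sketched below. The scale factor, the vector type and the fixed offset that keeps the camera behind the character are assumptions, not values from the patent.

```cpp
// A minimal sketch (not the patent's implementation) of converting the
// terminal's detected movement into a virtual-camera position in game space.
#include <cstdio>

struct Vec3 { double x, y, z; };

// Device movement (metres, from the motion sensor) is scaled into game-space
// units and applied relative to the character position.
Vec3 cameraPosition(const Vec3& characterPos, const Vec3& deviceMove,
                    double unitsPerMetre = 10.0) {
    return { characterPos.x + deviceMove.x * unitsPerMetre,
             characterPos.y + deviceMove.y * unitsPerMetre,
             characterPos.z + deviceMove.z * unitsPerMetre - 3.0 };  // stay behind the character
}

int main() {
    Vec3 character{0, 1.5, 0};
    Vec3 movedLeft{-0.2, 0, 0};  // user moved the terminal 20 cm to the left
    Vec3 cam = cameraPosition(character, movedLeft);
    std::printf("camera at (%.1f, %.1f, %.1f)\n", cam.x, cam.y, cam.z);
    return 0;
}
```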
 When the user presses the terminal device 100 against the screen of the output device 20, information not shown in the game image, such as hints for progressing through the game, may be displayed on the display unit 112. Instead of a specific hint, the entire display unit 112 may light up to tell the user that this is a moment worth paying attention to. Hints may also be presented according to the progress of the game; for example, when a chance arrives, a hint or an attention-drawing flash may be shown on the display unit 112. A hint may also be presented by outputting sound from a speaker. In addition, a replay image of the game may be displayed on the display unit 112, and while the replay image is displayed, the viewpoint may be changeable, for example by changing the attitude of the terminal device 100.
(Usage example 7)
 In Usage Example 7, the movement of the terminal device 100 itself is displayed on the screen of the output device 20. For example, when the terminal device 100 is thrown or bounced, the application processing unit 54 reproduces the movement on the screen of the output device 20 based on the detection values of the motion sensor 160. If, for instance, the terminal device 100 is treated as a baseball and the user actually throws it, its rotation and trajectory are shown on the screen of the output device 20, so that the system can be used for pitching practice.
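As an illustration of how such a throw might be replayed, the sketch below integrates pre-release acceleration samples into a launch velocity and then steps a simple ballistic path that could drive the on-screen ball. The sample rate, the sample data and the 10 Hz replay interval are invented for illustration; the patent does not specify this procedure.

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };

// Integrate device acceleration (gravity already removed) over the sampling
// interval to estimate the velocity at the moment of release.
Vec3 releaseVelocity(const std::vector<Vec3>& accel, double dt) {
    Vec3 v{0, 0, 0};
    for (const auto& a : accel) { v.x += a.x * dt; v.y += a.y * dt; v.z += a.z * dt; }
    return v;
}

int main() {
    std::vector<Vec3> samples(50, Vec3{6.0, 10.0, 0.0});  // 0.5 s of a roughly constant push
    Vec3 v = releaseVelocity(samples, 0.01);

    // Replay the ballistic path at 10 Hz; each point would drive the ball on the output device.
    Vec3 p{0, 1.8, 0};
    for (int i = 0; i < 10; ++i) {
        p.x += v.x * 0.1; p.y += v.y * 0.1; p.z += v.z * 0.1;
        v.y -= 9.8 * 0.1;  // gravity
        std::printf("t=%.1fs pos=(%.2f, %.2f, %.2f)\n", (i + 1) * 0.1, p.x, p.y, p.z);
    }
    return 0;
}
```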
(Usage example 8)
 In Usage Example 8, the fact that a plurality of terminal devices 100 have been brought into contact with each other is reflected in the processing of an application. In a game, for example, processing that conveys that the terminal devices 100 have touched is performed, such as changing a character's characteristics or merging characters. Contact between terminal devices 100 is determined from the normal vectors of their respective contact surfaces pointing in opposite directions. Each normal vector is determined from the attitude of the terminal device 100 and its contact area.
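The opposing-normal test can be expressed compactly. The sketch below checks whether two contact-surface normals, already rotated into a shared frame from each device's attitude, point at each other within a tolerance; the tolerance angle and the names are assumptions.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double length(const Vec3& v) { return std::sqrt(dot(v, v)); }

const double kPi = 3.14159265358979323846;

// True when the two outward normals are within toleranceDeg of pointing in
// exactly opposite directions (cosine of the angle close to -1).
bool normalsOppose(const Vec3& n1, const Vec3& n2, double toleranceDeg = 15.0) {
    double c = dot(n1, n2) / (length(n1) * length(n2));
    return c < -std::cos(toleranceDeg * kPi / 180.0);
}

int main() {
    Vec3 a{0, 0, 1}, b{0.05, 0, -1};  // nearly opposite normals
    std::printf("contact: %s\n", normalsOppose(a, b) ? "yes" : "no");
    return 0;
}
```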
 The present invention has been described above based on an embodiment. This embodiment is illustrative, and it will be understood by those skilled in the art that various modifications can be made to the combinations of its components and processing steps, and that such modifications also fall within the scope of the present invention.
 The terminal device 100 may be configured to start up when it receives a start signal from the information processing apparatus 10 and to sleep when it receives an end signal. In this case, the terminal device 100 can operate only on the condition that the information processing apparatus 10 is operating; it may therefore be configured to sleep automatically when, for example, no signal can be received from the information processing apparatus 10 for a predetermined time. The terminal device 100 may also be configured to monitor the detection values of the contact detection unit 122 or the motion sensor 160 while in the sleep state and to start up autonomously when the terminal device 100 enters a predetermined state. The predetermined state may be, for example, a state in which almost the entire surface of the terminal device 100 is covered or a state in which it is moved at very high speed, and is preferably a state that the device does not normally assume after startup. The terminal device 100 may also be configured to go to sleep when it enters this predetermined state during operation.
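A possible shape for this wake/sleep behaviour is sketched below as a small state function. The 30-second signal timeout, the 90% coverage threshold and the acceleration threshold are invented for illustration and are not taken from the patent.

```cpp
#include <cstdio>

enum class Power { Awake, Asleep };

struct SensorSnapshot {
    double coveredRatio;       // fraction of the touch surface in contact (0..1)
    double accelMagnitude;     // m/s^2
    double secondsSinceSignal; // time since the last signal from the information processing device
};

Power nextState(Power current, const SensorSnapshot& s) {
    // "Unusual" = a state the device does not normally assume after startup.
    const bool unusual = s.coveredRatio > 0.9 || s.accelMagnitude > 40.0;
    if (current == Power::Awake) {
        // Sleep on signal timeout, or when deliberately put into the unusual state.
        if (s.secondsSinceSignal > 30.0 || unusual) return Power::Asleep;
    } else {
        // While asleep, only the contact detector and motion sensor are watched.
        if (unusual) return Power::Awake;
    }
    return current;
}

int main() {
    Power p = Power::Awake;
    p = nextState(p, {0.1, 9.8, 45.0});   // no signal for 45 s -> sleep
    p = nextState(p, {0.95, 9.8, 60.0});  // almost fully covered -> wake
    std::printf("state: %s\n", p == Power::Awake ? "awake" : "asleep");
    return 0;
}
```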
 As shown in FIG. 10, when the terminal device 100 is being gripped by the user, the application processing unit 54 may specify the orientation of the palm from the attitude of the terminal device 100. By specifying the orientation of the palm, it can be determined whether the terminal device 100 is being gripped with the palm facing upward or with the palm facing downward. In the key setting process described with reference to FIG. 10, the key setting unit 52 performed the processing for specifying the contact area; in this example, the application processing unit 54 performs equivalent processing. When the posture specifying unit 42 specifies the current posture from the difference between the detection values of the motion sensor 160 received from the terminal device 100 and the reference values, the application processing unit 54 specifies, from that current posture and the position of the touch sensor 120 that detected the palm contact, which way the palm is facing with respect to the direction of gravity. The application processing unit 54 may use the palm orientation as input data to be reflected in the processing of the application.
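The palm-orientation check described above could look roughly like this: the outward normal of the touched area, expressed in the device frame, is rotated into the world frame using the current attitude and compared with the direction of gravity. The quaternion rotation is standard mathematics; the thresholds and names are assumptions.

```cpp
#include <cstdio>

struct Vec3 { double x, y, z; };
struct Quat { double w, x, y, z; };  // unit quaternion: device-frame -> world-frame rotation

Vec3 rotate(const Quat& q, const Vec3& v) {
    // v' = 2(u.v)u + (s^2 - u.u)v + 2s(u x v), for a unit quaternion (s, u).
    Vec3 u{q.x, q.y, q.z};
    double s = q.w;
    double uv = u.x * v.x + u.y * v.y + u.z * v.z;
    double uu = u.x * u.x + u.y * u.y + u.z * u.z;
    Vec3 cr{u.y * v.z - u.z * v.y, u.z * v.x - u.x * v.z, u.x * v.y - u.y * v.x};
    return { 2 * uv * u.x + (s * s - uu) * v.x + 2 * s * cr.x,
             2 * uv * u.y + (s * s - uu) * v.y + 2 * s * cr.y,
             2 * uv * u.z + (s * s - uu) * v.z + 2 * s * cr.z };
}

// contactNormalDevice: outward normal of the touched area, in the device frame.
// attitude: current device attitude from the motion sensor.
// Returns +1 if the palm faces up, -1 if it faces down, 0 if ambiguous.
int palmDirection(const Vec3& contactNormalDevice, const Quat& attitude) {
    Vec3 n = rotate(attitude, contactNormalDevice);
    if (n.z < -0.5) return +1;  // contact from below -> palm facing up
    if (n.z > 0.5) return -1;   // contact from above -> palm facing down
    return 0;
}

int main() {
    Quat identity{1, 0, 0, 0};
    std::printf("%d\n", palmDirection({0, 0, -1}, identity));  // prints 1: palm up
    return 0;
}
```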
 In the embodiment above, the application processing unit 54 in the information processing apparatus 10 generates the image data to be displayed on the display unit 112; however, the control unit 200 in the terminal device 100 may have the same function as the application processing unit 54 and generate the image data itself.
DESCRIPTION OF SYMBOLS 1 ... information processing system, 10 ... information processing apparatus, 20 ... output device, 30 ... communication unit, 32 ... transmission unit, 34 ... reception unit, 40 ... terminal information processing unit, 42 ... posture specifying unit, 44 ... position specifying unit, 50 ... control unit, 52 ... key setting unit, 54 ... application processing unit, 100 ... terminal device, 110 ... display panel, 112 ... display unit, 120 ... touch sensor, 122 ... contact detection unit, 130 ... protective layer, 132 ... support member, 134 ... inner shell, 140 ... inner core, 142 ... projector, 150 ... communication unit, 152 ... transmission unit, 154 ... reception unit, 160 ... motion sensor, 162 ... acceleration sensor, 164 ... angular velocity sensor, 166 ... geomagnetic sensor, 200 ... control unit.
 The present invention can be applied to the field of user interfaces.

Claims (7)

  1.  A portable terminal device having a curved or substantially curved surface shape, comprising:
     a detection unit for detecting contact with the surface of the terminal device;
     a display unit for displaying an image;
     a transmission unit for transmitting detection values obtained by the detection unit;
     a reception unit for receiving image data for generating an image; and
     a control unit for generating, from the received image data, an image to be displayed on the display unit.
  2.  The terminal device according to claim 1, wherein the detection unit is provided so as to be able to detect contact on the continuous surface of the terminal device.
  3.  The terminal device according to claim 1 or 2, wherein the display unit is provided so that an image can be displayed from the surface of the terminal device.
  4.  The terminal device according to any one of claims 1 to 3, wherein the display unit is configured by combining a plurality of display panels.
  5.  The terminal device according to claim 4, wherein the display panel is formed in a curved shape.
  6.  The terminal device according to any one of claims 1 to 5, wherein the terminal device has a spherical or egg-shaped outer form.
  7.  An information processing system comprising a terminal device and an information processing apparatus, wherein
     the terminal device includes:
     a detection unit for detecting contact with the surface of the terminal device;
     a display unit for displaying an image;
     a transmission unit for transmitting detection values obtained by the detection unit to the information processing apparatus;
     a reception unit for receiving, from the information processing apparatus, image data for generating an image; and
     a control unit for generating, from the received image data, an image to be displayed on the display unit, and
     the information processing apparatus includes:
     a reception unit for receiving the detection values from the terminal device;
     an application processing unit for generating image data based on the detection values; and
     a transmission unit for transmitting the image data to the terminal device.
PCT/JP2011/004538 2010-10-22 2011-08-10 Terminal device and information processing system WO2012053141A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-237899 2010-10-22
JP2010237899A JP5769947B2 (en) 2010-10-22 2010-10-22 Terminal device and information processing system

Publications (1)

Publication Number Publication Date
WO2012053141A1 true WO2012053141A1 (en) 2012-04-26

Family

ID=45974873

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/004538 WO2012053141A1 (en) 2010-10-22 2011-08-10 Terminal device and information processing system

Country Status (2)

Country Link
JP (1) JP5769947B2 (en)
WO (1) WO2012053141A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013232044A (en) * 2012-04-27 2013-11-14 Toshiba Corp Electronic apparatus, control method, and program
WO2014199154A1 (en) * 2013-06-11 2014-12-18 Sony Computer Entertainment Europe Limited Head-mountable apparatus and systems

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014026318A (en) * 2012-07-24 2014-02-06 Ricoh Co Ltd Power management device, power management system, power management method, and program
JP6417673B2 (en) * 2013-05-08 2018-11-07 株式会社デンソー Vehicle operation detection system, vehicle operation detection unit, and vehicle operation detection device
JP6715562B2 (en) * 2014-03-27 2020-07-01 任天堂株式会社 Information processing system, information processing program, information processing method, information processing terminal
JP6402348B2 (en) * 2015-03-30 2018-10-10 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME CONTROL METHOD, AND PROGRAM
JP2017116893A (en) * 2015-12-26 2017-06-29 株式会社村田製作所 Stereo image display device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009094091A1 (en) * 2008-01-25 2009-07-30 Microsoft Corporation Projection of graphical objects on interactive irregular displays
WO2010067537A1 (en) * 2008-12-08 2010-06-17 シャープ株式会社 Manual control action input device and computer program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001154592A (en) * 1999-09-13 2001-06-08 Minolta Co Ltd Display device
US9218116B2 (en) * 2008-07-25 2015-12-22 Hrvoje Benko Touch interaction with a curved display
JP2010244772A (en) * 2009-04-03 2010-10-28 Sony Corp Capacitance type touch member and method for producing the same, and capacitance type touch detection device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009094091A1 (en) * 2008-01-25 2009-07-30 Microsoft Corporation Projection of graphical objects on interactive irregular displays
WO2010067537A1 (en) * 2008-12-08 2010-06-17 シャープ株式会社 Manual control action input device and computer program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013232044A (en) * 2012-04-27 2013-11-14 Toshiba Corp Electronic apparatus, control method, and program
US9001063B2 (en) 2012-04-27 2015-04-07 Kabushiki Kaisha Toshiba Electronic apparatus, touch input control method, and storage medium
WO2014199154A1 (en) * 2013-06-11 2014-12-18 Sony Computer Entertainment Europe Limited Head-mountable apparatus and systems
US10198866B2 (en) 2013-06-11 2019-02-05 Sony Interactive Entertainment Europe Limited Head-mountable apparatus and systems

Also Published As

Publication number Publication date
JP5769947B2 (en) 2015-08-26
JP2012093800A (en) 2012-05-17

Similar Documents

Publication Publication Date Title
JP7277545B2 (en) Rendering virtual hand poses based on detected hand input
JP5769947B2 (en) Terminal device and information processing system
JP6093473B1 (en) Information processing method and program for causing computer to execute information processing method
JP6158406B2 (en) System for enabling video capture of interactive applications on mobile devices
CN107368183B (en) Glove for providing input
US10317997B2 (en) Selection of optimally positioned sensors in a glove interface object
JP2019522849A (en) Direction interface object
KR101576979B1 (en) Electric apparatus which determines user input using magnetic field sensor
JP2021501496A (en) Robot utilities and interface devices
WO2018196552A1 (en) Method and apparatus for hand-type display for use in virtual reality scene
JP2012507102A (en) Controller with spherical end with configurable mode
JP2018194889A (en) Information processing method, computer and program
CA2843670A1 (en) Video-game console for allied touchscreen devices
JP7115695B2 (en) animation production system
JP6893532B2 (en) Information processing methods, computers and programs
JP7356827B2 (en) Program, information processing method, and information processing device
JP7071134B2 (en) Information processing device, operation control program and operation control method
JP6728111B2 (en) Method of providing virtual space, method of providing virtual experience, program, and recording medium
JP6444345B2 (en) Method and apparatus for supporting input in virtual space, and program for causing computer to execute the method
JP6189495B1 (en) Method for providing virtual space, method for providing virtual experience, program, and recording medium
JP6189496B1 (en) Method for providing virtual space, method for providing virtual experience, program, and recording medium
CN116149590A (en) Display control method, device, head-mounted equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11833988

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11833988

Country of ref document: EP

Kind code of ref document: A1