TW201220109A - Controller device and information processing device - Google Patents

Controller device and information processing device

Info

Publication number
TW201220109A
Authority
TW
Taiwan
Prior art keywords
game
operation
device
terminal device
data
Prior art date
Application number
TW100126152A
Other languages
Chinese (zh)
Other versions
TWI442963B (en)
Inventor
Ken-Ichirou Ashida
Yositomo Gotou
Takanori Okamura
Junji Takamoto
Masato Ibuki
Shinji Yamamoto
Hitoshi Tsuchiya
Fumiyoshi Suetake
Akiko Suga
Naoya Yamamoto
Daisuke Kumazaki
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2010245299A priority Critical patent/JP4798809B1/en
Priority to JP2010245298 priority
Priority to JP2011092506 priority
Priority to JP2011092612A priority patent/JP6103677B2/en
Priority to JP2011102834A priority patent/JP5837325B2/en
Priority to JP2011103704A priority patent/JP6005907B2/en
Priority to JP2011103706A priority patent/JP6005908B2/en
Priority to JP2011103705 priority
Priority to JP2011118488A priority patent/JP5936315B2/en
Application filed by Nintendo Co Ltd filed Critical Nintendo Co Ltd
Publication of TW201220109A publication Critical patent/TW201220109A/en
Application granted granted Critical
Publication of TWI442963B publication Critical patent/TWI442963B/en


Abstract

A terminal device 7 is a controller device to be operated by a user. The terminal device 7 includes a generally plate-shaped housing 50, an LCD 51, and a projecting portion (an eaves portion 59). The LCD 51 is provided on the front side of the housing 50. The projecting portion projects at least at left and right positions on the back side of the housing 50, above the center of the housing 50. When the user holds the housing 50 on the left and right sides of the LCD 51, the projecting portion rests on the user's fingers, so the terminal device 7 can be held easily.

Description

VI. Description of the Invention:

TECHNICAL FIELD OF THE INVENTION

The present invention relates to an operating device that a player can hold and operate.

[Prior Art]

There have conventionally been operating devices used by a player while held in the hands. For example, the portable game device described in the specification of Japanese Patent No. 3703473 is of a folding type, and operation keys are provided on the lower housing. With this game device, the user can perform game operations with the operation keys provided on both sides of the screen while viewing the screen, and can easily perform game operations while holding the device.

[Problems to be Solved by the Invention]

In recent years, portable terminal devices (operating devices) with larger screens and the like, and hence larger bodies, have been increasing. When a device that is used while held in the hands becomes large, it may become difficult to hold.

Accordingly, it is an object of the present invention to provide an operating device that a user can hold easily.

(Means for Solving the Problems)

In order to solve the above problems, the present invention adopts the following configurations (1) to (18).

(1) An example of the present invention is an operating device to be operated by a user. The operating device includes a substantially plate-shaped housing, a display portion, and a projecting portion. The display portion is provided on the front side of the housing. The projecting portion is provided on the back side of the housing, above the center of the housing, and projects at least at left and right positions.
The "operation portion" mentioned below may be any operation device that the user can operate, such as a joystick (analog stick), a button (key), a touch panel, or a touch pad. The "positions on the left and right sides" mean positions to the left and to the right of the center of the housing in the left-right direction; the projecting portion may be provided at the left and right ends of the housing, or at positions closer to the center than the ends.

According to the configuration of (1) above, since the projecting portion is provided on the back side of the housing, the user holding the housing on the left and right sides of the display portion can catch the projecting portion on the fingers and thus hold the operating device easily. Furthermore, since the projecting portion is placed above the center of the housing, when the user holds the housing with the index fingers, middle fingers, or other fingers placed under the projecting portion, the palms can support the housing (see Figs. 10 and 11), so the device can be gripped firmly. The configuration of (1) thus provides an operating device that a user can hold easily.

(2) The operating device may further include a first operation portion and a second operation portion provided on the front side of the housing, on the left and right sides of the display portion, above the center of the housing. According to the configuration of (2) above, since the operation portions are provided beside the display portion, the user can operate them, for example with the thumbs, while holding the housing on the left and right sides of the display portion. Therefore, an operating device that is easy to hold and easy to operate can be provided.

(3) Another example of the present invention is an operating device to be operated by a user, including a substantially plate-shaped housing, a display portion, a first operation portion, a second operation portion, and a projecting portion. The display portion is provided on the front side of the housing. The first operation portion and the second operation portion are provided respectively on the left and right sides of the display portion. The projecting portion is provided on the back side of the housing at a position where, when the user grips the housing so that the thumbs can operate the first operation portion and the second operation portion, fingers other than the thumbs can be hooked on it.
According to the configuration of (3) above, when the user holds the housing on the left and right sides of the display portion, the projecting portion can be caught on fingers other than the thumbs, so the operating device can be held easily (see Figs. 10 and 11). In addition, since the operation portions are provided beside the display portion, the user can operate them while holding the housing on the left and right sides of the display portion, so good operability is obtained.

(4) The operating device may further include a third operation portion and a fourth operation portion provided on the upper surface of the projecting portion, at positions opposing the first operation portion and the second operation portion, respectively. An "opposing position" here means that, when the front surface of the housing is projected onto the back side, the area including one operation portion at least partially overlaps the projected area including the other. According to the configuration of (4) above, the user can operate the third operation portion and the fourth operation portion with, for example, the index fingers or middle fingers while gripping the operating device with the projecting portion (the eaves portion 59 in Figs. 10 and 11) caught on the fingers, so the device is easy to hold and more kinds of operation become possible.

(5) The third operation portion and the fourth operation portion may be provided on the upper surface of the housing. While gripping the housing on the left and right sides of the display portion, the user can operate these portions with, for example, the index fingers or middle fingers. This provides an operating device with still more available operations, and, together with the projecting portion, makes the operating device easier to hold.

(6) The projecting portion may have an eaves-like shape extending in the left-right direction.
According to the configuration of (6) above, the user can place the fingers supporting the projecting portion along it, which makes the operating device easier to hold. In addition, since the projecting portion is formed to extend in the left-right direction, the user can hook fingers other than the thumbs under it regardless of where along the sides the housing is gripped, so the operating device can be held firmly even when gripped away from the ends.

(7) A first locking hole into which an additional device can be locked may be provided on the lower surface of the projecting portion. According to the configuration of (7) above, the first locking hole can be used to connect the operating device and the additional device firmly. When the configuration of (6) above is combined with the configuration of (7), the first locking hole can be provided near the center of the operating device in the left-right direction, so the additional device can be connected stably while the left-right balance is kept even.

(8) A second locking hole into which the additional device can be locked may be provided on the lower surface of the housing. According to the configuration of (8) above, the operating device and the additional device are connected using the first locking hole and the second locking hole provided at different positions, so the connection becomes even more secure.

(9) A convex portion having a convex cross section may be provided on the back side of the housing, below the projecting portion and on the left and right sides. According to the configuration of (9) above, the user can hold the housing with fingers (for example, the ring fingers or little fingers) hooked on the convex portion, so the operating device can be gripped firmly.

(10) The projecting portion and the convex portion may be provided apart from each other.
According to the configuration of (10) above, the user can hold the operating device with the middle fingers, ring fingers, or the like hooked on the projecting portion without the convex portion obstructing those fingers, and can hook other fingers on the convex portion as well, so the operating device is easy to hold.

(11) Grip portions may be provided on the left and right sides of the back surface of the housing. According to the configuration of (11) above, the user can hold the housing with fingers (for example, the ring fingers or little fingers) hooked on the grip portions, so the operating device can be gripped firmly.

(12) The operating device may further include a fifth operation portion and a sixth operation portion. The fifth operation portion is disposed on the front surface of the housing below the first operation portion, and the sixth operation portion is disposed on the front surface of the housing below the second operation portion. According to the configuration of (12) above, more kinds of operation can be performed using the operating device. Even when operating the fifth operation portion and the sixth operation portion, the user can hold the operating device firmly, so an operating device with good operability is provided.

(13) Another example of the present invention is an operating device including a substantially plate-shaped housing, a display portion, a projecting portion, and an operation portion. The display portion is provided on the front side of the housing. The projecting portion is provided on the back side of the housing, above the center of the housing, and projects at least at left and right positions. The operation portion is provided on the upper surface of the projecting portion.
According to the configuration of (13) above, when the user grips the operating device with fingers hooked under the projecting portion, the operation portion provided on the upper surface of the projecting portion can be operated with those fingers, and the housing is supported from above and below, so the device is easy to hold and easy to operate. Thus, the configuration of (13) also provides an operating device that a user can hold easily.

(14)

Another example of the present invention is an operating device to be operated by a user, including a substantially plate-shaped housing, a display portion, and grip portions. The display portion is provided on the front side of the housing. The grip portions are provided on the left and right sides of the back surface of the housing and have a convex cross section.

According to the configuration of (14) above, the user can hold the housing with fingers (for example, the ring fingers or little fingers) hooked on the grip portions, so the operating device can be gripped firmly. As with (1) above, this provides an operating device that a user can hold easily.

(15) The operating device may further include a projecting portion provided on the back side of the housing above the grip portions, projecting at least at left and right positions. According to the configuration of (15) above, since the projecting portion is provided on the back surface of the housing, the user holding the housing on the left and right sides of the display portion can catch the projecting portion on the fingers, so the operating device can be gripped even more firmly.

(16) The operating device may further include a seventh operation portion and an eighth operation portion provided on the left and right sides of the upper surface of the housing. According to the configuration of (16) above, still more kinds of operation are possible with the operating device. Since these operation portions are disposed on the upper surface of the housing, the user can grip the housing firmly from its front side, upper side, and back side.

(17) The operating device may further include a touch panel provided on the screen of the display portion. According to the configuration of (17) above, the user can operate on the image displayed on the display portion intuitively and easily using the touch panel. When the operating device is placed with the display portion facing up, it rests slightly inclined because of the projecting portion, which makes touch-panel operation easier.

(18) The operating device may include an inertial sensor inside the housing.
According to the configuration of (18) above, the user can perform more intuitive operations such as swinging or moving the operating device itself. In a configuration in which the operating device is assumed to be swung in this way, firmly connecting an additional device to the operating device becomes extremely important, so combining this configuration with the configuration of (7) or (8) above, which allows a strong connection between the operating device and the additional device, is particularly effective.

(19) The operating device may include a communication portion and a display control portion. The communication portion wirelessly transmits, to a game device, operation data representing operations performed on the operating device, and receives image data transmitted from the game device. The display control portion displays the received image on the display portion. According to the configuration of (19) above, the user can perform game operations with an operating device that is easy to hold and easy to operate. Since the image transmitted from the game device is displayed on the display portion, the user can perform game operations while viewing the image on the display portion of the operating device.

(20) The operating device may include a game processing portion and a display control portion. The game processing portion performs game processing based on operations performed on the operating device. The display control portion generates a game image based on the result of the game processing and displays it on the display portion. According to the configuration of (20) above, a portable game device that is easy to hold and offers good operability can be realized.

(21) The display portion may have a screen of 5 inches or more. According to the configuration of (21) above, a large screen can be used to display images that are easy to view and impressive.
When the screen of the display portion is large as in the configuration of (21) above, the operating device itself also becomes large, so the configurations of (1) to (18) above, which let the user hold the device easily, are particularly effective.

Another example of the present invention is a tablet-type information processing device including the portions of (1) above (the housing, the display portion, the projecting portion, and so on).

(Effects of the Invention)

According to the present invention, the display portion is provided on the front surface of the housing, and the projecting portion is provided on the back surface of the housing above its center, projecting at least at left and right positions, whereby the user can hold the operating device easily.

The above and other objects, features, aspects, and effects of the present invention will be further understood from the following detailed description with reference to the accompanying drawings.

[Embodiment]

[1. Overall configuration of the game system]

A game system 1 according to an embodiment of the present invention will now be described with reference to the drawings. Fig. 1 is an external view of the game system 1. In Fig. 1, the game system 1 includes a stationary display device typified by a television receiver or the like (hereinafter referred to as a "television") 2, a stationary game device 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. The game system 1 executes game processing in the game device 3 in accordance with game operations using the controller 5, and displays the game image obtained by the game processing on the television 2 and/or the terminal device 7.

In the game device 3, the optical disc 4, which is an example of a replaceable information storage medium, is detachably inserted. The optical disc 4 stores an information processing program (typically a game program) to be executed in the game device 3.
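The overall flow just described, with the game device advancing game processing from controller operations and producing one image for the television and one for the terminal device, can be sketched roughly as follows. This is an illustrative sketch only, not the patented implementation; the function names, state layout, and string "images" are invented.

```python
# Illustrative sketch (not the patented implementation) of the dual-output
# game loop described above: read controller input, advance the game state,
# and render separate images for the television and the terminal device.

def run_frame(state, controller_input):
    """Advance the game state by one frame and render both game images."""
    state = dict(state,
                 frame=state["frame"] + 1,
                 x=state["x"] + controller_input.get("dx", 0))
    # A real system would rasterize here; strings stand in for image data.
    tv_image = f"tv:frame{state['frame']}:x={state['x']}"
    terminal_image = f"terminal:frame{state['frame']}:x={state['x']}"
    return state, tv_image, terminal_image

state = {"frame": 0, "x": 0}
state, tv_img, term_img = run_frame(state, {"dx": 3})
```

The only point of the sketch is that a single game-processing step feeds two display paths, matching the television 2 and the terminal device 7 described above.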
An insertion opening for the optical disc 4 is provided on the front surface of the game device 3. The game device 3 performs game processing by reading and executing the information processing program stored on the optical disc 4 inserted in the insertion opening.

The television 2 is connected to the game device 3 via a connection cable. The television 2 displays the game image obtained by the game processing executed in the game device 3. The television 2 has a speaker 2a (Fig. 2), and the speaker 2a outputs the game sound obtained as a result of the game processing. In other embodiments, the game device 3 and the stationary display device may be formed integrally. The communication between the game device 3 and the television 2 may also be wireless communication.

A marker device 6 is provided around the screen of the television 2 (on the upper side of the screen in Fig. 1). As will be described in detail later, the user (player) can perform game operations by moving the controller 5, and the marker device 6 is used by the game device 3 to calculate the movement, position, posture, and the like of the controller 5. The marker device 6 has two markers 6R and 6L at its two ends. The marker 6R (and likewise the marker 6L) is, specifically, one or more infrared LEDs (Light Emitting Diodes), and outputs infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, and the game device 3 can control the lighting of each infrared LED provided in the marker device 6. The marker device 6 is portable, and the user can install it in a free position. Fig. 1 shows the marker device 6 installed on top of the television 2, but its position and orientation can be arbitrary.

The controller 5 transmits, to the game device 3, operation data representing the contents of the operations performed on it.
Communication between the controller 5 and the game device 3 can be performed by wireless communication. In the present embodiment, the wireless communication between the controller 5 and the game device 3 uses, for example, Bluetooth (registered trademark) technology. In other embodiments, the controller 5 and the game device 3 may be connected by wire. Further, in the present embodiment, the game system 1 includes one controller 5, but the game device 3 can communicate with a plurality of controllers, and by using a predetermined number of controllers at the same time, several users can play the game. The detailed configuration of the controller 5 will be described later.

The terminal device 7 is of a size that the user can grip, so that the user can move the terminal device 7 in the hands or place it in a free position for use. As described in detail later, the terminal device 7 includes an LCD (Liquid Crystal Display) 51 as display means, and input means (a touch panel 52 and a rotation sensor 74, described later). Communication between the terminal device 7 and the game device 3 can be performed wirelessly (or by wire). The terminal device 7 receives data of an image (for example, a game image) generated in the game device 3 from the game device 3 and displays the image on the LCD 51. In the present embodiment, an LCD is used as the display device, but the terminal device 7 may have any other display device, such as a display device using EL (Electro Luminescence). Further, the terminal device 7 transmits, to the game device 3, operation data representing the contents of the operations performed on it.
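The exchange just described, in which the terminal device receives image data from the game device, displays it on the LCD 51, and sends back operation data, can be sketched as below. This is a hypothetical illustration: zlib stands in for whatever codec the system actually uses, and the JSON packet format is invented.

```python
# Hypothetical sketch of the terminal-device side of the exchange: decompress
# and display a received game image, and package operation data for sending.
import json
import zlib

def receive_image(compressed: bytes) -> str:
    """Image data arrives compressed; decompress it before display."""
    return zlib.decompress(compressed).decode()

def make_operation_packet(touch_xy, accel) -> bytes:
    """Operation data is small, so it is sent as-is (no compression)."""
    return json.dumps({"touch": touch_xy, "accel": accel}).encode()

frame = zlib.compress(b"game image frame 1")   # as sent by the game device
shown = receive_image(frame)                   # what the LCD would display
packet = make_operation_packet([120, 80], [0.0, -1.0, 0.1])
```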
[2. Internal configuration of the game device 3]

Next, the internal configuration of the game device 3 will be described with reference to Fig. 2. Fig. 2 is a block diagram showing the internal configuration of the game device 3. The game device 3 includes a CPU (Central Processing Unit) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, an optical disc drive 14, and an AV-IC 15.

The CPU 10 performs game processing by executing the game program stored on the optical disc 4, and functions as a game processor. The CPU 10 is connected to the system LSI 11. In addition to the CPU 10, the external main memory 12, the ROM/RTC 13, the optical disc drive 14, and the AV-IC 15 are connected to the system LSI 11. The system LSI 11 performs processing such as controlling data transfer between the components connected to it, generating images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 will be described later.

The external main memory 12, which is volatile, stores programs such as the game program read from the optical disc 4 or a game program read from the flash memory 17, and stores various data; it is used as a work area or buffer area of the CPU 10. The ROM/RTC 13 has a ROM (a so-called boot ROM) in which a program for booting the game device 3 is incorporated, and a clock circuit (RTC: Real Time Clock) for counting time. The optical disc drive 14 reads program data, texture data, and the like from the optical disc 4, and writes the read data into the internal main memory 11e, described later, or the external main memory 12.

The system LSI 11 is provided with an input/output processor (I/O processor) 11a, a GPU (Graphics Processor Unit) 11b, a DSP (Digital Signal Processor) 11c, a VRAM (Video RAM) 11d, and an internal main memory 11e. Although not illustrated, these components 11a to 11e are connected to one another by an internal bus.

The GPU 11b forms part of the drawing means and generates images in accordance with graphics commands (drawing instructions) from the CPU 10. The VRAM 11d stores data (polygon data, texture data, and the like) necessary for the GPU 11b to execute the graphics commands. When an image is generated, the GPU 11b creates the image data using the data stored in the VRAM 11d. In the present embodiment, the game device 3 generates both the game image displayed on the television 2 and the game image displayed on the terminal device 7. Hereinafter, the game image displayed on the television 2 is referred to as the "television game image", and the game image displayed on the terminal device 7 is referred to as the "terminal game image".

The DSP 11c functions as an audio processor and generates sound data using sound data or sound waveform (tone) data stored in the internal main memory 11e or the external main memory 12. In the present embodiment, as with the game images, both the game sound output from the speaker of the television 2 and the game sound output from the speaker of the terminal device 7 are generated. Hereinafter, the game sound output from the television 2 is referred to as the "television game sound", and the game sound output from the terminal device 7 is referred to as the "terminal game sound".

As described above, among the images and sounds generated in the game device 3, the data of the image and sound output to the television 2 is read by the AV-IC 15.
The AV-IC 15 outputs the read image data to the television 2 via an AV connector 16, and outputs the read sound data to the speaker 2a built into the television 2. Thereby, the image is displayed on the television 2, and the sound is output from the speaker 2a. Further, among the images and sounds generated in the game device 3, the data of the image and sound output to the terminal device 7 is transmitted to the terminal device 7 by the input/output processor 11a and other components. The data transmission to the terminal device 7 by the input/output processor 11a and others will be described in detail later.

The input/output processor 11a performs transmission and reception of data with the components connected to it, and downloads data from external devices. The input/output processor 11a is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an expansion connector 20, a memory card connector 21, and a codec LSI 27. An antenna 22 is connected to the network communication module 18. An antenna 23 is connected to the controller communication module 19. The codec LSI 27 is connected to a terminal communication module 28, and an antenna 29 is connected to the terminal communication module 28.

The game device 3 can connect to a network such as the Internet to communicate with external information processing devices (for example, other game devices or various servers). That is, the input/output processor 11a can connect to a network such as the Internet via the network communication module 18 and the antenna 22, and communicate with external information processing devices connected to the network. The input/output processor 11a periodically accesses the flash memory 17 and detects whether there is data that needs to be transmitted to the network.
When such data is present, it is transmitted to the network via the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from external information processing devices or data downloaded from a download server via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. By executing the game program, the CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. The flash memory 17 may store, in addition to data exchanged between the game device 3 and external information processing devices, saved data of games played on the game device 3 (result data or progress data of games). Game programs may also be stored in the flash memory 17.

Further, the game device 3 can receive operation data from the controller 5. That is, the input/output processor 11a receives the operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19, and stores (temporarily stores) it in a buffer area of the internal main memory 11e or the external main memory 12.

The game device 3 can also transmit and receive data such as images and sounds to and from the terminal device 7. When transmitting a game image (the terminal game image) to the terminal device 7, the input/output processor 11a outputs the game image data generated by the GPU 11b to the codec LSI 27.
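The polling behaviour described above, in which the input/output processor periodically checks the flash memory for pending outbound data and forwards it to the network module, can be sketched as follows. The queue and the send step here are invented stand-ins, used only to illustrate the periodic drain.

```python
# Illustrative sketch of periodic outbound polling: each poll drains whatever
# data is waiting and hands it to the (stand-in) network transmit path.
from collections import deque

class OutboundStore:
    def __init__(self):
        self.pending = deque()  # stands in for data staged in flash memory
        self.sent = []          # stands in for the network communication module

    def poll_and_send(self):
        """One periodic access: transmit everything that is waiting."""
        while self.pending:
            self.sent.append(self.pending.popleft())
        return len(self.sent)

store = OutboundStore()
store.pending.extend([b"save-data", b"score"])
count = store.poll_and_send()
```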
The codec LSI 27 performs predetermined compression processing on the image data from the input/output processor 11a. The terminal communication module 28 performs wireless communication with the terminal device 7. Therefore, the image data compressed by the codec LSI 27 is transmitted by the terminal communication module 28 to the terminal device 7 via the antenna 29.

In the present embodiment, the image data transmitted from the game device 3 to the terminal device 7 is used in a game, and if a delay occurs in the image displayed in the game, the operability of the game is adversely affected. Therefore, the transmission of image data from the game device 3 to the terminal device 7 is preferably performed so that delay is avoided as much as possible. Accordingly, in the present embodiment, the codec LSI 27 compresses the image data using a highly efficient compression technique such as the H.264 standard. Other compression techniques may be used, and when the communication speed is sufficient, the image data may be transmitted without compression. The terminal communication module 28 is, for example, a communication module that has received Wi-Fi certification, and may perform high-speed wireless communication with the terminal device 7 using, for example, the MIMO (Multiple Input Multiple Output) technology adopted in the IEEE 802.11n standard, or may use another communication method.

In addition to the image data, the game device 3 transmits sound data to the terminal device 7. That is, the input/output processor 11a outputs the sound data generated by the DSP 11c to the terminal communication module 28 via the codec LSI 27. The codec LSI 27 performs compression processing on the sound data in the same way as on the image data. The compression method for the sound data may be any method, but a method with a high compression ratio and little sound degradation is preferable. In other embodiments, the sound data may be transmitted without compression. The terminal communication module 28 transmits the compressed image data and sound data to the terminal device 7 via the antenna 29.
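The trade-off just described, compressing image data to avoid display delay but optionally skipping compression when the link is fast enough, can be sketched as below. zlib stands in for the H.264-class codec named in the text, and the bandwidth threshold is an invented number for illustration.

```python
# Hypothetical sketch: choose whether to compress a video frame before
# wireless transfer, based on the available link speed.
import zlib

def prepare_frame(raw: bytes, link_mbps: float,
                  raw_needed_mbps: float = 100.0):
    """Return (payload, was_compressed) for one frame."""
    if link_mbps >= raw_needed_mbps:
        return raw, False             # fast link: send uncompressed
    return zlib.compress(raw), True   # otherwise compress to limit delay

payload, compressed = prepare_frame(b"x" * 1000, link_mbps=54.0)
```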

The game device 3 transmits various control data to the terminal device 7 as necessary, in addition to the image data and sound data described above. The control data is data representing control instructions for the components of the terminal device 7, for example an instruction to control the lighting of a marker portion (the marker portion 55 in Fig. 10) or an instruction to control image capture by a camera (described later). The input/output processor 11a transmits the control data to the terminal device 7 in accordance with instructions from the CPU 10. In the present embodiment, the codec LSI 27 does not perform compression processing on this control data, but in other embodiments compression processing may be performed. The data transmitted from the game device 3 to the terminal device 7 may be encrypted as necessary, or may not be encrypted.

Further, the game device 3 can receive various data from the terminal device 7. As described in detail later, in the present embodiment the terminal device 7 transmits operation data, image data, and sound data. The data transmitted from the terminal device 7 is received by the terminal communication module 28 via the antenna 29. Here, the image data and sound data from the terminal device 7 are subjected to compression processing similar to that applied to the image data and sound data sent from the game device 3 to the terminal device 7. Therefore, the received image data and sound data are sent from the terminal communication module 28 to the codec LSI 27, subjected to decompression processing by the codec LSI 27, and output to the input/output processor 11a. On the other hand, the operation data from the terminal device 7 may be exempted from compression processing because its data amount is small compared to images and sounds, and it may be encrypted as necessary or not encrypted.
The operation data is received by the terminal communication module 28 and output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily stores) the data received from the terminal device 7 in a buffer area of the internal main memory or the external main memory 12. Further, the game device 3 can be connected to other devices or external storage media. That is, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI, and a network connection can be made in place of the network communication module 18 by connecting a medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector to the expansion connector 20. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card. For example, the input/output processor 11a can access an external storage medium via the expansion connector 20 or the memory card connector 21 to store data in the external storage medium or read data from the external storage medium. The game device 3 is provided with a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned on, power is supplied from an external power source to each component of the game device 3 via an AC adapter (not shown). When the reset button 25 is pressed, the system LSI 11 restarts the boot program of the game device 3. The eject button 26 is connected to the disc drive 14. When the eject button 26 is pressed, the optical disc 4 is ejected from the disc drive 14.
In other embodiments, some of the components of the game device 3 may be configured as an extension device separate from the game device 3. In that case, the extension device may be connected to the game device 3 via the expansion connector 20 described above, for example. Specifically, the extension device may include, for example, the codec LSI 27, the terminal communication module 28, and the antenna 29, and may be attachable to and detachable from the expansion connector 20. According to this, by connecting the extension device to a game device that does not include these components, the game device can be configured to be capable of communicating with the terminal device 7.

[3. Configuration of Controller 5]

Next, the controller 5 will be described with reference to Figs. 3 to 7. Figs. 3 and 4 are perspective views showing the appearance of the controller 5. Fig. 3 is a perspective view of the controller 5 from the upper rear side, and Fig. 4 is a perspective view of the controller 5 from the lower front side. In Figs. 3 and 4, the controller 5 has an outer cover 31 formed by, for example, plastic molding. The outer cover 31 has a substantially rectangular parallelepiped shape whose longitudinal direction is the front-rear direction (the Z-axis direction shown in Fig. 3), and its overall size is such that it can be held by one hand of an adult or a child. The user can perform game operations by pressing the keys provided on the controller 5 and by moving the controller 5 itself to change its position or attitude (tilt). The outer cover 31 is provided with a plurality of operation keys. As shown in Fig. 3, on the upper surface of the outer cover 31 are provided a cross key 32a, a 1st key 32b, a 2nd key 32c, an A key 32d, a minus key 32e, a home key 32f, a plus key 32g, and a power key 32h. In the present specification, the upper surface of the outer cover 31 on which these keys 32a to 32h are provided is referred to as the "key surface".
On the other hand, as shown in Fig. 4, a recessed portion is formed on the bottom surface of the outer cover 31, and a B key 32i is provided on the rear-side slope of the recessed portion. Each of the operation keys 32a to 32i is appropriately assigned a function in accordance with the information processing program executed by the game device 3. Further, the power key 32h is for remotely turning on/off the power of the main body of the game device 3. The home key 32f and the power key 32h have their upper surfaces recessed below the upper surface of the outer cover 31. This prevents the user from accidentally pressing the home key 32f or the power key 32h. A connector 33 is provided on the rear surface of the outer cover 31. The connector 33 is used to connect other devices (for example, other sensor units or controllers) to the controller 5. Further, on both sides of the connector 33 on the rear surface of the outer cover 31, locking holes 33a are provided for preventing the above-mentioned other devices from being easily detached. A plurality of (four in Fig. 3) LEDs 34a to 34d are provided at the rear of the upper surface of the outer cover 31. Here, the controller 5 is given a controller type (number) in order to distinguish it from other controllers 5. The LEDs 34a to 34d are used for purposes such as notifying the user of the controller type currently set for the controller 5, or notifying the user of the remaining battery level of the controller 5. Specifically, when a game operation is performed using the controller 5, one of the plurality of LEDs 34a to 34d is lit in accordance with the controller type. Further, the controller 5 includes an imaging information computing section 35 (Fig. 6), and as shown in Fig. 4, a light incident surface 35a of the imaging information computing section 35 is provided on the front surface of the outer cover 31.
The light incident surface 35a is made of a material that transmits at least the infrared light from the markers 6R and 6L. Between the 1st key 32b and the home key 32f on the upper surface of the outer cover 31, sound holes 31a are formed for emitting the sound from a speaker 47 (Fig. 5) built into the controller 5 to the outside. Next, the internal structure of the controller 5 will be described with reference to Figs. 5 and 6. Figs. 5 and 6 are views showing the internal structure of the controller 5. Fig. 5 is a perspective view showing a state in which the upper casing (a part of the outer cover 31) of the controller 5 has been removed. Fig. 6 is a perspective view showing a state in which the lower casing (a part of the outer cover 31) of the controller 5 has been removed. The perspective view shown in Fig. 6 shows the substrate 30 of Fig. 5 as viewed from its reverse side. In Fig. 5, the substrate 30 is fixedly installed inside the outer cover 31, and on the upper main surface of the substrate 30 are provided the operation keys 32a to 32h, the LEDs 34a to 34d, the acceleration sensor 37, the antenna 45, the speaker 47, and the like. These are connected to a microcomputer 42 (see Fig. 6) by wiring (not shown) formed on the substrate 30 and the like. In the present embodiment, the acceleration sensor 37 is disposed at a position offset from the center of the controller 5 in the X-axis direction. This makes it easier to calculate the motion of the controller 5 when the controller 5 is rotated about the Z axis. Further, the acceleration sensor 37 is disposed forward of the center of the controller 5 in the longitudinal direction (Z-axis direction). Further, the controller 5 functions as a wireless controller by means of a wireless module 44 (Fig. 6) and the antenna 45. In Fig. 6, the imaging information computing section 35 is provided at the front edge of the bottom main surface of the substrate 30. The imaging information computing section 35 includes an infrared filter 38, a lens 39, an image pickup element 40, and an image processing circuit 41, arranged in this order from the front of the controller 5. These members 38 to 41 are attached to the bottom main surface of the substrate 30. Further, the microcomputer 42 and a vibrator 46 are provided on the bottom main surface of the substrate 30. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 by wiring formed on the substrate 30 and the like. The vibrator 46 is actuated by an instruction from the microcomputer 42, thereby causing the controller 5 to vibrate. In this way, the vibration is conveyed to the hand of the user holding the controller 5, and a so-called vibration-feedback game can be realized. In the present embodiment, the vibrator 46 is disposed slightly toward the front of the outer cover 31. That is, by disposing the vibrator 46 closer to the end than the center of the controller 5, the vibration of the vibrator 46 can vibrate the entire controller 5 more strongly. Further, the connector 33 is attached to the rear edge of the bottom main surface of the substrate 30. In addition to those shown in Figs. 5 and 6, the controller 5 includes a quartz oscillator that generates the basic clock of the microcomputer 42, an amplifier that outputs an audio signal to the speaker 47, and the like. The shape of the controller 5, the shapes of the operation keys, the numbers and installation positions of the acceleration sensors and vibrators, and the like shown in Figs. 3 to 6 are merely examples; other shapes, numbers, and installation positions may be used. Further, in the present embodiment, the image capturing direction of the image capturing means is the Z-axis positive direction, but the image capturing direction may be any direction.
In other words, the position of the imaging information computing section 35 (the light incident surface 35a of the imaging information computing section 35) in the controller 5 need not be on the front surface of the outer cover 31, and it may be provided on another surface as long as light can be taken in from outside the outer cover 31.

Fig. 7 is a block diagram showing the configuration of the controller 5. The controller 5 includes an operation section 32 (the operation keys 32a to 32i), the imaging information computing section 35, a communication section 36, the acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits data representing the content of the operations performed on it to the game device 3 as operation data. Hereinafter, the operation data transmitted by the controller 5 is referred to as "controller operation data", and the operation data transmitted by the terminal device 7 is referred to as "terminal operation data". The operation section 32 includes the operation keys 32a to 32i described above, and outputs operation key data representing the input state of each of the operation keys 32a to 32i (whether or not each operation key 32a to 32i is pressed) to the microcomputer 42 of the communication section 36. The imaging information computing section 35 is a system for analyzing the image data captured by the image capturing means, determining an area having a high brightness therein, and calculating the position of the center of gravity, the size, and the like of that area. Since the imaging information computing section 35 has a sampling period of, for example, a maximum of about 200 frames/sec., it can trace and analyze even relatively fast motion of the controller 5. The imaging information computing section 35 includes the infrared filter 38, the lens 39, the image pickup element 40, and the image processing circuit 41. The infrared filter 38 passes only infrared light out of the light incident from the front of the controller 5. The lens 39 collects the infrared light that has passed through the infrared filter 38 and makes it incident on the image pickup element 40. The image pickup element 40 is a solid-state image pickup device such as a CMOS sensor or a CCD sensor, which receives the infrared light collected by the lens 39 and outputs an image signal. Here, the marker section 55 of the terminal device 7 and the marker device 6, which are the image capturing targets, are each composed of markers that output infrared light. Therefore, by providing the infrared filter 38, the image pickup element 40 receives only the infrared light that has passed through the infrared filter 38 and generates image data, so that the images of the image capturing targets (the marker section 55 and/or the marker device 6) can be captured more accurately. Hereinafter, an image captured by the image pickup element 40 is referred to as a captured image. The image data generated by the image pickup element 40 is processed by the image processing circuit 41. The image processing circuit 41 calculates the positions of the image capturing targets within the captured image, and outputs coordinates representing the calculated positions to the microcomputer 42 of the communication section 36. The data of these coordinates is transmitted to the game device 3 by the microcomputer 42 as operation data. Hereinafter, the above coordinates are referred to as "marker coordinates". Since the marker coordinates change in accordance with the orientation (tilt angle) and position of the controller 5 itself, the game device 3 can calculate the orientation and position of the controller 5 using the marker coordinates. In other embodiments, the controller 5 may be configured without the image processing circuit 41, and the captured image itself may be transmitted from the controller 5 to the game device 3.
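The bright-area analysis attributed to the image processing circuit 41 can be sketched in software. The following is a minimal, hypothetical stand-in (the threshold value, 4-connectivity, and function name are assumptions; the actual circuit is a hardware implementation): it thresholds a grayscale image, groups bright pixels into connected regions, and reports each region's centroid and size.

```python
def marker_centroids(image, threshold=200):
    """Find bright regions (e.g. infrared markers) in a grayscale image
    given as a list of rows of pixel intensities, and return the centroid
    (x, y) and pixel count of each 4-connected region."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood fill one bright region
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and image[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cx_mean = sum(p[1] for p in pixels) / len(pixels)
                cy_mean = sum(p[0] for p in pixels) / len(pixels)
                regions.append(((cx_mean, cy_mean), len(pixels)))
    return regions
```

Because the imaging targets emit infrared light and the filter removes everything else, in practice the thresholded image contains only a few such regions per frame.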
In that case, the game device 3 may include a circuit or program having the same function as the image processing circuit 41, and may calculate the marker coordinates itself. The acceleration sensor 37 detects the acceleration of the controller 5 (including gravitational acceleration), that is, it detects the force (including gravity) applied to the controller 5. The acceleration sensor 37 detects, among the accelerations applied to its detection section, the value of the acceleration in the linear direction along the sensing axis (linear acceleration). For example, in the case of a multi-axis acceleration sensor with two or more axes, the acceleration of the component along each axis is detected as the acceleration applied to the detection section of the acceleration sensor. The acceleration sensor 37 is, for example, a capacitance-type MEMS (Micro Electro Mechanical System) acceleration sensor, but acceleration sensors of other types may also be used. In the present embodiment, the acceleration sensor 37 detects linear acceleration in three axial directions with respect to the controller 5: the up-down direction (the Y-axis direction shown in Fig. 3), the left-right direction (the X-axis direction shown in Fig. 3), and the front-rear direction (the Z-axis direction shown in Fig. 3). Since the acceleration sensor 37 detects the acceleration in the linear direction along each axis, its output represents the value of the linear acceleration of each of the three axes. That is, the detected acceleration is represented as a three-dimensional vector in the XYZ coordinate system (controller coordinate system) set with respect to the controller 5. The data representing the acceleration detected by the acceleration sensor 37 (acceleration data) is output to the communication section 36.
Since the acceleration detected by the acceleration sensor 37 changes in accordance with the orientation (tilt angle) and motion of the controller 5 itself, the game device 3 can calculate the orientation and motion of the controller 5 using the acquired acceleration data. In the present embodiment, the game device 3 calculates the attitude, tilt angle, and the like of the controller 5 based on the acquired acceleration data. As a person skilled in the art will readily understand from the description of this specification, further information about the controller 5 can be estimated or calculated (determined) by a computer such as the processor of the game device 3 (for example, the CPU 10) or the processor of the controller 5 (for example, the microcomputer 42) performing processing based on the acceleration signal output from the acceleration sensor 37 (the same applies to the acceleration sensor 63 described later). For example, when computer-side processing is performed on the premise that the controller 5 including the acceleration sensor 37 is in a static state (that is, on the premise that the acceleration detected by the acceleration sensor consists only of gravitational acceleration), then as long as the controller 5 is actually in a substantially static state, it is possible to know from the detected acceleration whether, and to what extent, the attitude of the controller 5 is tilted with respect to the direction of gravity. Specifically, with the state in which the detection axis of the acceleration sensor 37 points vertically downward as a reference, whether or not the controller 5 is tilted with respect to the reference can be determined by whether or not 1G (gravitational acceleration) is applied, and the extent of that tilt can be determined from the magnitude of the detected acceleration.
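The static-tilt reasoning above can be made concrete with a small sketch (assumptions: the controller is at rest so the measured vector is gravity only, the Y axis is the detection axis of interest, and the axis naming and function name are illustrative):

```python
import math

def tilt_from_acceleration(ax, ay, az):
    """Estimate how far the Y detection axis is tilted away from vertically
    downward, assuming the device is static so the measured acceleration is
    gravity alone.  Returns the tilt angle in degrees: 0 when the Y axis
    points straight down (reads 1G), 90 when it lies horizontal."""
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude, ~1G at rest
    # Angle between the Y axis and the measured gravity vector.
    return math.degrees(math.acos(max(-1.0, min(1.0, ay / g))))
```

This is exactly the check described in the text: a full 1G reading on the reference axis means no tilt, and the way the 1G splits across axes encodes the tilt angle.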
Further, in the case of the multi-axis acceleration sensor 37, the extent to which the controller 5 is tilted with respect to the direction of gravity can be known in more detail by processing the acceleration signals of the respective axes. In this case, the processor may calculate the tilt angle of the controller 5 based on the output from the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle. Thus, by using the acceleration sensor 37 in combination with a processor, the tilt angle or attitude of the controller 5 can be determined. On the other hand, when it is assumed that the controller 5 is in a dynamic state (a state in which the controller 5 is being moved), the acceleration sensor 37 detects the acceleration corresponding to the motion of the controller 5 in addition to the gravitational acceleration, so the direction of motion of the controller 5 can be known by removing the gravitational acceleration component from the detected acceleration through predetermined processing. Likewise, even when it is assumed that the controller 5 is in a dynamic state, the tilt of the controller 5 with respect to the direction of gravity can be known by removing the component of the acceleration corresponding to the motion of the acceleration sensor from the detected acceleration through predetermined processing. In other embodiments, the acceleration sensor 37 may include an embedded processing device or another type of dedicated processing device for performing predetermined processing on the acceleration signal detected by the built-in acceleration detecting means before outputting it to the microcomputer 42. The embedded or dedicated processing device may, for example, convert the acceleration signal into a tilt angle (or another preferable parameter) when the acceleration sensor 37 is used to detect static acceleration (for example, gravitational acceleration).
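One common form of the "predetermined processing" mentioned here is a low-pass filter that tracks the slowly varying gravity component and treats the residual as motion. A minimal single-axis sketch, with an assumed filter constant and function name (the patent does not specify the method):

```python
def split_gravity_and_motion(samples, alpha=0.9):
    """Separate a stream of single-axis acceleration samples into a slowly
    varying gravity estimate (low-pass filtered) and a motion component
    (the residual).  alpha close to 1 means the gravity estimate changes
    slowly; the value here is illustrative."""
    gravity, out = samples[0], []
    for a in samples:
        gravity = alpha * gravity + (1 - alpha) * a
        out.append((gravity, a - gravity))
    return out
```

With the gravity estimate removed, what remains approximates the acceleration due to the user's motion; conversely, the gravity estimate alone gives the tilt even while the device moves.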
The gyro sensor 48 detects the angular velocities about three axes (in the present embodiment, the XYZ axes). In the present specification, with the image capturing direction of the controller 5 (the Z-axis positive direction) as a reference, the direction of rotation about the X axis is referred to as the pitch direction, the direction of rotation about the Y axis as the yaw direction, and the direction of rotation about the Z axis as the roll direction. The gyro sensor 48 only needs to be able to detect the angular velocities about the three axes; the number and combination of gyro sensors used may be arbitrary. For example, the gyro sensor 48 may be a 3-axis gyro sensor, or the angular velocities about the three axes may be detected by combining a 2-axis gyro sensor and a 1-axis gyro sensor. Data representing the angular velocities detected by the gyro sensor 48 is output to the communication section 36. Further, the gyro sensor 48 may detect the angular velocity about one axis or two axes. The communication section 36 includes the microcomputer 42, a memory 43, the wireless module 44, and the antenna 45. The microcomputer 42 controls the wireless module 44, which wirelessly transmits the data acquired by the microcomputer 42 to the game device 3, while using the memory 43 as a storage area during processing. The data output from the operation section 32, the imaging information computing section 35, the acceleration sensor 37, and the gyro sensor 48 to the microcomputer 42 are temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when the transmission timing to the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44.
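Angular velocities such as those reported by the gyro sensor 48 are typically integrated over time to track orientation. The following is a deliberately simplified Euler-integration sketch (the function name, units, and sample format are assumptions; a real implementation would also compensate for drift and account for rotation order):

```python
def integrate_gyro(rates, dt):
    """Integrate per-axis angular velocities (pitch, yaw, roll in deg/s,
    one sample every dt seconds) into accumulated angles in degrees.
    Minimal Euler integration; no drift correction."""
    pitch = yaw = roll = 0.0
    for p, y, r in rates:
        pitch += p * dt
        yaw += y * dt
        roll += r * dt
    return pitch, yaw, roll
```

For example, a steady 90 deg/s pitch rate sampled for one second accumulates to a 90-degree pitch. This also illustrates why gyro data is usually fused with accelerometer data: any bias in the rate samples accumulates without bound.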
The wireless module 44, for example, uses Bluetooth (registered trademark) technology to modulate a carrier wave of a predetermined frequency with the operation data, and radiates the resulting weak radio signal from the antenna 45. That is, the operation data is modulated by the wireless module 44 into a weak radio signal and transmitted from the controller 5. The weak radio signal is received by the controller communication module 19 on the game device 3 side. The game device 3 can acquire the operation data by demodulating or decoding the received weak radio signal. The CPU 10 of the game device 3 then performs the game processing using the operation data acquired from the controller 5. The wireless transmission from the communication section 36 to the controller communication module 19 is performed sequentially at predetermined intervals. Since game processing is generally performed in units of 1/60 second (one frame time), it is preferable to perform the transmission at intervals equal to or shorter than this; the communication section 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 at a rate of, for example, once every 1/200 second. As described above, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation key data as the operation data representing the operations performed on it. The game device 3 executes the game processing using the above operation data as game inputs. Therefore, by using the controller 5 described above, the user can perform not only the conventional, typical game operation of pressing operation keys, but also game operations of moving the controller 5 itself. For example, it is possible to perform an operation of tilting the controller 5 to an arbitrary attitude, an operation of indicating an arbitrary position on the screen with the controller 5, an operation of moving the controller 5 itself, and the like.
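As a quick arithmetic check of the two rates mentioned above (a hypothetical helper, not part of the patent): with operation data transmitted once every 1/200 second and game processing running once every 1/60 second, a little over three operation-data transmissions arrive per frame, so the game always has fresh input available when a frame is processed.

```python
def reports_per_frame(report_period_sec, frame_period_sec):
    """Average number of operation-data transmissions per processed frame."""
    return frame_period_sec / report_period_sec

print(round(reports_per_frame(1 / 200, 1 / 60), 2))  # 3.33
```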
In the present embodiment, the controller 5 does not include display means for displaying a game image, but it may include display means for displaying, for example, an image representing the remaining battery level.

[4. Configuration of Terminal Device 7]

Next, the configuration of the terminal device 7 will be described with reference to Figs. 8 to 13. Fig. 8 is a plan view showing the appearance of the terminal device 7. Fig. 8(a) is a front view of the terminal device 7, (b) is a top view, (c) is a right side view, and (d) is a bottom view. Fig. 9 is a rear view of the terminal device 7. Figs. 10 and 11 are views showing a state in which the user holds the terminal device 7 horizontally. Figs. 12 and 13 are views showing a state in which the user holds the terminal device 7 vertically. As shown in Fig. 8, the terminal device 7 includes an outer cover 50 that is generally shaped as a horizontally long rectangular plate. That is, the terminal device 7 can also be said to be a tablet-type information processing device. The outer cover 50 may have curved surfaces or partial projections and the like, as long as its overall shape is plate-like. Since the outer cover 50 is of a size that can be held by the user, the user can hold and move the terminal device 7 and can change the placement position of the terminal device 7. The length of the terminal device 7 in the longitudinal (z-axis) direction is preferably 100 to 150 [mm], and is 133.5 [mm] in the present embodiment. The length of the terminal device 7 in the lateral (x-axis) direction is preferably 200 to 250 [mm], and is 228.26 [mm] in the present embodiment. The thickness of the terminal device 7 (the length in the y-axis direction) is preferably about 15 to 30 [mm] at the plate-like portion and about 30 to 50 [mm] including the thickest part, and in the present embodiment it is 23.6 [mm] (40.26 [mm] at the thickest part).
Further, the weight of the terminal device 7 is approximately 400 to 600 [g], and is 530 [g] in the present embodiment. The details will be described later, but even though the terminal device 7 is a relatively large terminal device (operating device) as described above, it is structured so as to be easy for the user to hold and operate. The terminal device 7 has the LCD 51 on the surface (front side) of the outer cover 50. The screen size of the LCD 51 is preferably 5 inches or more, and is 6.2 inches here. The terminal device 7 of the present embodiment is easy to hold and operate by virtue of its structure, so it remains easy to operate even with a large LCD. In other embodiments, a smaller LCD 51 may be provided and the terminal device 7 may be made relatively small. The LCD 51 is provided near the center of the surface of the outer cover 50. Therefore, by holding the outer cover 50 at the portions on both sides of the LCD 51 as shown in Figs. 10 and 11, the user can hold and move the terminal device 7 while viewing the screen of the LCD 51. Figs. 10 and 11 show an example in which the user holds the terminal device 7 horizontally (in a landscape orientation) by holding the outer cover 50 at the portions on the left and right sides of the LCD 51, but the user may also hold the terminal device 7 vertically (in a portrait orientation), as shown in Figs. 12 and 13.

As shown in Fig. 8(a), the terminal device 7 has a touch panel 52 on the screen of the LCD 51 as operation means. In the present embodiment, the touch panel 52 is a resistive-film type touch panel. However, the touch panel is not limited to the resistive-film type, and a touch panel of any type, such as an electrostatic capacitance type, may be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In the present embodiment, a touch panel having the same resolution (detection accuracy) as the resolution of the LCD 51 is used as the touch panel 52. However, the resolution of the touch panel 52 does not necessarily have to coincide with the resolution of the LCD 51. Input to the touch panel 52 is usually performed with a touch pen 60; however, input is not limited to the touch pen 60, and the user can also input on the touch panel 52 with a finger. The outer cover 50 may be provided with an accommodation hole 60a for accommodating the touch pen 60 used for operating the touch panel 52 (see Fig. 8(b)). Here, the accommodation hole 60a is provided on the upper surface of the outer cover 50 so that the touch pen 60 does not fall out, but it may be provided on a side surface or the bottom surface. Thus, since the terminal device 7 has the touch panel 52, the user can operate the touch panel 52 while moving the terminal device 7. That is, the user can move the screen of the LCD 51 while directly inputting on the screen (via the touch panel 52). As shown in Fig. 8, the terminal device 7 includes, as operation means (an operation section), two analog sticks 53A and 53B and a plurality of operation keys (buttons) 54A to 54L. Each of the analog sticks 53A and 53B is a device for indicating a direction. Each of the analog sticks 53A and 53B is configured such that the stick member operated by the user's finger can slide in any direction (at any angle in the up, down, left, right, and diagonal directions) with respect to the surface of the outer cover 50.
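When the touch panel's detection resolution does not coincide with the LCD's resolution, as the text allows, raw touch coordinates must be scaled to screen pixels. A minimal proportional-scaling sketch (the function name and the resolution values are illustrative assumptions, not from the patent):

```python
def touch_to_lcd(tx, ty, touch_res, lcd_res):
    """Map a raw touch-panel coordinate to an LCD pixel coordinate by
    proportional scaling.  Resolutions are (width, height) tuples."""
    sx = lcd_res[0] / touch_res[0]
    sy = lcd_res[1] / touch_res[1]
    return int(tx * sx), int(ty * sy)

# A touch at the center of a 1024x1024 panel on an 854x480 screen.
print(touch_to_lcd(512, 512, (1024, 1024), (854, 480)))  # (427, 240)
```

When the two resolutions are identical, the scale factors are 1 and the mapping is the identity, which is the case the present embodiment describes.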
That is, each is a direction input device of the sliding type. The stick member of each of the analog sticks 53A and 53B may instead be of a type that tilts in any direction with respect to the surface of the outer cover 50. In the present embodiment, a slidable type of stick member is used, so the user can operate the analog sticks 53A and 53B without moving the thumb much, and can operate them while holding the outer cover 50 more firmly. When a tilting type is used for the analog sticks 53A and 53B, the degree of input (degree of tilt) is easier for the user to understand, and detailed operations are easier to perform. Further, the left analog stick 53A and the right analog stick 53B are provided on the left and right sides of the screen of the LCD 51, respectively. Therefore, the user can input a direction with an analog stick using either the left or right hand. Also, as shown in Figs. 10 and 11, the analog sticks 53A and 53B are provided at positions where the user can operate them while holding the left and right portions of the terminal device 7 (the portions on the left and right sides of the LCD 51), so the analog sticks 53A and 53B are easy to operate even when the user holds and moves the terminal device 7. The operation keys 54A to 54L are operation means (an operation section) for performing predetermined inputs. As described below, the operation keys 54A to 54L are provided at positions where the user can operate them while holding the left and right portions of the terminal device 7 (see Figs. 10 and 11). Therefore, these operation means are easy to operate even when the user holds and moves the terminal device 7. As shown in Fig. 8(a), the cross key 54A is provided on the left side of the LCD 51, below the left analog stick 53A. That is, the cross key 54A is disposed at a position where it can be operated with the user's left hand. The cross key 54A has a cross shape and is a key capable of indicating at least the up, down, left, and right directions. Further, the keys 54B to 54D are provided below the LCD 51. These three keys 54B to 54D are disposed at positions where they can be operated with either the left or right hand. Further, the terminal device 7 has a power key 54M for turning on/off the power of the terminal device 7. The power of the game device 3 can also be remotely turned on/off by operating the power key 54M. The power key 54M, like the keys 54B to 54D, is provided below the LCD 51. The power key 54M is provided on the right side of the keys 54B to 54D. Thus, the power key 54M is disposed at a position where it can be operated (is easy to operate) with the right hand. Further, the four keys 54E to 54H are provided on the right side of the LCD 51, below the right analog stick 53B. That is, the four keys 54E to 54H are disposed at positions where they can be operated with the user's right hand. Furthermore, the four keys 54E to 54H are disposed so as to be in an up, down, left, and right positional relationship (with respect to the center position of the four keys 54E to 54H). Therefore, the terminal device 7 can also make the four keys 54E to 54H function as keys with which the user indicates the up, down, left, and right directions. In the present embodiment, the analog sticks 53A and 53B are disposed above the cross key 54A and the keys 54E to 54H, respectively. Here, the analog sticks 53A and 53B protrude beyond the cross key 54A and the keys 54E to 54H in the thickness direction (y-axis direction).
Therefore, if the positions of the analog stick 53A and the cross key 54A were reversed, the user's thumb might hit the analog stick 53A when operating the cross key 54A with the thumb, possibly causing an erroneous operation. The same problem would arise if the analog stick 53B and the keys 54E to 54H were arranged oppositely. In contrast, in the present embodiment, the analog sticks 53A and 53B are disposed above the cross key 54A and the keys 54E to 54H, so when the user operates the analog sticks 53A and 53B, the possibility of a finger touching the cross key 54A or the keys 54E to 54H is lower than in the above case. Thus, in the present embodiment, the possibility of erroneous operation can be reduced, and the operability of the terminal device 7 can be improved. However, in other embodiments, the analog stick 53A may be arranged oppositely to the cross key 54A, or the analog stick 53B may be arranged oppositely to the keys 54E to 54H, as necessary. Here, in the present embodiment, some operation sections (the analog sticks 53A and 53B, the cross key 54A, and the three keys 54E to 54G) are provided on the left and right sides of the display section (the LCD 51), above the center of the outer cover 50 in the up-down direction (y-axis direction). When operating these operation sections, the user mainly holds the terminal device 7 above its center in the up-down direction. If the user were to hold the lower side of the outer cover 50 (especially when the terminal device 7 is relatively large, as in the present embodiment), the held terminal device 7 would become unstable, and it would be hard for the user to hold the terminal device 7. In contrast, in the present embodiment, when operating the operation sections, the user mainly holds the terminal device 7 above its center in the up-down direction, and can also support the outer cover 50 from the side with the palm. Therefore, the user can hold the outer cover 50 in a stable state and can hold the terminal device 7 easily, so the above operation sections also become easier to operate.
In other embodiments, it suffices that at least one operation portion is provided on each of the left and right sides of the display unit above the center of the housing 50. For example, only the analog sticks 53A and 53B may be disposed above the center of the housing 50. Further, for example, when the cross key 54A is disposed above the left analog stick 53A and the four keys 54E to 54H are disposed above the right analog stick 53B, the cross key 54A and the four keys 54E to 54H may be disposed above the center of the housing 50.

Further, in the present embodiment, a projecting portion (an eaves portion 59) is provided on the back side of the housing 50 (the side opposite to the surface on which the LCD 51 is provided) (see Fig. 8(c) and Fig. 9). As shown in Fig. 8(c), the eaves portion 59 is a chevron-shaped member provided so as to project from the back surface of the generally plate-shaped housing 50. The projecting portion has a height (thickness) sufficient for the fingers of a user holding the back surface of the housing 50 to rest on it. The height of the projecting portion is preferably 10 to 25 [mm], and is 16.66 [mm] in the present embodiment. Further, the lower surface of the projecting portion preferably has an inclination of 45° or more with respect to the back surface of the housing 50, so that the projecting portion catches the user's fingers easily. As shown in the figure, the lower surface of the projecting portion may be formed to have a larger inclination angle than its upper surface. As shown in Figs. 10 and 11, by holding the terminal device 7 with the fingers resting on the eaves portion 59 (with the eaves portion 59 placed on the fingers), the user can hold the terminal device 7 in a stable state without getting tired, even though the terminal device 7 has a relatively large size. That is, the eaves portion 59 can be said to be a support member with which the housing 50 is supported by the fingers, and can also be called a finger-hook portion. Further, the eaves portion 59 is provided above the center of the housing 50 in the up-down direction. The eaves portion 59 is provided at positions generally opposite to the operation portions (the analog sticks 53A and 53B) provided on the front surface of the housing 50. That is, the projecting portion is provided in a region including the positions opposite to the operation portions provided on the left and right of the display unit.
Therefore, when operating these operation portions, the user can hold the terminal device 7 so as to support the eaves portion 59 with the middle fingers or ring fingers (see Figs. 10 and 11). The terminal device 7 is thereby easy to hold, and the operation portions are also easy to operate. In addition, in the present embodiment, the projecting portion has an eaves-like shape whose projecting part extends left and right, so that the user can hold the terminal device 7 with the middle fingers or ring fingers laid along the lower surface of the projecting portion, which makes the terminal device 7 even easier to hold. It suffices that the eaves portion 59 is formed so that its projecting part extends left and right, and it is not limited to the shape extending in the horizontal direction shown in Fig. 9. In other embodiments, the eaves portion 59 may extend in a direction slightly inclined from the horizontal direction. For example, the eaves portion 59 may be provided so as to be inclined upward (or downward) from the left and right ends toward the center. In the present embodiment, the eaves portion 59 having an eaves-like shape is adopted as the projecting portion formed on the back surface of the housing, but the projecting portion may have any shape. For example, in other embodiments, two projecting portions may be provided on the back side of the housing 50, one at the left and one at the right (with no projecting portion at the center in the left-right direction) (refer to the figure). Further, in other embodiments, the cross-sectional shape of the projecting portion (the shape of a cross section perpendicular to the x-axis direction) may be formed in a hook shape (a shape recessed on its lower side) so that the user's fingers can support the terminal device 7 more firmly (so that the fingers catch the projecting portion more firmly).
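The relationship between the projecting portion's height and the inclination of its lower surface can be pictured with a short calculation. The sketch below is not part of the patent: the function name and the sample "run" of the lower surface are assumptions made for illustration, using only the 45° guideline and the 16.66 mm height stated above.

```python
import math

def lower_surface_inclination(height_mm: float, run_mm: float) -> float:
    """Inclination of the projection's lower surface relative to the
    housing's back surface, in degrees. height_mm is how far the
    projection stands off the back surface; run_mm is the horizontal
    run of the lower surface (a smaller run gives a steeper surface)."""
    return math.degrees(math.atan2(height_mm, run_mm))

# With the 16.66 mm height of the present embodiment, any run of
# 16.66 mm or less keeps the lower surface at 45 degrees or more,
# the guideline for letting the fingers catch on the projection.
angle = lower_surface_inclination(16.66, 16.66)
print(round(angle, 1))  # 45.0
```

A hook-shaped (recessed) cross section, mentioned as a variant above, effectively pushes this inclination past 90°, which is why it catches the fingers even more firmly.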
The width of the projecting portion (the eaves portion 59) in the up-down direction may be any width. For example, the projecting portion may be formed so as to extend to the upper side of the housing 50. That is, the upper surface of the projecting portion may be formed at the same position as the upper side surface of the housing 50. In this case, the housing 50 has a two-stage configuration whose lower side is thin and whose upper side is thick. In this manner, the housing 50 preferably has downward-facing surfaces (the lower surfaces of the projecting portion) formed on the left and right sides of its back surface. Thereby, the user can easily hold the terminal device 7 by placing the fingers against these surfaces. The "downward-facing surfaces" may be formed at any position on the back surface of the housing 50, but are preferably located above the center of the housing 50. Further, as shown in Figs. 8(a), (b), and (c), a first L key 54I and a first R key 54J are provided on the left and right sides of the upper surface of the housing 50, respectively. In the present embodiment, the first L key 54I and the first R key 54J are provided at the diagonally upper portions (the upper left portion and the upper right portion) of the housing 50. Specifically, the first L key 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 and is exposed from the upper-left side surface (in other words, it is exposed from both the upper and left side surfaces). Further, the first R key 54J is provided at the right end of the upper side surface of the housing 50 and is exposed from the upper-right side surface (in other words, it is exposed from both the upper and right side surfaces).
In this manner, the first L key 54I is disposed at a position operable by the user's left index finger, and the first R key 54J is disposed at a position operable by the user's right index finger (see Fig. 10). In other embodiments, the operation portions provided on the left and right of the upper surface of the housing 50 do not have to be provided at the left and right end portions, and may be provided at positions other than the end portions. Further, the operation portions may be provided on the left and right side surfaces of the housing 50, respectively. Further, as shown in Fig. 8(c) and Fig. 9, a second L key 54K and a second R key 54L are disposed on the projecting portion (the eaves portion 59). The second L key 54K is provided near the left end of the eaves portion 59, and the second R key 54L is provided near the right end of the eaves portion 59. That is, the second L key 54K is provided slightly above center on the left side of the back surface of the housing 50 (the left side as viewed from the front side), and the second R key 54L is provided slightly above center on the right side of the back surface of the housing 50 (the right side as viewed from the front side). In other words, the second L key 54K is disposed at a position (generally) opposite to the left analog stick 53A provided on the front surface, and the second R key 54L is disposed at a position (generally) opposite to the right analog stick 53B provided on the front surface. Thus, the second L key 54K is disposed at a position operable by the user's left middle finger or index finger, and the second R key 54L is disposed at a position operable by the user's right middle finger or index finger (see Figs. 10 and 11). Further, as shown in Fig. 8(c), the second L key 54K and the second R key 54L are provided on the upper surface of the eaves portion 59.
Therefore, the second L key 54K and the second R key 54L have key surfaces that face upward (diagonally upward). Since the middle fingers or index fingers are expected to move in the up-down direction when the user holds the terminal device 7, the user can press the second L key 54K and the second R key 54L easily because the key surfaces face upward. As described above, in the present embodiment, operation portions (the analog sticks 53A and 53B) are provided on the left and right of the display unit (the LCD 51) above the center of the housing 50, and other operation portions (the second L key 54K and the second R key 54L) are provided on the back side of the housing 50 at positions opposite to those operation portions. According to this arrangement, since the former operation portions and the latter operation portions are disposed at mutually opposite positions on the front side and the back side of the housing 50, the user can hold the housing 50 so as to pinch it from the front side and the back side when operating these operation portions. Further, since the user holding the housing 50 grips it above its center in the up-down direction, the terminal device 7 can be held at its upper portion and supported by the palms (see Figs. 10 and 11). As a result, the user can stably hold the terminal device 7 while operating at least these four operation portions, so that an operation device (the terminal device 7) that the user can hold easily and that has excellent operability can be provided. As described above, in the present embodiment, the user can easily hold the terminal device 7 by gripping it with the fingers placed against the lower surface of the projecting portion (the eaves portion 59).
Further, since the second L key 54K and the second R key 54L are provided on the upper surface of the projecting portion, the user can press these keys easily in that state. The user can easily hold the terminal device 7, for example, in the following manners. That is, as shown in Fig. 10, the user can hold the terminal device 7 with the ring fingers placed against the lower surface of the eaves portion 59 (the one-dot chain line shown in Fig. 10), in other words, with the ring fingers supporting the eaves portion 59. In this case, the user can operate the four keys (the first L key 54I, the first R key 54J, the second L key 54K, and the second R key 54L) with the index fingers or middle fingers. For example, when the required game operations are numerous and complicated, many keys can be operated easily by holding the terminal device 7 as shown in Fig. 10. In addition, since the analog sticks 53A and 53B are provided above the cross key 54A and the keys 54E to 54H, the user can conveniently operate the analog sticks 53A and 53B with the thumbs when relatively complicated operations are required. Further, in Fig. 10, the user holds the terminal device 7 with the thumbs placed against the front surface of the housing 50, the index fingers placed against the upper surface of the housing 50, the middle fingers placed against the upper surface of the eaves portion 59 on the back surface of the housing 50, the ring fingers placed against the lower surface of the eaves portion 59, and the little fingers placed against the back surface of the housing 50. In this way, the user can hold the terminal device 7 firmly, as if wrapping the housing 50 from all four directions. Further, as shown in Fig. 11, the user can also hold the terminal device 7 with the middle fingers placed against the lower surface of the eaves portion 59 (the one-dot chain line shown in Fig. 11).
In this case, the user can easily operate two keys (the first L key 54I and the first R key 54J) with the index fingers. For example, when the required game operations are few and simple, the terminal device 7 may be held as shown in Fig. 11. In Fig. 11, since the user can grip the lower side of the housing 50 with two fingers of each hand (the ring finger and the little finger), the terminal device 7 can be held firmly. In the present embodiment, the lower surface of the eaves portion 59 is located between the analog sticks 53A and 53B on the one hand and the cross key 54A and the four keys 54E to 54H on the other (below the analog sticks 53A and 53B, and above the cross key 54A and the four keys 54E to 54H). Therefore, when the terminal device 7 is held with the ring fingers placed against the eaves portion 59 (Fig. 10), the analog sticks 53A and 53B are easy to operate with the thumbs, and when the terminal device 7 is held with the middle fingers placed against the eaves portion 59 (Fig. 11), the cross key 54A and the four keys 54E to 54H are easy to operate with the thumbs. That is, in either of the above two cases, the user can perform direction input operations while holding the terminal device 7 firmly. Further, as described above, the user can also hold the terminal device 7 vertically. That is, as shown in Fig. 12, the user can hold the terminal device 7 vertically by holding its upper side with the left hand. Further, as shown in Fig. 13, the user can hold the terminal device 7 vertically by holding its lower side with the left hand. Although Figs. 12 and 13 show the case where the terminal device 7 is held with the left hand, it may also be held with the right hand.
In this way, since the user can hold the terminal device 7 with one hand, it is possible, for example, to hold the terminal device 7 with one hand while making inputs on the touch panel 52 with the other hand. Further, when the terminal device 7 is held in the manner shown in Fig. 12, the user can hold it firmly by placing the fingers other than the thumb (the middle finger, ring finger, and little finger in Fig. 12) against the lower surface of the eaves portion 59 (the one-dot chain line shown in Fig. 12). In particular, in the present embodiment, since the eaves portion 59 is formed to extend left and right (up and down in Fig. 12), the user can place the fingers other than the thumb against the eaves portion 59 regardless of the position at which the upper side of the terminal device 7 is held, and can thus hold the terminal device 7 firmly. That is, when the terminal device 7 is used while held vertically, the eaves portion 59 can be used as a grip. On the other hand, when the terminal device 7 is held in the manner shown in Fig. 13, the user can operate the keys 54B to 54D with the left hand. Therefore, for example, the user can make inputs on the touch panel 52 with one hand while operating the keys 54B to 54D with the hand holding the terminal device 7, so that more operations can be performed. With respect to the terminal device 7 of the present embodiment, since the projecting portion (the eaves portion 59) is provided on the back surface, the screen of the LCD 51 (the front surface of the housing 50) is slightly inclined when the terminal device 7 is put down with the screen facing upward. This makes the screen easier to view with the terminal device 7 put down. Further, input operations on the touch panel 52 are also easier to perform with the terminal device 7 put down.
Further, in other embodiments, an additional projecting portion having a height comparable to that of the eaves portion 59 may be formed on the back surface of the housing 50. According to this, with the screen of the LCD 51 facing upward, the terminal device 7 can be put down so that the screen is horizontal, with the projecting portions in contact with the floor surface. Further, the additional projecting portion may be made detachable (or foldable). According to this, the terminal device 7 can be put down either with the screen slightly inclined or with the screen horizontal. That is, when the terminal device 7 is put down for use, the eaves portion 59 can be used as a leg. Functions corresponding to the game program are appropriately assigned to the operation keys 54A to 54L. For example, the cross key 54A and the keys 54E to 54H may be used for direction indication operations, selection operations, and the like, and the keys 54B to 54E may be used for decision operations, cancel operations, and the like. Further, the terminal device 7 may have a key for turning the screen display of the LCD 51 on/off and a key for performing a connection setting (pairing) with the game device 3. As shown in Fig. 8(a), the terminal device 7 has a marker section 55 consisting of a marker 55A and a marker 55B on the front surface of the housing 50. The marker section 55 is provided on the upper side of the LCD 51. The marker 55A and the marker 55B are each composed of one or more infrared LEDs, as are the markers 6R and 6L of the marker device 6. The infrared LEDs constituting the markers 55A and 55B are disposed inside a window portion that transmits infrared light. The marker section 55, like the marker device 6 described above, is used for the game device 3 to calculate the movement and the like of the controller 5.
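Because the operation keys 54A to 54L are general-purpose and a game program assigns their functions, such an assignment can be pictured as a simple lookup table. The sketch below is purely hypothetical: the action names are invented for illustration and follow only the example roles given above (direction/selection for the cross key 54A and the keys 54E to 54H, decision/cancel for the keys 54B to 54E).

```python
# Hypothetical per-game key assignment for the operation keys.
# Key IDs follow the reference numerals of the embodiment; the
# action names are placeholders a game program might define.
KEY_BINDINGS = {
    "54A": "direction",    # cross key: up/down/left/right indication
    "54B": "decide",
    "54C": "cancel",
    "54D": "menu",
    "54E": "direction",    # the four keys 54E-54H can also indicate directions
    "54F": "direction",
    "54G": "direction",
    "54H": "direction",
    "54I": "shoulder_l1",  # first L key
    "54J": "shoulder_r1",  # first R key
    "54K": "shoulder_l2",  # second L key (on the eaves portion 59)
    "54L": "shoulder_r2",  # second R key (on the eaves portion 59)
}

def action_for(key_id: str) -> str:
    """Look up the game action bound to an operation key."""
    return KEY_BINDINGS.get(key_id, "unbound")

print(action_for("54A"))  # direction
print(action_for("54M"))  # unbound (the power key is handled by the system)
```

A table like this keeps per-game differences out of the input-reading code: the same operation key data can drive any game simply by swapping the bindings.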
Further, the game device 3 can control the lighting of each of the infrared LEDs included in the marker section 55. The terminal device 7 has a camera 56 as imaging means. The camera 56 includes an imaging element (for example, a CMOS image sensor or a CCD image sensor) having a predetermined resolution, and a lens. As shown in Fig. 8, in the present embodiment, the camera 56 is provided on the front surface of the housing 50. Therefore, the camera 56 can image the face of the user holding the terminal device 7, and can, for example, image the user playing a game while looking at the LCD 51. In the present embodiment, the camera 56 is disposed between the two markers 55A and 55B. Further, the terminal device 7 has a microphone 79 as voice input means. A microphone hole 50c is provided in the front surface of the housing 50. The microphone 79 is provided inside the housing 50 behind the microphone hole 50c, and detects sounds around the terminal device 7, such as the user's voice. The terminal device 7 has speakers 77 as sound output means. As shown in Fig. 8(d), speaker holes 57 are provided in the lower portion of the front surface of the housing 50. The output sound of the speakers 77 is output from these speaker holes 57. In the present embodiment, the terminal device 7 has two speakers, and a speaker hole 57 is provided at each of the positions of the left speaker and the right speaker. The terminal device 7 has a knob 64 for adjusting the volume of the speakers 77. Further, the terminal device 7 has a sound output terminal 62 for connecting a sound output unit such as an earphone. Here, in consideration of an additional device being connected to the lower side surface of the housing, the sound output terminal 62 and the knob 64 are provided on the upper side surface of the housing 50, but they may be provided on the left or right side surface or on the lower side surface.
Further, a window 63 through which infrared signals from an infrared communication module 82 are emitted to the outside of the terminal device 7 is provided in the housing 50. Here, the window 63 is provided in the upper side surface of the housing 50 so that the infrared signals are emitted forward of the user when the user holds both sides of the LCD 51. In other embodiments, however, the window 63 may be provided at another position, such as in the back surface of the housing 50. Further, the terminal device 7 has an expansion connector 58 for connecting another device to the terminal device 7. The expansion connector 58 is a communication terminal for transmitting and receiving data (information) to and from another device connected to the terminal device 7. In the present embodiment, as shown in Fig. 8(d), the expansion connector 58 is provided on the lower side surface of the housing 50. The other device connected to the expansion connector 58 may be any device, for example a controller used for a specific game (a gun-type controller or the like) or an input device such as a keyboard. If there is no need to connect an additional device, the expansion connector 58 may be omitted. The expansion connector 58 may include a terminal for supplying power to the additional device and a terminal for charging. Further, in addition to the expansion connector 58, the terminal device 7 has a charging terminal 66 for obtaining power from an additional device. When the charging terminal 66 is connected to a stand 210 described later, power is supplied from the stand 210 to the terminal device 7. In the present embodiment, the charging terminal 66 is provided on the lower side surface of the housing 50. Therefore, when the terminal device 7 and an additional device (for example, the input device 2 shown in Fig. 15 or the input device 22 shown in Fig. 17) are connected to each other, power can be supplied from one to the other in addition to exchanging information via the expansion connector 58. Moreover, the terminal device 7 has a charging connector, and the housing 50 has a cover portion 61 for protecting the charging connector. In consideration of an additional device being attached to the lower side surface of the housing 50, the charging connector (the cover portion 61) is provided on the upper side surface of the housing 50, but it may be provided on the left or right side surface or on the lower side surface. Further, the terminal device 7 has a battery cover 67 that is detachable from the housing 50. A battery (a battery 85 shown in Fig. 14) is disposed inside the battery cover 67. In the present embodiment, the battery cover 67 is provided on the back side of the housing 50, below the projecting portion (the eaves portion 59). Further, holes 65a and 65b for tying a strap are provided in the housing 50 of the terminal device 7. As shown in Fig. 8(d), in the present embodiment, the holes 65a and 65b are provided in the lower side surface of the housing 50, one on the left and one on the right. That is, the hole 65a is provided to the left of the center of the lower side surface of the housing 50, and the hole 65b is provided to the right of the center. The user can tie a strap to either of the holes 65a and 65b and fasten the strap to the user's wrist. Thus, even if the user drops the terminal device 7 or it slips out of the user's hand, the terminal device 7 is prevented from falling or colliding with other objects. Since the holes are provided on both the left and right sides in the present embodiment, the user can fasten the strap to either hand, which is very convenient. With respect to the terminal device 7 shown in Figs. 8 to 13, the shape of each operation key and of the housing 50, the number of components, the installation positions, and so on are merely examples, and other shapes, numbers, and installation positions are possible.
Next, the internal configuration of the terminal device 7 will be described with reference to Fig. 14. Fig. 14 is a block diagram showing the internal configuration of the terminal device 7. As shown in Fig. 14, in addition to the components shown in Fig. 8, the terminal device 7 includes a touch panel controller 71, a magnetic sensor 72, an acceleration sensor 73, a gyro sensor 74, a user interface controller (UI controller) 75, a codec LSI 76, speakers 77, a sound IC 78, a microphone 79, a wireless module 80, an antenna 81, an infrared communication module 82, a flash memory 83, a power supply IC 84, and a battery 85. These electronic components are mounted on an electronic circuit board and housed in the housing 50. The UI controller 75 is a circuit for controlling the input and output of data to and from the various input/output sections. The UI controller 75 is connected to the touch panel controller 71, the analog sticks 53 (the analog sticks 53A and 53B), the operation keys 54 (the operation keys 54A to 54L), the marker section 55, the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74. Further, the UI controller 75 is connected to the codec LSI 76 and the expansion connector 58. Further, the power supply IC 84 is connected to the UI controller 75, and power is supplied to each section via the UI controller 75. The built-in battery 85 is connected to the power supply IC 84 and supplies power. Further, a charger 86 or a cable that obtains power from an external power source can be connected to the power supply IC 84 via the charging connector, so that the terminal device 7 can be supplied with power and charged from the external power source using the charger 86 or the cable. The terminal device 7 can also be charged by being attached to a charger (not shown) having a charging function. That is, although not shown in the drawing, the charger (the stand 210 shown in Fig.
20) that obtains power from an external power source can be connected to the power supply IC 84 via the charging terminal 66, so that the terminal device 7 can be supplied with power and charged from the external power source using that charger. The touch panel controller 71 is a circuit that is connected to the touch panel 52 and controls the touch panel 52. The touch panel controller 71 generates touch position data in a predetermined format based on signals from the touch panel 52 and outputs it to the UI controller 75. The touch position data represents the coordinates of the position at which an input was made on the input surface of the touch panel 52. The touch panel controller 71 reads signals from the touch panel 52 and generates touch position data once every predetermined period of time. Further, various control instructions for the touch panel 52 are output from the UI controller 75 to the touch panel controller 71. The analog sticks 53 output, to the UI controller 75, stick data representing the direction and amount by which the stick portion operated by the user's finger slides (or tilts). Further, the operation keys 54 output, to the UI controller 75, operation key data representing the input state of each of the operation keys 54A to 54L (whether or not it is pressed). The magnetic sensor 72 detects an azimuth by sensing the magnitude and direction of the magnetic field. Azimuth data representing the detected azimuth is output to the UI controller 75. Further, control instructions for the magnetic sensor 72 are output from the UI controller 75 to the magnetic sensor 72. For the magnetic sensor 72, a sensor using an MI (magnetic impedance) element, a fluxgate sensor, a Hall element, a GMR (giant magnetoresistance) element, a TMR (tunnel magnetoresistance) element, an AMR (anisotropic magnetoresistance) element, or the like may be used, as long as it can detect an azimuth.
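As a rough illustration of how an azimuth can be derived from the magnitude and direction of the magnetic field, the sketch below computes a heading from the two horizontal field components. This is a generic textbook calculation, not the magnetic sensor 72's actual processing; the axis convention and the function name are assumptions made for illustration, and it ignores tilt compensation.

```python
import math

def heading_degrees(mag_x: float, mag_y: float) -> float:
    """Azimuth in degrees in [0, 360) from the horizontal components
    of the magnetic field, where 0 means the +x axis points toward
    magnetic north. Assumes the device is held level."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

print(heading_degrees(1.0, 0.0))  # 0.0   (x axis toward north)
print(heading_degrees(0.0, 1.0))  # 90.0  (rotated a quarter turn)
```

As the text notes, in the presence of a magnetic field other than geomagnetism, the absolute value returned here would be wrong, but its change over time still reflects rotation of the device.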
Strictly speaking, in a place where a magnetic field other than geomagnetism is generated, the obtained azimuth data does not represent the true azimuth; even in such a case, however, the azimuth data changes when the terminal device 7 moves, so that changes in the attitude of the terminal device 7 can still be calculated. The acceleration sensor 73 is provided inside the housing 50 and detects the magnitude of the linear acceleration along each of three axes (the xyz axes shown in Fig. 8(a)). Specifically, the acceleration sensor 73 detects the value of the linear acceleration along each axis, where the long-side direction of the housing 50 is the x-axis, the direction perpendicular to the front surface of the housing 50 is the y-axis, and the short-side direction of the housing 50 is the z-axis. Acceleration data representing the detected accelerations is output to the UI controller 75. Further, control instructions for the acceleration sensor 73 are output from the UI controller 75 to the acceleration sensor 73. In the present embodiment, the acceleration sensor 73 is, for example, a capacitance-type MEMS acceleration sensor, but in other embodiments, an acceleration sensor of another type may be used. Further, the acceleration sensor 73 may be an acceleration sensor that detects acceleration along one or two axes. The gyro sensor 74 is provided inside the housing 50 and detects the angular velocities about the three axes, namely the x-axis, the y-axis, and the z-axis. Angular velocity data representing the detected angular velocities is output to the UI controller 75. Further, control instructions for the gyro sensor 74 are output from the UI controller 75 to the gyro sensor 74. Any number and combination of gyro sensors may be used to detect the angular velocities about the three axes; like the gyro sensor 48, the gyro sensor 74 may be composed of a two-axis gyro sensor and a one-axis gyro sensor.
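To illustrate one use of the three-axis acceleration values, the sketch below estimates the housing's tilt from gravity when the device is at rest. It follows the axis convention stated above (x along the long side, y perpendicular to the front surface, z along the short side); the function itself is an illustrative assumption, not a procedure taken from the patent.

```python
import math

def tilt_from_gravity(ax: float, ay: float, az: float) -> float:
    """Angle in degrees between the housing's front-surface normal
    (the y-axis) and gravity, from accelerometer readings taken at
    rest. 0 deg means the device lies flat with the screen up."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no acceleration measured")
    # The y component is the projection of gravity onto the y-axis;
    # clamp before acos to guard against rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, ay / g))))

print(round(tilt_from_gravity(0.0, 1.0, 0.0), 1))  # 0.0: flat, screen up
print(round(tilt_from_gravity(1.0, 0.0, 0.0), 1))  # 90.0: stood on its side
```

A reading like this is only meaningful while the device is not being accelerated by the user; during motion, the gyro sensor 74's angular velocities are the more direct measure of attitude change.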
Further, the gyro sensor 74 may be a gyro sensor that detects angular velocity about one or two axes. The UI controller 75 outputs, to the codec LSI 76, operation data including the touch position data, the stick data, the operation key data, the azimuth data, the acceleration data, and the angular velocity data received from the components described above. When another device is connected to the terminal device 7 via the expansion connector 58, the operation data may further include data representing operations performed on that device. The codec LSI 76 is a circuit that compresses the data to be transmitted to the game device 3 and decompresses the data transmitted from the game device 3. The codec LSI 76 is connected to the LCD 51, the camera 56, the sound IC 78, the wireless module 80, the flash memory 83, and the infrared communication module 82. The codec LSI 76 also includes a CPU 87 and an internal memory 88. Although the terminal device 7 is configured not to perform game processing itself, it needs to execute at least a minimal program for its own management and communication. When the power is turned on, the program stored in the flash memory 83 is read into the internal memory 88 and executed by the CPU 87, whereby the terminal device 7 starts up. Part of the area of the internal memory 88 is used as VRAM for the LCD 51. The camera 56 captures images in accordance with instructions from the game device 3 and outputs the captured image data to the codec LSI 76. Further, control instructions for the camera 56, such as image capture instructions, are output from the codec LSI 76 to the camera 56. The camera 56 can also capture moving images. That is, the camera 56 can capture images repeatedly and repeatedly output the image data to the codec LSI 76.
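Since the operation data that the UI controller 75 forwards to the codec LSI 76 bundles several fixed-size readings (touch position, stick values, key states, azimuth, acceleration, angular velocity), one sample can be pictured as a flat record. The layout below is purely a hypothetical sketch; the patent does not specify field widths, ordering, or encoding.

```python
import struct

# Hypothetical flat layout for one operation-data sample:
# touch x/y (int16), two sticks (float x, y each), a 16-bit key
# bitmap for the keys 54A-54L, azimuth, 3-axis acceleration, and
# 3-axis angular velocity (all float32, little-endian, unpadded).
OP_FORMAT = "<hh" + "ffff" + "H" + "f" + "fff" + "fff"

def pack_operation_data(touch, sticks, keys_bitmap, azimuth, accel, gyro):
    """Serialize one sample into a fixed-size byte string."""
    return struct.pack(OP_FORMAT, touch[0], touch[1],
                       sticks[0][0], sticks[0][1],
                       sticks[1][0], sticks[1][1],
                       keys_bitmap, azimuth, *accel, *gyro)

def unpack_operation_data(blob):
    """Recover the flat tuple of fields from a serialized sample."""
    return struct.unpack(OP_FORMAT, blob)

sample = pack_operation_data((320, 240), ((0.5, -0.5), (0.0, 1.0)),
                             0b1, 90.0, (0.0, 1.0, 0.0), (0.0, 0.0, 0.0))
print(len(sample), unpack_operation_data(sample)[:2])  # 50 (320, 240)
```

A fixed layout like this keeps every sample the same size, which suits a link that must carry operation data at a steady rate alongside compressed image and sound data.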
The sound IC 78 is connected to the speaker 77 and the microphone 79, and is a circuit that controls the input and output of sound data to and from the speaker 77 and the microphone 79. That is, when sound data is received from the codec LSI 76, the sound IC 78 outputs a sound signal obtained by D/A-converting the sound data to the speaker 77, so that sound is output from the speaker 77. The microphone 79 detects sound transmitted to the terminal device 7 (such as the user's voice) and outputs a sound signal representing that sound to the sound IC 78. The sound IC 78 A/D-converts the sound signal from the microphone 79 and outputs sound data of a predetermined format to the codec LSI 76. The codec LSI 76 transmits the image data from the camera 56, the sound data from the microphone 79, and the operation data from the UI controller 75 as terminal operation data to the game device 3 via the wireless module 80. In the present embodiment, the codec LSI 76 performs the same compression processing as the codec LSI 27 on the image data and the sound data. The terminal operation data and the compressed image data and sound data are output to the wireless module 80 as transmission data. The antenna 81 is connected to the wireless module 80, and the wireless module 80 transmits the transmission data to the game device 3 via the antenna 81. The wireless module 80 has the same function as the terminal communication module 28 of the game device 3; that is, the wireless module 80 has a function of connecting to a wireless LAN by a method conforming, for example, to the IEEE 802.11n standard. The transmitted data may be encrypted as necessary, or may be transmitted without encryption. As described above, the transmission data transmitted from the terminal device 7 to the game device 3 includes operation data (terminal operation data), image data, and sound data.
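The flow described above, in which the codec LSI 76 compresses the bulky image and sound streams but passes the small operation data through unchanged before handing everything to the wireless module 80, can be sketched as follows. Here zlib merely stands in for the LSI's compression scheme, which the patent does not specify, and the packet layout is an assumption.

```python
import zlib

# Illustrative sketch of assembling the transmission data sent to the
# game device 3: camera image data and microphone sound data are
# compressed (zlib stands in for the codec LSI 76's actual scheme),
# while the comparatively small operation data is included as-is.

def build_transmission_data(image_bytes, sound_bytes, operation_bytes):
    return {
        "operation": operation_bytes,        # terminal operation data
        "image": zlib.compress(image_bytes), # compressed image data
        "sound": zlib.compress(sound_bytes), # compressed sound data
    }

packet = build_transmission_data(b"\x00" * 1024, b"\x7f" * 256, b"op-data")
# Repetitive data compresses well, so the compressed image payload is
# much smaller than the 1024-byte original.
```

Compressing only the image and sound streams keeps the latency-sensitive operation data cheap to produce while still reducing the bulk of the wireless traffic.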
When another device is connected to the terminal device 7 via the expansion connector 58, the above-mentioned transmission data may further include data received from that device. In addition, the infrared communication module 82 can perform infrared communication with another device in accordance, for example, with the IrDA standard. The codec LSI 76 may include data received via infrared communication in the above-mentioned transmission data as necessary and transmit it to the game device 3. Further, as described above, compressed image data and sound data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI 76 via the antenna 81 and the wireless module 80. The codec LSI 76 decompresses the received image data and sound data. The decompressed image data is output to the LCD 51, and an image is displayed on the LCD 51; that is, the codec LSI 76 (the CPU 87) displays the received image data on the display section. The decompressed sound data is output to the sound IC 78, and the sound IC 78 outputs sound from the speaker 77. Further, when control data is included in the data received from the game device 3, the codec LSI 76 and the UI controller 75 give control instructions to the respective units in accordance with the control data. As described above, the control data represents control instructions for the components included in the terminal device 7 (in the present embodiment, the camera 56, the touch panel controller 71, the marker section 55, the sensors 72 to 74, and the infrared communication module 82). In the present embodiment, the control instructions represented by the control data may be instructions to operate the above-described components or to suspend (stop) their operation. That is, components that are not used in a game may be suspended in order to reduce power consumption.
In that case, the data transmitted from the terminal device 7 to the game device 3 is set so as not to include data from the suspended components. Since the marker section 55 consists of infrared LEDs, its control may simply be turning its power supply on and off. As described above, the terminal device 7 includes the touch panel 52, the analog sticks 53, and the operation keys 54 as operation means, but in other embodiments, other operation means may be provided instead of, or in addition to, these operation means. The terminal device 7 also includes the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74 as sensors for calculating the movement of the terminal device 7 (including its position and attitude, or changes in its position and attitude); in other embodiments, however, it may be configured to include only one or two of these sensors. In still other embodiments, other sensors may be provided instead of, or in addition to, these sensors. Further, the terminal device 7 is configured to include the camera 56 and the microphone 79, but in other embodiments, it may include neither of them, or only one of them. The terminal device 7 also includes the marker section 55 as a configuration for calculating the positional relationship between the terminal device 7 and the controller 5 (such as the position and/or attitude of the terminal device 7 as seen from the controller 5), but in other embodiments, the marker section 55 may be omitted. In still other embodiments, the terminal device 7 may include other means as a configuration for calculating this positional relationship. For example, in another embodiment, the controller 5 may include a marker section and the terminal device 7 may include an imaging element; in that case, the marker device 6 may also include an imaging element instead of infrared LEDs.
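The power-saving control just described, in which components not used by the current game are suspended and their data is omitted from the transmission data, can be modeled with a short sketch. The component names and the set-based bookkeeping are assumptions made for illustration only.

```python
# Illustrative sketch of applying control data from the game device 3:
# components not used by the current game are suspended to save power,
# and readings from suspended components are omitted from the
# transmission data. Names and data structures are assumptions.

COMPONENTS = {"camera", "touch_panel", "marker", "magnetic_sensor",
              "acceleration_sensor", "gyro_sensor", "infrared_module"}

def apply_control_data(active, control):
    """Return the new set of active components.

    control maps a component name to True (operate) or False (suspend).
    """
    active = set(active)
    for name, operate in control.items():
        if operate:
            active.add(name)
        else:
            active.discard(name)
    return active

def filter_transmission(readings, active):
    """Drop readings that come from suspended components."""
    return {name: value for name, value in readings.items()
            if name in active}

# A game that uses neither the camera nor the marker suspends both.
active = apply_control_data(COMPONENTS, {"camera": False, "marker": False})
readings = {"camera": b"jpeg...", "gyro_sensor": (0.0, 0.0, 0.1)}
sent = filter_transmission(readings, active)
```

Because the filter runs on the terminal side, suspended components cost neither sensor power nor wireless bandwidth.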
(Configuration of Attachment Devices) Next, examples of attachment devices that can be attached to (connected with) the terminal device 7 will be described with reference to Figs. 15 to 20. An attachment device may have any function; it may be, for example, an additional operation device attached to the terminal device 7 for performing predetermined operations, a charger for supplying power to the terminal device 7, or a stand for holding the terminal device 7 at a predetermined angle. As shown in Figs. 8(d) and 9, locking holes 59a and 59b, with which claws of an attachment device can engage, are provided on the lower surface of the projection (the eaves portion 59). The locking holes 59a and 59b are used when connecting another attachment device to the terminal device 7. That is, the attachment device has claws that can engage with the locking holes 59a and 59b, and when the attachment device is connected to the terminal device 7, the claws engage with the locking holes 59a and 59b, whereby the terminal device 7 and the attachment device are fixed to each other. Screw holes may further be provided inside the locking holes 59a and 59b so that the attachment device can be firmly fixed with screws. Here, the projection provided on the back surface of the terminal device 7 is the eaves portion 59, which has an eaves-like shape extending in the left-right direction. As shown in Fig. 9, the locking holes 59a and 59b are provided near the center of the lower surface of the eaves portion 59 (with respect to the left-right direction). The number of locking holes 59a and 59b provided on the lower surface of the eaves portion 59 may be arbitrary; when there is one locking hole, it is preferably located at the center of the eaves portion 59 in the left-right direction. In this way, an attachment device can be connected stably while the left-right balance is maintained.
Further, when the locking hole is provided near the center, the size of the attachment device can be reduced compared with when locking holes are provided at the left and right ends. That is, the eaves portion 59 can be used as a locking member for an attachment device. Further, in the present embodiment, as shown in Fig. 8(d), locking holes 50a and 50b are provided on the lower surface of the housing 50. Therefore, when an attachment device is connected to the terminal device 7, four claws engage with the four locking holes, respectively, to fix the terminal device 7 and the attachment device to each other. Thereby, the attachment device can be firmly fixed to the terminal device 7. Screw holes may also be provided inside the locking holes 50a and 50b so that the attachment device can be fixed with screws. In other embodiments, the locking holes provided in the housing may be arranged arbitrarily. Figs. 15 and 16 show an example in which an attachment device is attached to the terminal device 7. Fig. 15 is a view of the terminal device 7 and an input device 200 as seen from the front side of the terminal device 7, and Fig. 16 is a view of the terminal device 7 and the input device 200 as seen from the back side of the terminal device 7. In Figs. 15 and 16, the input device 200, which is an example of an attachment device, is attached to the terminal device 7. The input device 200 includes a first grip portion 200a and a second grip portion 200b. Each of the grip portions 200a and 200b has a rod-like (columnar) shape, and the user can hold the input device 200 (and the terminal device 7) while gripping either one of the grip portions 200a and 200b with one hand, or while gripping both. The input device 200 may also be configured to include only one grip portion. Further, the input device 200 includes a support portion 205. In the present embodiment, the support portion 205 supports the back surface of the terminal device 7. Specifically, the support portion 205 has four claws (projections), and the four claws can engage with the locking holes 50a, 50b, 59a, and 59b, respectively. As shown in Fig. 15, when the input device 200 is connected to the terminal device 7, the four claws engage with the four locking holes 50a, 50b, 59a, and 59b, respectively, to fix the terminal device 7 and the input device 200 to each other. Thereby, the input device 200 can be firmly fixed to the terminal device 7. In other embodiments, in addition to (or instead of) the engagement of the claws with the locking holes, the input device 200 and the terminal device 7 may be fixed with screws or the like, so that the input device 200 is fixed to the terminal device 7 even more firmly. The screws may be fixed at any positions; for example, the support portion 205 of the input device 200, which abuts against the back surface of the housing 50, may be screwed to the eaves portion 59. Thus, in the present embodiment, the attachment device can be firmly fixed to the terminal device 7 by means of the locking holes 59a and 59b. Since the terminal device 7 has sensors for detecting its movement and inclination (the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74), the terminal device 7 itself can be moved for use. For example, when the input device 200 shown in Figs. 15 and 16 is connected to the terminal device 7, the user may hold the grip portion 200a and/or the grip portion 200b of the input device 200 and move the input device 200 to operate it as if it were a gun. When the terminal device 7 itself is expected to be moved for use, as in the present embodiment, it is particularly effective for the attachment device to be firmly fixed via the locking holes 59a and 59b.
Further, in the present embodiment, the support portion 205 detachably supports the terminal device 7 in such a manner that the screen of the LCD 51 is substantially vertical when the first grip portion 200a (or the second grip portion 200b) is oriented in the vertical direction. Each of the grip portions 200a and 200b is formed to extend substantially parallel to the display section of the terminal device 7 connected to the input device 200 (that is, to the front surface of the housing 50). In other words, each of the grip portions 200a and 200b extends in the up-down direction of the display section of the terminal device 7 connected to the input device 200. Thus, the input device 200 is connected to the terminal device 7 in such a posture that the display section of the terminal device 7 faces the user when the user holds the input device 200. By holding at least one of the grip portions 200a and 200b substantially vertically, the user can face the screen of the display section toward himself or herself, and can therefore operate the input device 200 while viewing the screen of the display section. In the present embodiment, the second grip portion 200b extends substantially parallel to the first grip portion 200a, but in other embodiments, it is sufficient that at least one grip portion is formed to extend substantially parallel to the screen of the LCD 51. In this way, by holding that grip portion, the user can easily hold the input device 200 (and the terminal device 7) with the screen facing toward himself or herself. Further, in the above-described embodiment, the support portion 205 is provided on the connecting member 206 that connects the first grip portion 200a and the second grip portion 200b. That is, since the support portion 205 is provided between the two grip portions 200a and 200b, the terminal device 7 connected to the input device 200 is disposed between the two grip portions 200a and 200b.
At this time, the center of gravity of the operation device (operation system) constituted by the terminal device 7 and the input device 200 is located between the two grip portions 200a and 200b, so the user can easily hold the operation device by gripping the two grip portions 200a and 200b. In the above-described embodiment, one grip portion (the first grip portion 200a) is provided at a position on the front side of the screen of the terminal device 7 mounted on the input device 200, and the other grip portion (the second grip portion 200b) is provided at a position on the back side of that screen. Therefore, the user can easily hold the operation device by gripping the two grip portions like a gun, with one hand in front of the screen and the other hand behind it. Accordingly, the above-described operation device is particularly suited, for example, to a shooting game in which game operations are performed while the operation device is treated as a gun. Further, the input device 200 includes a first key 201, a second key 202, a third key 203, and a stick 204 as operation sections. The keys 201 to 203 are keys (buttons) that can be pressed by the user, and the stick 204 is a device capable of indicating a direction. These operation sections are preferably provided at positions that can be operated by the fingers of a hand gripping a grip portion. In the present embodiment, the first key 201, the second key 202, and the stick 204 are provided at positions that can be operated by the fingers of the hand gripping the first grip portion 200a, and the third key 203 is provided at a position that can be operated by the index finger of the hand gripping the second grip portion 200b. The input device 200 may also include an imaging device (imaging section). For example, the input device 200 may be configured similarly to the imaging information calculation section 35 provided in the controller 5 described above. At this time, the imaging element of the imaging information calculation section may be arranged to face the front of the input device 200 (the direction behind the screen of the terminal device 7).
For example, an infrared filter may be arranged at the position of the third key 203 in place of the third key 203, with the imaging element arranged inside it. In this case, by using the input device 200 with its front side facing the television 2 (the marker device 6), the game device 3 can calculate the orientation and position of the input device 200. Therefore, the user can perform an operation of pointing the input device 200 in a desired direction, and can perform intuitive and easy operations using the input device 200. The input device 200 may also be configured to include a camera similar to the camera 56 instead of the imaging information calculation section. At this time, like the imaging element described above, the camera may be arranged to face the front of the input device 200. In this case, by using the input device 200 with its front side facing the television 2 (the marker device 6), the user can capture images in the imaging direction opposite to that of the camera 56 of the terminal device 7. Further, the input device 200 includes a connector (not shown), and the connector is connected to the extension connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. Thereby, data can be exchanged between the input device 200 and the terminal device 7. For example, data representing operations performed on the input device 200 and data representing the imaging results of the above-described imaging device may be transmitted to the terminal device 7. At this time, the terminal device 7 may wirelessly transmit, to the game device 3, the data representing operations performed on the terminal device 7 together with the data transmitted from the input device. Further, the input device 200 may include a charging terminal that is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 200.
In this case, when the terminal device 7 is mounted on the input device 200, power can be supplied from one device to the other. For example, the input device 200 may be connected to a charger, and the terminal device 7 may be charged by obtaining power from the charger via the input device 200. The input device may also be configured, for example, as follows. Fig. 17 is a view showing another example of the input device, and Figs. 18 and 19 are views showing the input device 220 shown in Fig. 17 attached to the terminal device 7. Fig. 18 is a view of the terminal device 7 and the input device 220 as seen from the back side of the terminal device 7, and Fig. 19 is a view of the terminal device 7 and the input device 220 as seen from the front side of the terminal device 7. To the terminal device 7, the input device 220 of Fig. 17, for example, may also be attached. The input device 220 is described below. In Figs. 17 to 19, components corresponding to those of the input device 200 shown in Figs. 15 and 16 are denoted by the same reference numerals as in Figs. 15 and 16, and their detailed description is omitted. As shown in Fig. 17, the input device 220, like the input device 200, includes a first grip portion 200a and a second grip portion 200b. Therefore, the user can use the input device 220 (and the terminal device 7) while gripping only one of the grip portions 200a and 200b, or while gripping both. The input device 220 also includes a support portion 205 similar to that of the input device 200. The support portion 205, like that of the input device 200, has four claws (only three claws 205a to 205c are shown in Fig. 17). Of these claws, the upper two claws 205a and 205b can engage with the locking holes 59a and 59b of the terminal device 7, respectively.
The lower two claws can engage with the locking holes 50a and 50b of the terminal device 7. The claw not shown is provided at a position symmetrical to the claw 205c in the left-right direction (the left-right direction of the terminal device 7 mounted on the support portion 205). As shown in Figs. 18 and 19, when the input device 220 is connected to the terminal device 7, the four claws engage with the locking holes 50a, 50b, 59a, and 59b, respectively, to fix the terminal device 7 and the input device 220 to each other. Thereby, the input device 220 can be firmly fixed to the terminal device 7. In other embodiments, in addition to (or instead of) the engagement of the claws with the locking holes, the input device 220 and the terminal device 7 may be fixed with screws, so that the input device 220 is fixed to the terminal device 7 even more firmly. For example, screw holes may be provided inside the locking holes 50a and 50b, and the two lower claws may be screwed to the locking holes 50a and 50b. The screws may be fixed at any positions. As described above, the input device 220, like the input device 200, can be firmly fixed to the terminal device 7. Further, in the input device 220, as in the input device 200, the support portion 205 detachably supports the terminal device 7 in such a manner that the screen of the LCD 51 is substantially vertical when the first grip portion 200a (or the second grip portion 200b) is oriented in the vertical direction. Each of the grip portions 200a and 200b is formed to extend substantially parallel to the display section of the terminal device 7 connected to the input device 220 (the front surface of the housing 50).
Therefore, by holding at least one of the grip portions 200a and 200b substantially vertically, the user can face the screen of the display section toward himself or herself, and can operate the input device while viewing the screen of the display section. Further, in the input device 220, as in the input device 200, the support portion 205 supports the terminal device 7 above the grip portions, so the screen is in a position easy to view for a user holding a grip portion. In other embodiments, at least one grip portion may be formed to extend substantially parallel to the screen of the LCD 51. In the input device 220, the shape of the connecting portion differs from that of the input device 200. The connecting portion 209 shown in Fig. 17 is connected to the upper and lower sides of the first grip portion 200a, and is connected to the upper side (upper end) of the second grip portion 200b. Also in the input device 220, as in the input device 200, the connecting portion 209 is formed to protrude forward beyond the second grip portion 200b. In the input device 220, as in the input device 200, the support portion 205 is provided on the connecting member 209 that connects the first grip portion 200a and the second grip portion 200b. Therefore, the user can easily hold the operation device by gripping the two grip portions 200a and 200b. Further, the connecting portion 209 has a member extending downward from its connection with the support portion 205. When the screen of the LCD 51 of the terminal device 7 connected to the support portion 205 is oriented substantially vertically, this member extends in a substantially vertical direction; that is, the member extends substantially parallel to the grip portions 200a and 200b.
Therefore, even when the user holds this member as a grip portion, by holding the member substantially vertically, the user can operate the input device 220 while viewing the screen of the LCD 51. Further, since the member is disposed below the support portion 205, holding the member places the screen in a position easy for the user to view. The input device 220, like the input device 200, has one grip portion (the first grip portion 200a) disposed at a position on the front side of the screen of the terminal device 7 mounted on the input device 220, and the other grip portion (the second grip portion 200b) disposed at a position on the back side of the screen. Therefore, like the input device 200, the input device 220 is easy to hold like a gun, and is particularly suited to a shooting game in which game operations are performed while the operation device is treated as a gun. Further, the input device 220 includes, as operation sections, a fourth key 207 in addition to the second key 202 and the stick 204, which are the same as those of the input device 200. The second key 202 and the stick 204 are provided on the upper side of the first grip portion 200a, as in the input device 200. The fourth key 207 is a key (button) that can be pressed by the user, and is provided on the upper side of the second grip portion 200b; that is, the fourth key 207 is provided at a position that can be operated by the index finger of the hand gripping the second grip portion 200b. The input device 220 includes an imaging element (imaging device). Here, the input device 220 is configured similarly to the imaging information calculation section 35 provided in the controller 5 described above. At this time, the imaging element of the imaging information calculation section may be arranged to face the front of the input device 220 (the direction behind the screen of the terminal device 7).
Specifically, a window portion (infrared filter) 208 is provided at the front of the input device 220 (the front end portion of the connecting portion 206), and the imaging element is provided inside the window portion 208 so as to capture images in the direction ahead of the window portion 208. In this case, by using the input device 220 with its front side facing the television 2 (the marker device 6), the game device 3 can calculate the orientation and position of the input device 220. Therefore, the user can perform an operation of pointing the input device 220 in a desired direction, and can perform intuitive and easy operations using the input device 220. The input device 220 may also be configured to include a camera similar to the camera 56 instead of the imaging information calculation section. In this case, by using the input device 220 with its front side facing the television 2 (the marker device 6), the user can capture images in the imaging direction opposite to that of the camera 56 of the terminal device 7. The input device 220, like the input device 200, includes a connector (not shown), and the connector is connected to the extension connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 220. Thereby, data can be exchanged between the input device 220 and the terminal device 7. Therefore, data representing operations performed on the input device 220 and data representing the imaging results of the above-described imaging device may be transmitted to the game device 3 via the terminal device 7. In other embodiments, the input device 220 may be configured to communicate directly with the game device 3. That is, data representing operations performed on the input device 220 may be transmitted directly from the input device 220 to the game device 3, for example using Bluetooth (registered trademark) technology or the like, as in the wireless communication between the controller 5 and the game device 3.
At this time, data representing operations performed on the terminal device 7 is transmitted from the terminal device 7 to the game device 3. Further, the input device 220, like the input device 200, may include a charging terminal that is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 220. Further, in other embodiments, an operation device in which the terminal device 7 and the input device 200 (or the input device 220) are integrated may be provided. In this case, it is not necessary to provide the terminal device 7 with the locking holes 50a, 50b, 59a, and 59b, or the input device 200 with the claws, as a mechanism for detachably connecting the terminal device 7 and the input device 200. Fig. 20 is a view showing another example in which an attachment device is connected to the terminal device 7. In Fig. 20, the terminal device 7 is connected to (mounted on) a stand 210, which is an example of an attachment device. The stand 210 is a support device for placing (supporting) the terminal device 7 at a predetermined angle. The stand 210 includes a support member 211, a charging terminal 212, and guide members 213a and 213b. In the present embodiment, the stand 210 also functions as a charger and thus includes the charging terminal 212. The charging terminal 212 is a terminal that can be connected to the charging terminal 66 of the terminal device 7. In the present embodiment, each of the charging terminals 66 and 212 is a metal terminal, but one of them may be a connector that can be connected to the other. When the terminal device 7 is connected to the stand 210, the charging terminal 212 of the stand 210 comes into contact with the charging terminal 66 of the terminal device 7, and power can be supplied from the stand 210 to the terminal device 7 for charging.
The support member 211 supports the back surface of the terminal device 7 at a predetermined angle. The support member 211 supports a predetermined surface (here, the back surface) of the housing 50 when the terminal (the charging terminal 66) of the terminal device 7 is connected to the terminal (the charging terminal 212) of the stand 210. As shown in Fig. 20, the support member 211 includes a wall portion 211a and a groove portion 211b. The support member 211 supports the housing 50 with the wall portion 211a in such a manner that the back surface of the housing 50 lies along a predetermined support surface (here, the surface formed by the wall portion 211a). The groove portion 211b is a portion into which a part (here, the lower side portion) of the housing 50 is inserted when the terminal device 7 is connected to the stand 210. Accordingly, the groove portion 211b is formed so as to roughly fit the shape of that part of the housing 50, and extends in a direction parallel to the support surface. Further, the guide members 213a and 213b are members that can be inserted into the locking holes 50a and 50b of the terminal device 7 and that determine the position at which the terminal device 7 is connected to the stand 210. The guide members 213a and 213b are provided at positions corresponding to the locking holes 50a and 50b of the terminal device 7; that is, the guide members 213a and 213b are provided at positions where they are inserted into the locking holes 50a and 50b when the terminal device 7 and the stand 210 are correctly connected. When the terminal device 7 and the stand 210 are correctly connected, the charging terminal 212 of the stand 210 is connected to the charging terminal 66 of the terminal device 7. Further, parts of the guide members 213a and 213b are provided so as to protrude from the bottom surface of the groove portion 211b.
That is, parts of the guide members 213a and 213b protrude upward from the surface of the support member 211. When the terminal device 7 is connected to the stand 210, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b, respectively. In the present embodiment, each of the guide members 213a and 213b is a rotatable wheel member (roller), and each is rotatable in a predetermined direction. Here, the predetermined direction is a direction parallel to the support surface (and horizontal); in other words, it is the left-right direction of the terminal device 7 when the terminal device 7 is connected to the stand 210. The guide member may be any rotating member that is rotatable in the predetermined direction. For example, in other embodiments, the guide member may be a sphere rotatably supported by a spherical recess. Further, in the present embodiment, the number of guide members is two, but guide members may be provided in a number corresponding to the number of locking holes provided in the lower surface of the terminal device 7; the stand 210 may have one guide member, or three or more. When the terminal device 7 is connected to the stand 210, the terminal device 7 can be placed on the stand 210 at a predetermined angle by bringing the back surface of the terminal device 7 into contact with the support member 211. That is, a part of the lower side of the housing 50 is inserted into the groove portion 211b, and the wall portion 211a supports the back surface of the housing 50, whereby the terminal device 7 can be placed on the stand 210 at a predetermined angle. Therefore, in the present embodiment, the position of the terminal device 7 in the direction perpendicular to the predetermined direction is set correctly by the support member 211.
Here, when the terminal device 7 is connected to the cradle 210 and the terminal device 7 and the cradle 210 are not in the correct positional relationship, the position of the terminal device 7 is corrected by the guide members 213a and 213b. That is, when the locking holes 50a and 50b are deviated from the guide members 213a and 213b in the predetermined direction, the guide members 213a and 213b contact the outer cover 50 around the locking holes 50a and 50b. In response, the terminal device 7 slides in the predetermined direction by the rotation of the guide members 213a and 213b. In the present embodiment, since the two guide members 213a and 213b are arranged side by side in the predetermined direction, the lower surface of the terminal device 7 can contact only the guide members 213a and 213b, and the terminal device 7 can be moved smoothly. Further, when an inclination (a concave inclination) is provided around the locking holes 50a and 50b, the terminal device 7 can be moved even more smoothly. As a result of the sliding movement of the terminal device 7, the guide members 213a and 213b are inserted into the locking holes 50a and 50b. Thereby, the charging terminal 212 of the cradle 210 comes into contact with the charging terminal 66 of the terminal device 7, and charging is performed reliably. As described above, the user can easily connect the terminal device 7 to the cradle 210 even if the terminal device 7 is not placed in the correct position. According to the present embodiment, the positioning of the terminal device 7 with respect to the cradle 210 can be performed by the simple configuration of the locking holes of the terminal device 7 and the guide members of the cradle 210, so the cradle 210 can be made compact and simple. In the present embodiment, the terminal device 7 is a relatively large transportable device.
However, even for such a large transportable device, the cradle 210 itself can have a small configuration as shown in Fig. 20. Further, since the cradle 210 can be connected to terminal devices of various shapes and sizes, it can serve as a support device of high versatility. Further, in the present embodiment, the locking holes 50a and 50b are used both as holes for locking the claw portions of the attachment device and as objects into which the guide members are inserted. Therefore, the number of holes provided in the outer cover 50 of the terminal device 7 can be reduced, and the shape of the outer cover 50 can be simplified. In the above-described embodiment, the holes into which the guide members of the cradle 210 are inserted are holes (the locking holes 50a and 50b) provided in the lower side surface of the outer cover 50, but the holes may be at any position. For example, holes may be provided in other side surfaces of the outer cover 50, or in the front or back surface of the outer cover 50. Since the guide members must be arranged at positions corresponding to the positions of the holes, when the holes are provided in the front or back surface of the outer cover 50, the guide members of the cradle 210 may be arranged, for example, at the position of the wall portion 211a. Further, holes may be provided in a plurality of surfaces of the outer cover 50; in that case, the terminal device 7 can be placed on the cradle 210 in various orientations.
[5. Game Processing]
Next, the details of the game processing executed in the game system will be described. First, the various data used in the game processing will be described. Fig. 21 is a diagram showing the various data used in the game processing, that is, the main data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game device 3. As shown in Fig.
21, the game program 90, the received data 91, and the processing data 106 are stored in the main memory of the game device 3. In addition to the data shown in Fig. 21, data required for the game, such as image data of various objects appearing in the game and sound data used in the game, are also stored in the main memory.

The game program 90 is read from the optical disc 4 at an appropriate timing after power is turned on to the game device 3, and is stored in the main memory. The game program 90 may also be acquired from the flash memory, or from outside the game device 3 (for example, via the Internet), instead of being read from the optical disc 4. Further, a part of the game program 90 (for example, a program for calculating the posture of the controller 5 and/or the terminal device 7) may be stored in advance in the game device 3. The received data 91 is the various data received from the controller 5 and the terminal device 7. The received data 91 includes the controller operation data 92, the terminal operation data 97, the camera image data 104, and the microphone sound data 105. When a plurality of controllers 5 are connected, there are a plurality of controller operation data 92; when a plurality of terminal devices 7 are connected, there are also a plurality of terminal operation data 97, camera image data 104, and microphone sound data 105. The controller operation data 92 is data representing the operation performed by the user (player) on the controller 5. The controller operation data 92 is transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 includes the first operation key data 93, the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. A predetermined number of controller operation data may be stored in the main memory in order from the latest (last acquired). The first operation key data 93 is data representing the input states of the operation keys 32a to 32i provided on the controller 5; specifically, the first operation key data 93 indicates whether each of the operation keys 32a to 32i is pressed.
The first acceleration data 94 is data representing the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 represents three-dimensional acceleration whose components are the accelerations in the three axial directions of XYZ shown in Fig. 3; in other embodiments, however, it may represent acceleration in any one or more directions. The first angular velocity data 95 is data representing the angular velocity detected by the gyro sensor 48 of the controller 5. Here, the first angular velocity data 95 represents the angular velocities about the three axes of XYZ shown in Fig. 3; in other embodiments, it may represent the angular velocity about any one or more axes. The marker coordinate data 96 is data representing the coordinates calculated by the image processing circuit 41 of the imaging information computing unit 35, that is, the above-mentioned marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system for representing a position on a plane corresponding to the captured image, and the marker coordinate data 96 represents coordinate values in that two-dimensional coordinate system. The controller operation data 92 may be any data representing the operation of the user who operates the controller 5, and may include only a part of the above-described data 93 to 96. Further, when the controller 5 has other input means (for example, a touch panel or an analog stick), the controller operation data 92 may include data representing operations on those other input means. When the movement of the controller 5 itself is used as a game operation, as in the present embodiment, the controller operation data 92 includes data, such as the first acceleration data 94, the first angular velocity data 95, or the marker coordinate data 96, whose values change in accordance with the movement of the controller 5 itself.
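The layout of the controller operation data 92 described above can be sketched as a simple record type. This is an illustrative model only; the field names, types, and units are assumptions, since the patent does not specify a concrete data layout.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Illustrative sketch of the controller operation data 92 (names, types,
# and units are hypothetical; the patent specifies only the content).
@dataclass
class ControllerOperationData:
    # First operation key data 93: pressed state of operation keys 32a-32i.
    keys_pressed: Tuple[bool, ...] = (False,) * 9
    # First acceleration data 94: XYZ acceleration vector (G units assumed).
    acceleration: Tuple[float, float, float] = (0.0, 0.0, 1.0)
    # First angular velocity data 95: angular velocity about XYZ (rad/s).
    angular_velocity: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    # Marker coordinate data 96: 2-D marker positions in the captured
    # image, or None when the markers are not imaged.
    marker_coords: Optional[Tuple[Tuple[float, float], ...]] = None

    def is_pressed(self, index: int) -> bool:
        """Return whether operation key `index` (0 = key 32a) is pressed."""
        return self.keys_pressed[index]

# Example: key 32a pressed, both markers imaged.
sample = ControllerOperationData(
    keys_pressed=(True,) + (False,) * 8,
    marker_coords=((120.0, 80.0), (200.0, 82.0)),
)
```

As the text notes, a real implementation could omit any of these fields depending on which input means the game actually uses.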

The terminal operation data 97 is data representing the user's operation on the terminal device 7. The terminal operation data 97 is transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 includes the second operation key data 98, the joystick data 99, the touch position data 100, the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103. A predetermined number of terminal operation data may be stored in the main memory in order from the latest (last acquired). The second operation key data 98 is data representing the input states of the operation keys 54A to 54L provided on the terminal device 7; specifically, the second operation key data 98 indicates whether each of the operation keys 54A to 54L is pressed. The joystick data 99 is data representing the direction and amount by which the stick portion of each analog stick 53 (the analog stick 53A and the analog stick 53B) is slid (or tilted); the direction and amount may be represented, for example, as two-dimensional coordinates or a two-dimensional vector. The touch position data 100 is data representing the position (touch position) of an input made on the input surface of the touch panel 52. In the present embodiment, the touch position data 100 represents coordinate values in a two-dimensional coordinate system for representing a position on the input surface. When the touch panel 52 is of a multi-touch type, the touch position data 100 may represent a plurality of touch positions. The second acceleration data 101 is data representing the acceleration (acceleration vector) detected by the acceleration sensor 73. In the present embodiment, the second acceleration data 101 represents three-dimensional acceleration whose components are the accelerations in the three axial directions of xyz shown in Fig. 8.
In other embodiments, however, it may represent acceleration in any one or more directions. The second angular velocity data 102 is data representing the angular velocity detected by the gyro sensor 74. In the present embodiment, the second angular velocity data 102 represents the angular velocities about the three axes of xyz shown in Fig. 8; in other embodiments, it may represent the angular velocity about any one or more axes. The azimuth data 103 is data representing the azimuth detected by the magnetic sensor 72. In the present embodiment, the azimuth data 103 represents the direction of a predetermined azimuth (for example, north) with respect to the terminal device 7. In a place where a magnetic field other than geomagnetism is generated, the azimuth data 103 does not, strictly speaking, represent the absolute azimuth (north or the like); however, since it represents the direction of the magnetic field at that place relative to the terminal device 7, even in such a case the change in the posture of the terminal device 7 can be calculated. The terminal operation data 97 may be any data representing the operation of the user who operates the terminal device 7, and may include only a part of the above-described data 98 to 103. Further, when the terminal device 7 has other input means (for example, a touch pad, or imaging means like that of the controller 5), the terminal operation data 97 may include data representing operations on those other input means. When the movement of the terminal device 7 itself is used as a game operation, as in the present embodiment, the terminal operation data 97 includes data, such as the second acceleration data 101, the second angular velocity data 102, or the azimuth data 103, whose values change in accordance with the movement of the terminal device 7 itself.
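The text states that the main memory keeps a predetermined number of operation data, ordered from the latest. A bounded double-ended queue is a natural model of this behavior; the capacity of 4 below is an arbitrary illustrative choice, not a value from the patent.

```python
from collections import deque

# The main memory keeps only a predetermined number of operation data,
# ordered from the latest; a bounded deque models this (the capacity 4
# is an illustrative assumption).
MAX_STORED = 4
terminal_op_buffer = deque(maxlen=MAX_STORED)

def store_operation_data(buffer, data):
    """Store newly received operation data at the front (index 0 is the
    latest), silently discarding the oldest entry once the predetermined
    number is exceeded."""
    buffer.appendleft(data)

# Simulate six received frames of terminal operation data.
for frame in range(6):
    store_operation_data(terminal_op_buffer, {"frame": frame})

latest = terminal_op_buffer[0]  # the data the CPU reads in step S3
```

With this scheme the read in step S3 ("read the latest terminal operation data") is simply an access to index 0, and the two oldest frames have been dropped.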
The camera image data 104 is data representing an image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 is image data obtained by the codec LSI 27 decompressing the compressed image data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a. A predetermined number of camera image data may be stored in the main memory in order from the latest (last acquired). The microphone sound data 105 is data representing the sound (microphone sound) detected by the microphone 79 of the terminal device 7. The microphone sound data 105 is sound data obtained by the codec LSI 27 decompressing the compressed sound data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a. The processing data 106 is data used in the game processing (Fig. 22) described later. The processing data 106 includes the control data 107, the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the sound recognition data 111. In addition to the data shown in Fig. 21, the processing data 106 also includes various data used in the game processing, such as data representing various parameters set for various objects appearing in the game. The control data 107 is data representing control instructions to the constituent elements included in the terminal device 7. The control data 107 represents, for example, an instruction to control the lighting of the marker portion 55 or an instruction to control imaging by the camera 56. The control data 107 is transmitted to the terminal device 7 at an appropriate timing. The controller posture data 108 is data representing the posture of the controller 5. In the present embodiment, the controller posture data 108 is calculated based on the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92.
The method of calculating the controller posture data 108 will be described in step S23. The terminal posture data 109 is data representing the posture of the terminal device 7. In the present embodiment, the terminal posture data 109 is calculated based on the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103 included in the terminal operation data 97. The method of calculating the terminal posture data 109 will be described in step S24. The image recognition data 110 is data representing the result of performing a predetermined image recognition process on the camera image. The image recognition process may be any process as long as it detects some feature from the camera image and outputs the result; for example, it may be a process of extracting a predetermined object (for example, the user's face or a marker) from the camera image and calculating information on the extracted object. The sound recognition data 111 is data representing the result of performing a predetermined sound recognition process on the microphone sound. The sound recognition process may be any process as long as it detects some feature from the microphone sound and outputs the result; for example, it may be a process of detecting the user's speech, or a process of simply outputting the volume. Next, the details of the game processing performed in the game device 3 will be described with reference to Fig. 22. Fig. 22 is a main flowchart showing the flow of the processing executed in the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 executes a start-up program stored in a boot ROM (not shown), thereby initializing each unit such as the main memory. Then, the game program stored on the optical disc 4 is read into the main memory, and execution of the game program is started by the CPU 10.
In the game device 3, the game program stored on the optical disc 4 may be executed immediately after power-on, or a built-in program that displays a predetermined menu screen may be executed first after power-on, with the game program stored on the optical disc 4 executed afterwards, when the start of the game is instructed by the user. The flowchart shown in Fig. 22 shows the processing performed after the above processing is completed. The processing of each step of the flowchart shown in Fig. 22 is merely an example, and the processing order of the steps may be changed as long as the same result is obtained. The variable values and the threshold values used in the determination steps are also merely examples, and other values may be used as necessary. Further, in the present embodiment, the processing of each step of the above flowchart is described as being executed by the CPU 10, but the processing of some of the steps may be executed by a processor or a dedicated circuit other than the CPU 10. First, in step S1, the CPU 10 executes initial processing. The initial processing, for example, constructs a virtual game space, arranges the objects appearing in the game at their initial positions, and sets the initial values of the various parameters used in the game processing. Further, in the present embodiment, in the initial processing, the CPU 10 controls the lighting of the marker device 6 and the marker portion 55 in accordance with the type of the game program. Here, the game system 1 includes both the marker device 6 and the marker portion 55 of the terminal device 7 as imaging targets of the imaging means (the imaging information computing unit 35) of the controller 5. Either or both of the marker device 6 and the marker portion 55 are used depending on the content of the game (the type of the game program).
The game program 90 includes data indicating whether each of the marker device 6 and the marker portion 55 is to be lit, and the CPU 10 reads this data and determines whether or not to perform lighting. When the marker device 6 and/or the marker portion 55 is to be lit, the following processing is executed. That is, when the marker device 6 is to be lit, the CPU 10 transmits to the marker device 6 a control signal for lighting each infrared LED of the marker device 6; the transmission of this control signal may simply be the supply of power. In response, each infrared LED of the marker device 6 is lit. On the other hand, when the marker portion 55 is to be lit, the CPU 10 generates control data representing an instruction to light the marker portion 55, and stores it in the main memory. The generated control data is transmitted to the terminal device 7 in step S10 described later. The control data received by the wireless module of the terminal device 7 is sent to the UI controller 75 via the codec LSI, and the UI controller 75 instructs the marker portion 55 to light. Thereby, the infrared LEDs of the marker portion 55 are lit. Although the lighting of the marker device 6 and the marker portion 55 has been described above, turning off the marker device 6 and the marker portion 55 can be performed by processing similar to that for lighting. After the above step S1, the processing of step S2 is executed. Hereinafter, a processing loop consisting of the series of processes of steps S2 to S11 is repeatedly executed at a rate of once per predetermined time (one frame time). In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5.
Since the controller 5 repeatedly transmits the controller operation data to the game device 3, the game device 3 successively receives the controller operation data, and the received controller operation data is successively stored in the main memory by the input/output processor 11a. The interval of transmission and reception is preferably shorter than the processing time of the game. In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory, and the processing of step S3 is then executed. In step S3, the CPU 10 acquires the various data transmitted from the terminal device 7.

The terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, so the game device 3 successively receives these data. In the game device 3, the terminal communication module successively receives these data, and the codec LSI 27 successively decompresses the camera image data and the microphone sound data. The input/output processor 11a then successively stores the terminal operation data, the camera image data, and the microphone sound data in the main memory. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory, and the processing of step S4 is then executed. In step S4, the CPU 10 executes the game control processing. The game control processing advances the game by performing processing such as moving objects in the game space in accordance with the user's game operation. In the present embodiment, various games can be played using the controller 5 and/or the terminal device 7. The game control processing will be described below with reference to Fig. 23. Fig. 23 is a flowchart showing the detailed flow of the game control processing. In Fig. 23, the series of processes is shown for the case where both the controller 5 and the terminal device 7 are used as operating devices; however, it is not necessary to perform all of the illustrated processes, and depending on the type or content of the game, only a part of the processes may be executed. In the game control processing, first, in step S21, the CPU 10 determines whether or not to change the marker to be used. As described above, in the present embodiment, at the start of the game processing (step S1), the processing of controlling the lighting of the marker device 6 and the marker portion 55 is performed. Here, depending on the game, the object to be used (lit) among the marker device 6 and the marker portion 55 may be changed in the middle of the game.
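The per-frame flow of steps S2 to S4 described above can be sketched as a loop skeleton. The step functions below are placeholders for the processing the flowchart describes (the names and the 60 fps rate are assumptions); only the loop structure and the "read the latest stored data" pattern are illustrated.

```python
# Skeleton of the processing loop of steps S2-S11, executed once per
# frame time. FRAME_TIME and the step bodies are illustrative
# assumptions, not values taken from the patent.
FRAME_TIME = 1 / 60  # one frame time (assumed 60 fps)

def run_processing_loop(frames, controller_queue, terminal_queue):
    """Run `frames` iterations; each iteration reads the latest received
    data (steps S2 and S3) and runs game control processing (step S4)."""
    log = []
    for _ in range(frames):
        # Step S2: read the latest controller operation data.
        controller_data = controller_queue[-1] if controller_queue else None
        # Step S3: read the latest terminal operation data.
        terminal_data = terminal_queue[-1] if terminal_queue else None
        # Step S4: game control processing (placeholder: just record input).
        log.append((controller_data, terminal_data))
        # Steps S5-S11 (image generation, output, etc.) would follow here.
    return log

history = run_processing_loop(3, ["c0", "c1"], ["t0"])
```

Because reception happens asynchronously and more often than the loop runs, each iteration simply reads whatever data is newest at that moment.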
Further, depending on the game, although using both the marker device 6 and the marker portion 55 is conceivable, when both are lit there is a possibility that the imaging means erroneously detects one marker as the other. Therefore, during the game, it may be preferable to switch the lighting so that only one of them is lit at a time. The processing of step S21 determines, in consideration of this, whether or not the object to be lit needs to be changed in the middle of the game. The determination of step S21 can be made, for example, by the following methods. That is, the CPU 10 can make the determination according to whether the game situation (the stage of the game, the operation target, and the like) has changed. This is because when the game situation changes, the operation method may be switched between operating the controller 5 pointed toward the marker device 6 and operating the controller 5 pointed toward the marker portion 55. Further, the CPU 10 can make the determination based on the posture of the controller 5. That is, the determination can be made by judging whether the controller 5 is facing the marker device 6 or the marker portion 55. The posture of the controller 5 can be calculated, for example, based on the detection results of the acceleration sensor 37 and the gyro sensor 48 (see step S23 described later). Further, the CPU 10 can also make the determination by judging whether there is a change instruction from the user. When the result of the determination in step S21 is affirmative, the processing of step S22 is executed. On the other hand, when the result of the determination in step S21 is negative, the processing of step S22 is skipped and the processing of step S23 is executed.
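One of the determination methods above judges which marker the controller 5 is facing. A dot-product comparison between the controller's facing direction and the directions toward the two markers is one plausible realization; this is an assumption for illustration, not the patent's algorithm, and all vectors below are hypothetical.

```python
# One plausible realization (an assumption, not the patent's method) of
# the step-S21 posture-based determination: pick whichever marker the
# controller's facing vector is better aligned with.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def select_marker(facing, to_marker_device, to_marker_portion):
    """Return 'marker_device' (device 6, on the TV) or 'marker_portion'
    (portion 55, on the terminal device 7), depending on which direction
    the controller's facing vector points at more directly."""
    if dot(facing, to_marker_device) >= dot(facing, to_marker_portion):
        return "marker_device"
    return "marker_portion"

# Controller pointing almost straight at the TV (marker device 6).
choice = select_marker((0.0, 0.0, -1.0), (0.1, 0.0, -1.0), (0.0, -0.7, 0.7))
```

A real implementation would first derive the facing vector from the posture computed in step S23 and would likely add hysteresis so the lit marker does not flicker between the two.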
In step S22, the CPU 10 controls the lighting of the marker device 6 and the marker portion 55. That is, the lighting state of the marker device 6 and/or the marker portion 55 is changed. The specific processing of lighting or turning off the marker device 6 and/or the marker portion 55 can be performed in the same manner as in step S1. The processing of step S23 is executed after step S22. As described above, according to the present embodiment, the light emission (lighting) of the marker device 6 and the marker portion 55 can be controlled in accordance with the type of the game program by the processing of step S1, and in accordance with the game situation by the processing of steps S21 and S22. In step S23, the CPU 10 calculates the posture of the controller 5. In the present embodiment, the posture of the controller 5 is calculated based on the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. The method of calculating the posture of the controller 5 will be described below. First, the CPU 10 calculates the posture of the controller 5 based on the first angular velocity data 95 stored in the main memory. The method of calculating the posture of the controller 5 from the angular velocity may be any method that uses the previous posture (the posture calculated previously) and the current angular velocity (the angular velocity acquired in step S2 of the current processing loop). Specifically, the CPU 10 calculates the posture by rotating the previous posture by the current angular velocity for one unit time. The previous posture is represented by the controller posture data 108 stored in the main memory, and the current angular velocity is represented by the first angular velocity data 95 stored in the main memory, so the CPU 10 reads these from the main memory and calculates the posture of the controller 5.
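The step "rotate the previous posture by the current angular velocity for one unit time" can be sketched with a unit-quaternion representation of the posture; this is a minimal illustration of the technique, not the patent's implementation, and the step size and test rotation are arbitrary.

```python
import math

# Minimal sketch of integrating angular velocity into a posture: the
# posture is held as a unit quaternion (w, x, y, z) and advanced by an
# Euler step of q_dot = 0.5 * q (x) (0, omega), then renormalized.
def integrate_gyro(q, omega, dt):
    """Rotate posture quaternion `q` by angular velocity `omega`
    (rad/s, body axes) over time step `dt`."""
    w, x, y, z = q
    wx, wy, wz = omega
    dw = 0.5 * (-x * wx - y * wy - z * wz)
    dx = 0.5 * ( w * wx + y * wz - z * wy)
    dy = 0.5 * ( w * wy - x * wz + z * wx)
    dz = 0.5 * ( w * wz + x * wy - y * wx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    n = math.sqrt(w * w + x * x + y * y + z * z)  # renormalize to unit length
    return (w / n, x / n, y / n, z / n)

# Integrate a constant yaw rate of pi/2 rad/s for one second.
q = (1.0, 0.0, 0.0, 0.0)  # initial posture (identity)
for _ in range(1000):
    q = integrate_gyro(q, (0.0, 0.0, math.pi / 2), 0.001)
yaw = 2.0 * math.atan2(q[3], q[0])  # recovered rotation about Z
```

After one second of integration the recovered yaw is close to 90 degrees; as the text notes next, small per-step errors like the ones this Euler scheme makes are exactly why the angular-velocity posture must be corrected by other sensors.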
The "posture according to angular velocity" calculated as described above is stored in the main memory. When the posture of the controller 5 is calculated from the angular velocity, the initial posture must be determined first. That is, the CPU 10 first calculates the initial posture of the controller 5. The initial posture of the controller 5 may be calculated based on the acceleration data, or the player may be made to perform a predetermined operation while holding the controller 5 in a specific posture, with that specific posture used as the initial posture at the time the predetermined operation is performed. When the posture of the controller 5 is calculated as an absolute posture with respect to a predetermined direction in space, the initial posture should be calculated; when the posture of the controller 5 is calculated as a relative posture with respect to, for example, the posture of the controller 5 at the start of the game, the initial posture need not be calculated. Next, the CPU 10 corrects the posture of the controller 5 calculated based on the angular velocity, using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the posture of the controller 5 based on the first acceleration data 94. Here, in a state where the controller 5 is almost stationary, the acceleration applied to the controller 5 corresponds to the gravitational acceleration. Therefore, in this state, the direction of the gravitational acceleration (the direction of gravity) can be calculated using the first acceleration data 94 output from the acceleration sensor 37, and thus the orientation (posture) of the controller 5 with respect to the direction of gravity can be calculated from the first acceleration data 94. The "posture according to acceleration" calculated in this way is stored in the main memory. When the posture according to acceleration is calculated, the CPU 10 then corrects the posture according to angular velocity using the posture according to acceleration.
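The two pieces just described can be sketched in terms of tilt angles rather than full 3-D postures (a deliberate simplification): estimating the orientation relative to gravity from the measured acceleration, and pulling the angular-velocity-based estimate toward it at a predetermined ratio. The axis convention and the ratio value are assumptions.

```python
import math

# Sketch of the acceleration-based correction (a simplification to pitch
# and roll angles; axis convention and ratio are assumptions).
def tilt_from_acceleration(ax, ay, az):
    """Pitch and roll (rad) relative to the gravity direction, valid
    only while the device is almost stationary (measured acceleration
    is then approximately the gravitational acceleration)."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def correct_toward(gyro_angle, accel_angle, ratio=0.05):
    """Bring the angular-velocity-based angle closer to the
    acceleration-based angle at a predetermined fixed ratio."""
    return gyro_angle + ratio * (accel_angle - gyro_angle)

# Device held flat and stationary: the accelerometer reads +1 G along z
# (assumed axis convention), so the accel-based pitch and roll are zero.
pitch, roll = tilt_from_acceleration(0.0, 0.0, 1.0)
corrected = correct_toward(0.10, pitch)  # gyro estimate drifted to 0.10 rad
```

A small ratio keeps the gyro's responsiveness while slowly bleeding off its accumulated drift, matching the complementary correction the text describes; note, as the text also says, that no correction is possible for rotation about the gravity direction itself.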
Specifically, the CPU 10 reads the data representing the posture according to angular velocity and the data representing the posture according to acceleration from the main memory, and performs a correction that brings the posture according to the angular velocity data closer to the posture according to the acceleration data at a predetermined ratio. The predetermined ratio may be a predetermined fixed value, or may be set in accordance with, for example, the acceleration represented by the first acceleration data 94. Further, since the posture according to acceleration cannot be calculated for the rotation direction about the direction of gravity, the CPU 10 does not correct that rotation direction. In the present embodiment, the data representing the corrected posture obtained as described above is stored in the main memory. After the posture according to angular velocity is corrected as described above, the CPU 10 further corrects the posture using the marker coordinate data 96. First, the CPU 10 calculates the posture of the controller 5 based on the marker coordinate data 96 (the posture according to marker coordinates). Since the marker coordinate data 96 represents the positions of the markers 6R and 6L in the captured image, the posture of the controller 5 with respect to the roll direction (the rotation direction about the Z axis) can be calculated from these positions. That is, the posture of the controller 5 with respect to the roll direction can be calculated from the slope of the straight line connecting the position of the marker 6R and the position of the marker 6L in the captured image. Further, when the position of the controller 5 relative to the marker device 6 can be specified (for example, when it can be assumed that the controller 5 is located in front of the marker device 6), the posture of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the position of the marker device 6 in the captured image.
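The roll calculation just described (the slope of the line connecting the two imaged markers) reduces to a single arctangent. The pixel coordinates below are illustrative, not values from the patent.

```python
import math

# Roll posture from the imaged positions of markers 6L and 6R, as
# described above: the angle of the line connecting the two marker
# images in the captured image. Coordinates are illustrative pixels.
def roll_from_markers(left, right):
    """Roll angle (rad) of the controller; 0 when the marker line is
    horizontal in the captured image."""
    dx = right[0] - left[0]
    dy = right[1] - left[1]
    return math.atan2(dy, dx)

level = roll_from_markers((100.0, 120.0), (220.0, 120.0))   # horizontal
tilted = roll_from_markers((100.0, 100.0), (200.0, 200.0))  # 45-degree line
```

Because both markers must be visible for this to work, the correction is skipped whenever the imaging element cannot see the marker device 6 or the marker portion 55, as the text notes next.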
For example, when the positions of the markers 6R and 6L in the captured image move to the left, it can be determined that the controller 5 has changed its orientation (posture) to the right. In this way, the posture of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the positions of the markers 6R and 6L. As described above, the posture of the controller 5 can be calculated based on the marker coordinate data 96. When the posture according to marker coordinates is calculated, the CPU 10 then corrects the corrected posture (the posture corrected by the posture according to acceleration) using the posture according to marker coordinates. That is, the CPU 10 performs a correction that brings the corrected posture closer to the posture according to marker coordinates at a predetermined ratio. The predetermined ratio may be a predetermined fixed value. Further, the correction by the posture according to marker coordinates may be performed for only one or two of the roll direction, the pitch direction, and the yaw direction. For example, when the marker coordinate data 96 is used, the posture can be calculated accurately for the yaw direction, so the CPU 10 may correct only the yaw direction using the posture according to the marker coordinate data 96. Further, when the marker device 6 or the marker portion 55 is not imaged by the image pickup element 40 of the controller 5, the posture according to the marker coordinate data 96 cannot be calculated, so in that case the correction processing using the marker coordinate data 96 may be omitted. Based on the above, the CPU 10 corrects the posture of the controller 5 calculated based on the first angular velocity data 95, using the first acceleration data 94 and the marker coordinate data 96.
Here, among the methods of calculating the posture of the controller 5, the method using angular velocity can calculate the posture no matter how the controller 5 is moving. On the other hand, since the method using angular velocity calculates the posture by cumulatively adding the successively detected angular velocities, there is a concern that accuracy deteriorates due to the accumulation of errors, or that the accuracy of the gyro sensor deteriorates due to the so-called temperature drift problem. The method using acceleration does not accumulate errors, but in a state where the controller 5 is being moved violently, the posture cannot be calculated accurately (because the direction of gravity cannot be detected accurately). The method using marker coordinates can calculate the posture accurately (particularly for the yaw direction), but cannot calculate the posture when the marker portion 55 cannot be imaged. In contrast, according to the present embodiment, since the three methods with different characteristics are used as described above, the posture of the controller 5 can be calculated accurately. In other embodiments, the posture may be calculated using one or two of the above three methods. Further, when the lighting control of the markers is performed in the processing of step S1 or step S22, the CPU 10 preferably calculates the posture of the controller 5 using at least the marker coordinates. The processing of step S24 is executed after step S23. In step S24, the CPU 10 calculates the posture of the terminal device 7. That is, since the terminal operation data 97 acquired from the terminal device 7 includes the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103, the CPU 10 can calculate the posture of the terminal device 7 based on these data.
Here, the CPU 10 can know the amount of rotation (the amount of change in posture) per unit time of the terminal device 7 from the second angular velocity data 102. Further, in a state where the terminal device 7 is almost stationary, the acceleration applied to the terminal device 7 is the gravitational acceleration, so the direction of gravity applied to the terminal device 7 (that is, the posture of the terminal device 7 with the direction of gravity as a reference) can be known from the second acceleration data 101. Further, the orientation data 103 makes it possible to know a predetermined orientation as seen from the terminal device 7 (that is, the posture of the terminal device 7 with the predetermined orientation as a reference). Even when a magnetic field other than geomagnetism is generated, the amount of rotation of the terminal device 7 can still be known. Therefore, the CPU 10 can calculate the posture of the terminal device 7 based on the second acceleration data 101, the second angular velocity data 102, and the orientation data 103. In the present embodiment, the posture of the terminal device 7 is calculated based on the above three types of data; in other embodiments, however, the posture may be calculated based on any one or two of the three. The specific method of calculating the posture of the terminal device 7 may be any method. For example, the posture calculated from the angular velocity shown by the second angular velocity data 102 may be corrected using the second acceleration data 101 and the orientation data 103. Specifically, the CPU 10 first calculates the posture of the terminal device 7 based on the second angular velocity data 102. The method of calculating the posture based on the angular velocity may be the same as the method in step S23 described above.
Next, at an appropriate timing (for example, when the terminal device 7 is close to a stationary state), the CPU 10 corrects the posture calculated from the angular velocity using the posture calculated from the second acceleration data 101 and/or the posture calculated from the orientation data 103. The method of correcting the posture based on the angular velocity using the posture based on the acceleration may be the same as in the calculation of the posture of the controller 5 described above. Further, when correcting the posture based on the angular velocity using the posture based on the orientation data, the CPU 10 may likewise bring the posture based on the angular velocity closer to the posture based on the orientation data at a predetermined ratio. From the above, the CPU 10 can calculate the posture of the terminal device 7 accurately. Since the controller 5 is provided with the imaging information calculation unit 35 as infrared detecting means, the game device 3 can acquire the marker coordinate data 96. Therefore, with respect to the controller 5, the game device 3 can know from the marker coordinate data 96 the absolute posture in the actual space (the posture of the controller 5 in the coordinate system set in the actual space). On the other hand, the terminal device 7 does not have infrared detecting means such as the imaging information calculation unit 35. Therefore, from the second acceleration data 101 and the second angular velocity data 102 alone, the game device 3 cannot know the absolute posture in the real space with respect to the direction of rotation about the direction of gravity. In the present embodiment, therefore, the terminal device 7 is configured to include the magnetic sensor 72 so that the game device 3 can acquire the orientation data 103.
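As a sketch of how the three data sources complement one another: for a near-stationary device, pitch and roll can be recovered from the gravity vector in the acceleration data, while rotation about the gravity direction (yaw) needs the magnetic sensor. The axis conventions and sign choices below are assumptions made for illustration, not the patent's actual formulas.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (radians) from a near-static accelerometer reading,
    i.e. when the measured acceleration is essentially gravity."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

def yaw_from_mag(mx, my, mz, pitch, roll):
    """Heading about the gravity axis from a 3-axis magnetometer reading,
    de-rotated by the tilt so the device need not be held level."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)

# Device lying flat (gravity on +z), magnetic north along +x:
pitch, roll = tilt_from_accel(0.0, 0.0, 9.8)
yaw = yaw_from_mag(1.0, 0.0, 0.0, pitch, roll)
```

The accelerometer alone leaves yaw unobservable, which is exactly why the orientation data 103 is needed for the absolute posture about the gravity direction.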
According to this, the game device 3 can calculate from the orientation data 103 the absolute posture in the real space with respect to the direction of rotation about the direction of gravity, and can therefore calculate the posture of the terminal device 7 more accurately. In the specific processing of step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 from the main memory, and calculates the posture of the terminal device 7 based on these data. Data showing the calculated posture of the terminal device 7 is stored in the main memory as the terminal posture data 109. The processing of step S25 is executed after step S24. In step S25, the CPU 10 performs recognition processing on the camera image. That is, the CPU 10 performs predetermined recognition processing on the camera image data 104. The recognition processing may be of any kind as long as some feature is detected from the camera image and its result is output. For example, when the face of the player is included in the camera image, it may be processing of recognizing the face; specifically, it may be processing of detecting a part of the face (the eyes, nose, mouth, etc.) or processing of detecting the expression of the face. Data showing the result of the recognition processing is stored in the main memory as the image recognition data 110. The processing of step S26 is executed after step S25. In step S26, the CPU 10 performs recognition processing on the microphone sound. That is, the CPU 10 performs predetermined recognition processing on the microphone sound data 105. This recognition processing may be of any kind as long as some feature is detected from the microphone sound and its result is output. For example, it may be processing of detecting an instruction of the player from the microphone sound, or processing of simply detecting the volume of the microphone sound.
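The simplest microphone recognition the text names, detecting the volume, can be sketched as an RMS level check over a block of PCM samples. The threshold value and function names are illustrative assumptions.

```python
import math

def rms_level(samples):
    """Root-mean-square level of a block of PCM samples."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def volume_detected(samples, threshold=3000.0):
    """True when the microphone sound exceeds a loudness threshold
    (for example, the player speaking into the microphone 79)."""
    return rms_level(samples) >= threshold

loud = volume_detected([4000, -4000] * 50)    # sustained loud input
quiet = volume_detected([10, -12] * 50)       # background noise only
```

A result like `loud`/`quiet` is what would be stored as the sound recognition data 111 and consumed as a game input in step S27.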
Further, data showing the result of the recognition processing is stored in the main memory as the sound recognition data 111. The processing of step S27 is executed after step S26. In step S27, the CPU 10 executes game processing responsive to the game input. Here, the game input may be any data transmitted from the controller 5 or the terminal device 7, or data obtained from such data. Specifically, in addition to the data included in the controller operation data 92 and the terminal operation data 97, the game input may be data obtained from that data (the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the sound recognition data 111). The content of the game processing in step S27 may be anything: for example, processing of moving an object (character) appearing in the game, processing of controlling a virtual camera, or processing of moving a cursor displayed on the screen. Processing that uses the camera image (or a part of it) as a game image, or processing that uses the microphone sound as a game sound, is also possible. Examples of the above game processing will be described later. In step S27, data showing the result of the game control processing, such as data of various parameters set for the objects (characters) appearing in the game, data of parameters relating to the virtual cameras arranged in the game space, and score data, is stored in the main memory. After step S27, the CPU 10 ends the game control processing of step S4. Returning to the description of Fig. 22, in step S5, the television game image for display on the television 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read data showing the result of the game control processing of step S4 from the main memory, read data necessary for generating the game image from the VRAM 11d, and generate the game image.
The game image may be generated by any method as long as it shows the result of the game control processing of step S4. For example, the method of generating the game image may be a method of generating a 3-dimensional CG image by arranging a virtual camera in the virtual game space and calculating the game space viewed from the virtual camera, or a method of generating a 2-dimensional image (without using a virtual camera). The generated television game image is stored in the VRAM 11d. The processing of step S6 is executed after step S5. In step S6, the terminal game image for display on the terminal device 7 is generated by the CPU 10 and the GPU 11b. The terminal game image, like the television game image described above, may be generated by any method as long as it shows the result of the game control processing of step S4. The terminal game image may be generated by the same method as the television game image, or by a different method. The generated terminal game image is stored in the VRAM 11d. Depending on the content of the game, the television game image and the terminal game image may be the same; in that case, the processing of generating a game image need not be executed in step S6. The processing of step S7 is executed after step S6. In step S7, the television game sound for output to the speaker 2a of the television 2 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound responsive to the result of the game control processing of step S4. The generated game sound may be, for example, a sound effect of the game, the voice of a character appearing in the game, or BGM. The processing of step S8 is executed after step S7. In step S8, the terminal game sound for output to the terminal device 7 is generated.
That is, the CPU 10 causes the DSP 11c to generate a game sound responsive to the result of the game control processing of step S4. The terminal game sound may be the same as or different from the television game sound. They may also differ only in part, for example by having different sound effects but the same BGM. When the television game sound is the same as the terminal game sound, the processing of generating a game sound need not be executed in step S8. The processing of step S9 is executed after step S8. In step S9, the CPU 10 outputs the game image and the game sound to the television 2. Specifically, the CPU 10 sends the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. In response, the AV-IC 15 outputs the image and sound data to the television 2 via the AV connector 16. Thereby, the television game image is displayed on the television 2, and the television game sound is output from the speaker 2a. The processing of step S10 is executed after step S9. In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the image data of the terminal game image stored in the VRAM 11d and the sound data generated by the DSP 11c in step S8 are sent by the CPU 10 to the codec LSI 27, and predetermined compression processing is performed by the codec LSI 27. Further, the compressed image and sound data are transmitted to the terminal device 7 by the terminal communication module via the antenna 29. The terminal device 7 receives the image and sound data transmitted from the game device 3 with the wireless module 80, and predetermined decompression processing is performed by the codec LSI 76.
The decompressed image data is output to the LCD 51, and the decompressed sound data is output to the sound IC 78. Thereby, the terminal game image is displayed on the LCD 51, and the terminal game sound is output from the speaker 77. The processing of step S11 is executed after step S10. In step S11, the CPU 10 determines whether or not to end the game. The determination in step S11 is made, for example, based on whether the game is over or the user has given an instruction to suspend the game. When the determination in step S11 is negative, the processing of step S2 is executed again. On the other hand, when the determination in step S11 is affirmative, the CPU 10 ends the game processing shown in Fig. 22. Thereafter, the series of processing of steps S2 to S11 is repeatedly executed until it is determined in step S11 that the game is to end. As described above, in the present embodiment, the terminal device 7 includes the touch panel 52 and an inertial sensor such as the acceleration sensor 73 or the gyro sensor 74, and the outputs of the touch panel 52 and the inertial sensor are transmitted to the game device 3 as operation data and used as game inputs (steps S3, S4). Further, the terminal device 7 is provided with a display device (the LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6, S10). Therefore, the user can perform an operation of directly touching the game image with the touch panel 52, and (because the motion of the terminal device 7 can be detected by the inertial sensor) an operation of moving the LCD 51 itself on which the game image is displayed. The user can thus play with an operation feel of directly operating the game image, so that games with a novel operation feel, such as the first and second game examples described later, can be provided.
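The per-frame order of steps S2 to S11 can be summarized as a loop skeleton. Every name below is a hypothetical stand-in used only to show the ordering of the steps, not the patent's actual API.

```python
class FrameLoop:
    """Orders one frame of processing as in steps S2-S11."""

    def __init__(self, frames_to_run):
        self.frames_left = frames_to_run
        self.log = []

    def step(self):
        self.log.append("acquire")   # S2-S3: controller / terminal operation data
        self.log.append("control")   # S4: game control (postures, recognition, logic)
        self.log.append("generate")  # S5-S8: TV and terminal images and sounds
        self.log.append("output")    # S9-S10: output to the TV, transmit to the terminal
        self.frames_left -= 1        # S11: end-of-game determination (stubbed as a countdown)
        return self.frames_left > 0

    def run(self):
        while self.step():           # repeat S2-S11 until the game ends
            pass

loop = FrameLoop(3)
loop.run()
```

The point of the skeleton is only that input acquisition, game control, image/sound generation, and output happen in that fixed order every frame, with the end-of-game check closing the loop.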
In addition, in the present embodiment, the terminal device 7 is provided with the operation keys 54, which can be operated while the terminal device 7 is held, and operations performed on the operation keys 54 are also transmitted to the game device 3 as operation data and used as game inputs (steps S3, S4). Therefore, even detailed game instructions can be given while directly operating the game image as described above. Further, in the present embodiment, the camera 56 and the microphone 79 are mounted on the terminal device 7, and the camera image data produced by the camera 56 and the microphone sound data produced by the microphone 79 are transmitted to the game device 3 (step S3). Therefore, the game device 3 can use the camera image and/or the microphone sound as game inputs, so that the user can also perform game operations by shooting an image with the camera 56 or by inputting a sound to the microphone 79. Since these operations can be performed while holding the terminal device 7, that is, while directly operating the game image, the user can perform even more diverse game operations. Further, since the game image is displayed on the portable terminal device 7, the user can place the terminal device 7 at a free position and point the controller 5 in a free direction, which improves the degree of freedom of operation of the controller 5. For example, as in the fifth game example described later, the terminal device 7 can be placed at a position suited to the content of the game, and the game can be played using the terminal device 7 so placed. Further, since the game device 3 acquires operation data and the like from both the controller 5 and the terminal device 7 (steps S2, S3), the user can use the two devices, the controller 5 and the terminal device 7, as operation means.
Therefore, in the game system 1, it is possible for a plurality of users to play a game with a plurality of devices, each user using one device, and it is also possible for one user to play a game using the two devices. In addition, according to the present embodiment, the game device 3 generates two kinds of game images (steps S5, S6), and the game images can be displayed on the television 2 and the terminal device 7 (steps S9, S10). Thus, by displaying the two kinds of game images on different devices, a game image that is easier for the user to view can be provided, and operability can be improved. For example, when two people play a game, as in the fourth game example described later, a game image of a viewpoint that is easy for one user to view can be displayed on the television 2, while a game image of a viewpoint that is easy for the other user to view is displayed on the terminal device 7, so that each player can play with an easy-to-view viewpoint. Further, even when one person plays a game, two kinds of game images from different viewpoints can be displayed, so that the player can grasp the state of the game space more easily. [6. Game examples] Next, specific examples of games played in the game system 1 will be described. The game examples described below may not use all of the configurations of the devices in the game system 1, and the game device 3 may not execute part of the series of processing shown in Figs. 22 and 23; that is, the game system 1 need not have all of the configurations described above in order to execute these games. (First game example) The first game example is a game in which an object (a dart) is shot in the game space by operating the terminal device 7. The player can instruct the direction in which the dart is shot by the operation of changing the posture of the terminal device 7 and the operation of drawing a line on the touch panel 52. Fig. 24 is a view showing the screens of the television 2 and the terminal device 7 in the first game example. In Fig.
24, game images showing the game space are displayed on the television 2 and on the LCD 51 of the terminal device 7. The dart 121, the control surface 122, and the target 123 are displayed on the television 2, while the control surface 122 is displayed on the LCD 51. In the first game example, the player plays by shooting the dart 121 at the target 123 through operations using the terminal device 7. When shooting the dart 121, the player first changes the posture of the control surface 122 arranged in the virtual game space to a desired posture by operating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and changes the posture of the control surface 122 based on the calculated posture (step S27). In the first game example, the posture of the control surface 122 is controlled so as to correspond to the posture of the terminal device 7 in the actual space. That is, by changing the posture of the terminal device 7, the player can change the posture of the control surface 122 in the game space (and of the control surface 122 displayed on the terminal device 7). In the first game example, the position of the control surface 122 is fixed at a predetermined position in the game space. Next, the player performs an operation of drawing a line on the touch panel 52 using the stylus pen 124 or the like (see the arrow shown in Fig. 24). Here, in the first game example, the LCD 51 of the terminal device 7 displays the control surface 122 such that the input surface of the touch panel 52 corresponds to the control surface 122. Therefore, the direction on the control surface 122 (the direction shown by the drawn line) can be calculated from the line drawn on the touch panel 52. The dart 121 is shot in the direction thus determined.
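The mapping just described, in which touch positions on the touch panel 52 correspond to points on the control surface 122 and the dart flies along the drawn line, can be sketched as follows. The surface size, basis-vector interface, and speed factor are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def touch_to_surface(u, v, center, right, up, half_w, half_h):
    """Map a normalized touch position (u, v in [0, 1], origin top-left)
    to a 3-D point on the control surface. `right` and `up` are the
    surface's unit basis vectors, which track the terminal's posture."""
    return (center
            + (u - 0.5) * 2.0 * half_w * np.asarray(right)
            + (0.5 - v) * 2.0 * half_h * np.asarray(up))

def dart_velocity(stroke, center, right, up, half_w=1.0, half_h=1.0,
                  speed_scale=5.0):
    """Velocity for the dart: the direction of the drawn stroke on the
    control surface, with speed proportional to the stroke's length."""
    p0 = touch_to_surface(*stroke[0], center, right, up, half_w, half_h)
    p1 = touch_to_surface(*stroke[-1], center, right, up, half_w, half_h)
    return (p1 - p0) * speed_scale

# Surface lying in the x-y plane; a stroke from screen centre to the right edge.
v = dart_velocity([(0.5, 0.5), (1.0, 0.5)],
                  center=np.zeros(3), right=(1.0, 0.0, 0.0), up=(0.0, 1.0, 0.0))
```

Because `right` and `up` rotate with the terminal device's posture, the same stroke yields different 3-D directions as the player tilts the device, which is the core of this game example.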
From the above, the CPU 10 calculates the direction on the control surface 122 from the touch position data 100 of the touch panel 52, and performs processing of moving the dart 121 in the calculated direction (step S27). The CPU 10 may, for example, control the speed of the dart 121 in response to the length of the line or the speed at which the line is drawn. As described above, according to the first game example, the game device 3 can move the control surface 122 in response to the movement (posture) of the terminal device 7 by using the output of the inertial sensor as a game input, and can specify the direction on the control surface 122 by using the output of the touch panel 52 as a game input. According to this, the player can move the game image of the terminal device 7 (the image of the control surface 122) and perform a touch operation on that game image, and can therefore play with the novel operation feel of operating the game image directly. Further, in the first game example, a direction in 3-dimensional space can easily be indicated by using the inertial sensor and the touch panel 52 as game inputs. That is, the player actually adjusts the posture of the terminal device 7 with one hand while inputting a line on the touch panel 52 with the other hand, and can therefore indicate the direction by an intuitive operation, as if actually inputting a direction in space. Moreover, the player can perform the operation of the posture of the terminal device 7 and the input operation on the touch panel 52 in parallel, so the operation of indicating a direction in 3-dimensional space can be performed quickly. In the first game example, in order to facilitate the touch input operation on the control surface 122, the terminal device 7 displays the control surface 122 across the entire screen. On the other hand, the television 2 displays an image of the game space including the entire control surface 122 and the target 123 (see Fig. 24), so that the posture of the control surface 122 is easy to grasp and the target 123 is easy to aim at. In the above step S27, the first virtual camera for generating the television game image is set such that the entire control surface 122 and the target 123 are included in the field of view, and the second virtual camera for generating the terminal game image is set such that the display screen of the LCD 51 (the input surface of the touch panel 52) coincides with the control surface 122. Therefore, in the first game example, images of the game space viewed from different viewpoints are displayed on the television 2 and the terminal device 7, making the game operation easier. (Second game example) The game that uses the outputs of the inertial sensor and the touch panel 52 as game inputs is not limited to the first game example described above.
Various game examples can be considered. The second game example, like the first game example, is a game in which an object (a projectile) is fired in the game space by operating the terminal device 7. The player can indicate the direction in which the projectile is fired by the operation of changing the posture of the terminal device 7 and the operation of designating a position on the touch panel 52. Fig. 25 is a view showing the screens of the television 2 and the terminal device 7 in the second game example. In Fig. 25, a cannon 131, a projectile 132, and a target 133 are displayed on the television 2, and the projectile 132 and the target 133 are displayed on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image of the game space viewed from the position of the cannon 131. In the second game example, the player can change the display range displayed on the terminal device 7 as the terminal game image by operating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and controls the position and posture of the second virtual camera for generating the terminal game image based on the calculated posture (step S27). Specifically, the second virtual camera is set at the position of the cannon 131, and its orientation (posture) is controlled in accordance with the posture of the terminal device 7. Thus, the player can change the range of the game space displayed on the terminal device 7 by changing the posture of the terminal device 7. Further, in the second game example, the player specifies the firing direction of the projectile 132 by performing an operation of inputting a point on the touch panel 52 (a touch operation).
Specifically, in the processing of the above step S27, the CPU 10 calculates the position (control position) in the game space corresponding to the touch position, and calculates, as the firing direction, the direction from a predetermined position in the game space (for example, the position of the cannon 131) to the control position. Then, processing of moving the projectile 132 in the firing direction is performed. As described above, while the player performs an operation of drawing a line on the touch panel 52 in the first game example, an operation of designating a point on the touch panel 52 is performed in the second game example. The control position can be calculated by setting a control surface similar to that of the first game example (although the control surface is not displayed in the second game example). That is, the control surface is arranged in accordance with the posture of the second virtual camera so as to correspond to the display range of the terminal device 7 (specifically, the control surface rotates (moves) about the position of the cannon 131 in accordance with changes in the posture of the terminal device 7), whereby the position on the control surface corresponding to the touch position can be calculated as the control position. According to the second game example described above, the game device 3 can change the display range of the terminal game image in response to the movement (posture) of the terminal device 7 by using the output of the inertial sensor as a game input, and can specify a direction in the game space (the firing direction of the projectile 132) by using, as a game input, the touch input designating a position within that display range.
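The control-position calculation just described can be sketched as follows: a non-displayed control surface is held in front of the cannon, oriented like the second virtual camera, the touch point is mapped onto it, and the firing direction runs from the cannon to that control position. The surface size and distance are made-up parameters, and the camera basis vectors stand in for the posture calculated from the terminal device 7.

```python
import numpy as np

def firing_direction(touch_uv, cannon_pos, cam_forward, cam_right, cam_up,
                     surface_dist=10.0, half_w=8.0, half_h=5.0):
    """Unit firing direction from the cannon to the control position that
    corresponds to a normalized touch (u, v in [0, 1], origin top-left)."""
    u, v = touch_uv
    cannon = np.asarray(cannon_pos, dtype=float)
    control_pos = (cannon
                   + surface_dist * np.asarray(cam_forward)    # surface centre
                   + (u - 0.5) * 2.0 * half_w * np.asarray(cam_right)
                   + (0.5 - v) * 2.0 * half_h * np.asarray(cam_up))
    d = control_pos - cannon
    return d / np.linalg.norm(d)

# Touching the centre of the screen fires straight along the camera's forward axis.
d = firing_direction((0.5, 0.5), cannon_pos=(0.0, 0.0, 0.0),
                     cam_forward=(0.0, 0.0, 1.0),
                     cam_right=(1.0, 0.0, 0.0), cam_up=(0.0, 1.0, 0.0))
```

Because the surface is anchored to the camera's basis vectors, rotating the terminal device rotates the whole surface about the cannon, so the same screen touch maps to a different world direction, exactly the behavior the text describes.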
Therefore, in the second game example, as in the first game example, the player can move the game image displayed on the terminal device 7 and perform a touch operation on that game image, and can therefore play with the novel operation feel of operating the game image directly. Further, in the second game example, as in the first game example, the player actually adjusts the posture of the terminal device 7 with one hand while making a touch input on the touch panel 52 with the other hand, and can therefore easily indicate a direction by an intuitive operation, as if actually inputting a direction in space. Moreover, the player can perform the operation of the posture of the terminal device 7 and the input operation on the touch panel 52 in parallel, so the operation of indicating a direction in 3-dimensional space can be performed quickly. In the second game example, the image displayed on the television 2 may be an image from the same viewpoint as that of the terminal device 7, but in Fig. 25 the game device 3 displays an image from a different viewpoint. That is, while the second virtual camera for generating the terminal game image is set at the position of the cannon 131, the first virtual camera for generating the television game image is set at a position behind the cannon 131. Here, for example, by displaying on the television 2 a range that cannot be seen on the screen of the terminal device 7, it is possible to realize a style of play in which the player looks at the screen of the television 2 to aim at a target 133 that cannot be seen on the screen of the terminal device 7. Thus, by making the display ranges of the television 2 and the terminal device 7 different, not only can the state of the game space be grasped more easily, but the fun of the game can be further enhanced. As described above, according to the present embodiment, the terminal device 7 including the touch panel 52 and the inertial sensors can be used as an operation device.

As in the first and second game examples described above, this enables an operation feel of directly operating the game image. (Third game example) A third game example will now be described with reference to Figs. 26 and 27. The third game example is a baseball game in which two players compete against each other. That is, the first player uses the controller 5 to operate a batter, and the second player uses the terminal device 7 to operate a pitcher. The television 2 and the terminal device 7 display game images that are easy for the respective players to operate with. Fig. 26 is a view showing an example of the television game image displayed on the television 2 in the third game example. The television game image is an image mainly for the first player. That is, it shows the game space as seen from the side of the batter (batter object) 141, which is the operation target of the first player, toward the pitcher (pitcher object) 142, which is the operation target of the second player. The first virtual camera for generating the television game image is placed at a position behind the batter 141, in a posture facing from the batter 141 toward the pitcher 142. On the other hand, Fig. 27 is a view showing an example of the terminal game image displayed on the terminal device 7 in the third game example. The terminal game image is an image mainly for the second player. That is, it shows the game space as seen from the side of the pitcher 142, which is the operation target of the second player, toward the batter 141, which is the operation target of the first player. Specifically, in the above step S27, the CPU 10 controls the second virtual camera used for generating the terminal game image based on the posture of the terminal device 7. The posture of the second virtual camera is calculated in accordance with the posture of the terminal device 7, in the same manner as in the second game example described above. In addition, the second

virtual camera's position is fixed at a predetermined position. The terminal game image includes a cursor 143 for indicating the direction in which the pitcher 142 throws the ball. The batter 141 is operated by the first player, and the pitcher 142 is operated by the second player. For example, the CPU 10 detects a swing operation of the controller 5 based on the output data of the inertial sensor of the controller 5, and causes the batter 141 to swing the bat in response to the swing operation. Further, for example, the CPU 10 moves the cursor 143 in accordance with the operation of the analog stick 53, and, when a predetermined key among the operation keys 54 is pressed, causes the pitcher 142 to throw the ball toward the position indicated by the cursor 143. The cursor 143 may also be moved in response to the posture of the terminal device 7 instead of the operation of the analog stick 53. As described above, in the third game example, game images from different viewpoints are displayed on the television 2 and the terminal device 7, thereby providing a game image that is easy to view and easy to operate with for each player. Further, in the third game example, two virtual cameras are set in a single game space, and the two kinds of game images of the game space viewed from the respective virtual cameras (Figs. 26 and 27) are displayed. Therefore, regarding the two kinds of game images generated in the third game example, the game processing performed on the game space (the control of the objects in the game space, etc.) is mostly common, and each game image can be generated merely by drawing the common game space twice, which has the advantage of higher processing efficiency than when that game processing is performed separately for each image.
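The cursor control described for the pitcher can be sketched as moving a screen position by the analog-stick deflection each frame, clamped to the screen. The speed and screen size are illustrative numbers, not values from the patent.

```python
def move_cursor(cursor, stick, dt, speed=300.0, screen=(854.0, 480.0)):
    """Move the pitch cursor by the analog-stick deflection.

    cursor: current (x, y) position in screen pixels
    stick:  deflection of the analog stick, each axis in [-1, 1]
    speed:  full-deflection cursor speed in pixels per second
    """
    x = min(max(cursor[0] + stick[0] * speed * dt, 0.0), screen[0])
    y = min(max(cursor[1] + stick[1] * speed * dt, 0.0), screen[1])
    return (x, y)

moved = move_cursor((100.0, 100.0), (1.0, 0.0), dt=0.1)    # 30 px to the right
clamped = move_cursor((850.0, 100.0), (1.0, 0.0), dt=1.0)  # stops at the edge
```

The alternative mentioned in the text, moving the cursor from the terminal device's posture instead, would simply replace the `stick` argument with an offset derived from the calculated posture.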
Further, in the third game example, since the cursor 143 indicating the pitching direction is displayed only on the terminal device 7, the first player cannot see the position indicated by the cursor 143. Therefore, the game does not suffer the disadvantage to the second player that would arise if the first player knew the pitching direction. As described above, in the present embodiment, when one player's seeing a game image would cause a disadvantage to the other player, that game image should be displayed on the terminal device 7. This prevents a reduction in the strategic depth of the game. In other embodiments, depending on the content of the game (for example, when no disadvantage arises even if the terminal game image is seen by the first player), the game device 3 may display the terminal game image on the television 2 together with the television game image. (Fourth game example) A fourth game example will be described below with reference to Figs. 28 and 29. The fourth game example is a shooting game in which two players cooperate. That is, the first player uses the controller 5 to perform an operation of moving an aircraft, and the second player uses the terminal device 7 to perform an operation of controlling the firing direction of the aircraft's cannon. In the fourth game example, as in the third game example, a game image that is easy for each player to operate with is displayed on the television 2 and the terminal device 7. Fig. 28 is a view showing an example of the television game image displayed on the television 2 in the fourth game example. Fig. 29 is a view showing an example of the terminal game image displayed on the terminal device 7 in the fourth game example. As shown in Fig. 28, in the fourth game example, the aircraft (aircraft object) 151 and the target (balloon object) 153 appear in the virtual game space.
In addition, the airplane 151 has a cannon (cannon object) 152.

As shown in Fig. 28, an image of the game space containing the airplane 151 is displayed as the television game image. The first virtual camera for generating the television game image is set so as to produce an image of the game space in which the airplane 151 is viewed from behind. That is, the first virtual camera is placed at a position behind the airplane 151, in a posture such that the airplane 151 is included in its imaging range (field of view). Further, the first virtual camera is controlled so as to move along with the movement of the airplane 151. That is, in the game control processing described above, the CPU 10 controls the movement of the airplane 151 and also controls the position and posture of the first virtual camera. In this way, the position and posture of the first virtual camera are controlled by the operation of the first player. On the other hand, as shown in Fig. 29, an image of the game space viewed from the airplane 151 (more specifically, from the cannon 152) is displayed as the terminal game image. Therefore, the second virtual camera for generating the terminal game image is set at the position of the airplane 151 (more specifically, the position of the cannon 152). In the game control processing described above, the CPU 10 controls the movement of the airplane 151 in accordance with the controller operation data, and also controls the position of the second virtual camera. The second virtual camera may also be placed at a position around the airplane 151 or the cannon 152 (for example, a position slightly behind the cannon 152). As described above, the position of the second virtual camera is determined by the operation of the first player who operates the airplane 151; therefore, in the fourth game example, the first virtual camera and the second virtual camera move in conjunction with each other. Further, the game space viewed in the firing direction of the cannon 152 is displayed as the terminal game image, and the firing direction of the cannon 152 is controlled so as to correspond to the posture of the terminal device 7.
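The follow behavior of the first virtual camera, which trails the airplane as the first player moves it, can be sketched as a fixed-offset follow camera. The offset value and function name are assumptions for illustration, not taken from the patent.

```python
def follow_camera(airplane_pos, offset=(0.0, 2.0, -8.0)):
    # The first virtual camera sits at a fixed offset behind and above
    # the airplane 151, so moving the airplane also moves the camera.
    return (airplane_pos[0] + offset[0],
            airplane_pos[1] + offset[1],
            airplane_pos[2] + offset[2])

cam_pos = follow_camera((10.0, 5.0, 100.0))  # camera trails the airplane
```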
That is, in the present embodiment, the second virtual camera is set so that its line-of-sight direction coincides with the firing direction of the cannon 152. In the processing of the above-described step S64, the posture of the terminal device 7 is calculated, and the orientation of the cannon 152 and the posture of the second virtual camera are controlled in accordance with it. In this way, the posture of the second virtual camera is controlled by the operation of the second player, and the second player can change the firing direction of the cannon 152 by changing the posture of the terminal device 7. When firing a shell from the cannon 152, the second player presses a predetermined button of the terminal device 7. When the predetermined button is pressed, a shell is fired in the direction in which the cannon 152 is oriented. In the fourth game example, the first player mainly moves the airplane 151 while viewing the television game image (Fig. 28) that displays the game space viewed in the traveling direction of the airplane 151 (for example, moving it toward a desired target 153). On the other hand, the second player operates the cannon 152 while mainly viewing the terminal game image (Fig. 29) that displays the game space viewed in the firing direction of the cannon 152. Thus, in the fourth game example, in a game of a form in which two players cooperate, game images that are respectively easy to view and easy to operate with are displayed on the television 2 and the terminal device 7. Further, in the fourth game example, the positions and postures of the first virtual camera and the second virtual camera are controlled by the operations of the first player and the second player; that is, the players' operations change the positions and postures of the virtual cameras, and hence the display range of the game space displayed on each display device.
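The posture-to-direction control just described, in which the terminal device's posture determines both the cannon's firing direction and the second virtual camera's line of sight, might be sketched as follows. The yaw/pitch convention, the function name, and degrees as the unit are assumptions for illustration only.

```python
import math

def cannon_direction(yaw_deg, pitch_deg):
    # Hypothetical mapping from the terminal device's calculated posture
    # (yaw and pitch in degrees) to a unit firing-direction vector for
    # the cannon 152.
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

# The second virtual camera's line of sight is set to the same vector,
# so the terminal screen always shows the space in the firing direction.
firing = cannon_direction(0.0, 0.0)   # device held level, facing forward
camera_forward = firing
```

Tilting the terminal up (increasing pitch) raises the vector, so both the cannon and the on-screen view sweep upward together, which is what makes the terminal image "easy to operate with" for the second player.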
Since the display range of the game space displayed on each display device changes in accordance with the players' operations, each player can feel in real time that the display range changes in response to his or her own game operation. In the fourth game example, the television 2 displays the game image viewed from behind the airplane 151, and the terminal device 7 displays the game image viewed from the position of the cannon 152 of the airplane 151. Here, in another game example, the terminal device 7 may display the game image viewed from behind the airplane 151, and the television 2 may display the game image viewed from the position of the cannon 152 of the airplane 151. In that case, the roles of the players are reversed from the fourth game example described above: the first player uses the controller 5 to operate the cannon 152, and the second player uses the terminal device 7 to operate the airplane 151.

(Fifth game example) A fifth game example will be described below with reference to Fig. 30. In the fifth game example, the player performs game operations using the controller 5, and the terminal device 7 is used not as an operation device but as a display device. Specifically, the fifth game example is a golf game: in response to the player swinging the controller 5 like a golf club, the game device 3 causes a player character in the virtual game space to perform a golf swing.

Fig. 30 shows how the game system 1 is used in the fifth game example. In Fig. 30, an image of the game space including a player character (object) 161 and a golf club (object) 162 is displayed on the screen of the television 2. Although hidden behind the golf club 162 and not shown in the figure, a ball (object) 163 placed in the game space is also displayed on the television 2. On the other hand, as shown in Fig. 30, the terminal device 7 is placed on the floor on the front side of the television 2 with the screen of the LCD 51 facing vertically upward. The terminal device 7 displays an image of the ball 163, an image 164 showing a part of the golf club 162 (specifically, the head 162a of the golf club 162), and an image showing the ground of the game space. The terminal game image is an image of the surroundings of the ball viewed from above. When playing the game, a player 160 stands in the vicinity of the terminal device 7 and swings the controller 5 like a golf club. At this time, the CPU 10 controls the position of the golf club 162 in the game space in accordance with the posture of the controller 5 calculated by the processing of the above-described step S23. Specifically, the golf club 162 is controlled so that, when the front end direction of the controller 5 is directed toward the image of the ball 163 displayed on the LCD 51, the golf club 162 in the game space hits the ball 163 (refer to Fig. 30). Regarding the terminal game image, in order to increase the sense of presence, the image of the ball 163 may be displayed at a somewhat enlarged size, and the orientation of the head image 164 may be rotated in response to the rotation of the controller 5 about the Z-axis. Further, the terminal game image may be generated using a virtual camera installed in the game space, or may be generated using image data prepared in advance.
When the terminal game image is generated using image data prepared in advance, a detailed and realistic image can be generated with a small processing load, without constructing a detailed terrain model of the golf course.

When the player 160 performs the above-described swing operation to swing the golf club 162, and as a result the golf club 162 hits the ball 163, the ball 163 moves (flies out). That is, the CPU 10 determines in the above-described step S27 whether the golf club 162 has come into contact with the ball 163, and moves the ball 163 when there is contact. Here, the television game image is generated so as to include the ball 163 after it has moved. That is, the CPU 10 controls the position and posture of the first virtual camera used for generating the television game image so that the moving ball is included in its imaging range. On the other hand, on the terminal device 7, when the golf club 162 hits the ball 163, the image of the ball 163 moves and soon disappears off the screen. Therefore, in the fifth game example, the movement of the ball is displayed mainly on the television 2, and the player 160 can confirm on the television screen where the ball driven by the swing operation has gone. As described above, in the fifth game example, the player 160 can swing the golf club 162 (make the player character 161 swing the golf club 162) by swinging the controller 5. Further, in the fifth game example, when the front end direction of the controller 5 is directed toward the image of the ball 163 displayed on the LCD 51, it is controlled so that the golf club 162 in the game space hits the ball 163. Therefore, the player can obtain the feeling of actually swinging a golf club through the swing operation, making the swing operation feel more realistic.
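The step-S27-style contact determination and the resulting movement of the ball can be sketched as below. The 2-D simplification (positions in the plane of the terminal screen), the hit radius, and all names are assumptions for illustration; the patent does not specify the collision test.

```python
def club_hits_ball(head_pos, ball_pos, radius=0.05):
    # Treat the club head and the ball as points and declare contact
    # when they are within a small radius of each other.
    dx = head_pos[0] - ball_pos[0]
    dy = head_pos[1] - ball_pos[1]
    return dx * dx + dy * dy <= radius * radius

ball = [0.0, 0.0]
if club_hits_ball((0.01, 0.0), ball):   # front end directed at the ball image
    ball[0] += 2.0                      # the ball flies out of the terminal screen
```

Once the ball's position leaves the small region shown on the terminal device, only the television camera (which tracks the moving ball) keeps it in view, matching the division of roles described above.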

Further, in the fifth game example, the head image 164 is displayed on the LCD 51 when the front end direction of the controller 5 faces the terminal device 7. Therefore, by directing the front end direction of the controller 5 toward the terminal device 7, the player can obtain the feeling that the posture of the golf club 162 in the virtual space corresponds to the posture of the controller 5 in the real space, which makes the swing operation feel even more realistic. As described above, in the fifth game example, when the terminal device 7 is used as a display device, the operation using the controller 5 can be made more realistic by arranging the terminal device 7 at an appropriate position. In the fifth game example described above, the terminal device 7 is placed on the floor and displays only an image of the game space around the ball 163. Therefore, the position and posture of the whole golf club 162 in the game space cannot be displayed on the terminal device 7, and how the ball 163 travels after it moves cannot be displayed on the terminal device 7 either. Accordingly, in the fifth game example, before the ball 163 moves, the whole golf club 162 is displayed on the television 2, and after the ball 163 moves, the television 2 displays how the ball 163 travels. In this way, according to the fifth game example, a realistic operation can be provided to the player, and game images that are easy to view can be presented to the player using the television 2 and the terminal device 7. Further, in the fifth game example, in order to calculate the posture of the controller 5 using the indicator portion 55 of the terminal device 7, the indicator portion 55 is lit in the initial processing (and the pointing device 6 is not lit), and the CPU 10 calculates the posture of the controller 5 based on the marker coordinate data in the above-described step S23. According to this, it is possible to accurately determine whether or not the front end direction of the controller 5 is in a posture facing the indicator portion 55.
In the fifth game example described above, the above-described steps S21 and S22 need not be executed; however, in other game examples, the marker to be lit may be changed in the middle of the game by executing the processing of steps S21 and S22. For example, in step S21, the CPU 10 determines based on the acceleration data 94 whether or not the front end direction of the controller 5 faces the direction of gravity, and in step S22, control is performed so that the indicator portion 55 is lit when the front end direction faces the direction of gravity, and the pointing device 6 is lit when it does not. According to this, when the front end direction of the controller 5 faces the direction of gravity, the posture of the controller 5 can be accurately calculated by acquiring the marker coordinate data obtained from the indicator portion 55, and when the front end direction of the controller 5 faces the television 2, the posture of the controller 5 can be accurately calculated by acquiring the marker coordinate data obtained from the pointing device 6. As described in the fifth game example above, the game system 1 can place the terminal device 7 at a free position and use it as a display device. Thus, when the marker coordinate data is used as a game input, the controller 5 can be used facing a free direction by setting the terminal device 7 at a desired position, instead of the controller 5 being usable only toward the television 2. That is, according to the present embodiment, the orientation in which the controller 5 can be used is not restricted, so that the degree of freedom of operation of the controller 5 can be improved.

[7. Other operation examples of the game system] The game system 1 described above can perform operations for playing various games. The terminal device 7 can be used as a portable display or a second display, and can also be used as a controller for making touch input or motion-based input, so that the game system 1 can realize a wide variety of games. Operations other than games, including the following, are also possible. The following describes an example of operation in which a game is played using only the terminal device 7.
Since the terminal device 7 has the function of a display device, it can be used as display means in place of the television 2; that is, it is possible to play a game using only the terminal device 7, without using the television 2 and the controller 5. Specifically, in the game processing described above, the CPU 10 acquires the terminal operation data 97 from the terminal device 7 in step S3, and in step S4 performs game processing using only the terminal operation data 97 as the game input (without using the controller operation data). Then, a game image is generated in step S6, and the game image is transmitted to the terminal device 7 in step S10. At this time, the steps relating to the controller 5 and the television 2 (such as step S2) need not be performed. According to the above, game processing is performed in response to operations on the terminal device 7, and the game image representing the result of the game processing is displayed on the terminal device 7.

Therefore, according to the present embodiment, even when the television 2 cannot be used, for example because someone else is watching a television broadcast, the user can play a game using the terminal device 7. The game device 3 may transmit to the terminal device 7 for display not only game images but also, for example, the menu screen displayed after the power is turned on. This is convenient because the player can then play a game without using the television 2 from the beginning. Further, the display device on which the game image is displayed can be changed from the terminal device 7 to the television 2 in the middle of the game. Specifically, the CPU 10 outputs the game image to the television 2 by executing the above-described step S9. The image output to the television 2 in step S9 is the same as the game image transmitted to the terminal device 7 in step S10. According to this, by switching the input of the television 2 so that the input from the game device 3 is displayed, the same game image as that on the terminal device 7 is displayed on the television 2, and the display device on which the game image is displayed can thus be changed to the television 2. After the game image is displayed on the television 2, the screen display of the terminal device 7 may be turned off. In the game system 1, an infrared remote control signal for the television 2 may be output from infrared output means (the pointing device 6, the indicator portion 55, or the infrared communication module 82). According to this, the game device 3 can perform an operation on the television 2 by causing the infrared output means to output the infrared remote control signal in response to an operation on the terminal device 7, without the user operating the television's own remote control.
In this case, the user can operate the television 2 using the terminal device 7, which is convenient in situations such as switching the input of the television 2 as described above.

(Operation example of communicating with another device via the network) As described above, since the game device 3 has a function of connecting to a network, the game system 1 can also be used in applications in which it communicates with an external device via the network. Fig. 31 is a diagram showing the connections of the devices included in the game system 1 when connected to an external device via the network. As shown in Fig. 31, the game device 3 can communicate with an external device 191 via a network 190. When the external device 191 and the game device 3 can communicate in this manner, the game system 1 can use the terminal device 7 as an interface to communicate with the external device 191. For example, the game system 1 can be used as a videophone by exchanging images and sounds between the external device 191 and the terminal device 7. Specifically, the game device 3 receives images and sounds (the image and sound of the person on the other end of the call) from the external device 191 via the network 190, and transmits the received images and sounds to the terminal device 7. Thereby, the terminal device 7 displays the image from the external device 191 on the LCD 51, and outputs the sound from the external device 191 from the speaker 77. Further, the game device 3 receives from the terminal device 7 the camera image captured by the camera 56 and the microphone sound detected by the microphone 79, and transmits the camera image and the microphone sound to the external device 191 via the network 190. The game device 3 can use the game system 1 as a videophone by repeating the above exchange of images and sounds with the external device 191.
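The bidirectional relay the game device repeats in the videophone use can be sketched as follows. The list-based queues stand in for the network 190 and the wireless link to the terminal device; all names are illustrative assumptions, not the patent's implementation.

```python
def relay_frame(from_external, to_terminal, from_terminal, to_external):
    # One iteration of the exchange the game device repeats:
    # partner's image/sound -> LCD 51 / speaker 77 on the terminal,
    # camera 56 image / microphone 79 sound -> external device 191.
    to_terminal.append(from_external.pop(0))
    to_external.append(from_terminal.pop(0))

incoming = [("partner_image", "partner_sound")]   # received via network 190
outgoing = [("camera_image", "mic_sound")]        # received from terminal device 7
to_terminal, to_external = [], []
relay_frame(incoming, to_terminal, outgoing, to_external)
```

The game device itself only forwards data in both directions; capture happens on the terminal device and display/output happens on the terminal device as well, which is why the terminal can serve as the whole user-facing interface of the videophone.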
In the present embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at a free position and point the camera 56 in a free direction. Further, in the present embodiment, since the terminal device 7 is provided with the touch panel 52, the game device 3 can also transmit input information made on the touch panel 52 (the touch position data 100) to the external device 191. For example, when images and sounds from the external device 191 are output on the terminal device 7 while characters and the like written by the user on the touch panel 52 are transmitted to the external device 191, the game system 1 can also be used as a so-called e-learning system (electronic learning system).

(Example of operation in conjunction with television broadcasting) The game system 1 can also operate in conjunction with a television broadcast while the broadcast is being viewed on the television 2. That is, when a television program is being viewed on the television 2, the game system 1 can output information related to the television program and the like to the terminal device 7. An example of the operation when the game system 1 operates in conjunction with television broadcasting will be described below. In this operation example, the game device 3 can communicate with a server via the network (in other words, the external device 191 shown in Fig. 31 is a server). The server stores, for each channel of the television broadcast, various kinds of information related to the broadcast (television information). The television information may be program-related information such as subtitles or performer information, EPG (electronic program guide) information, or information broadcast as a data broadcast. The television information may be images, sounds, text, or a combination of these.
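The per-channel lookup against such a server might be sketched as below. The server is modeled as a plain dictionary keyed by channel number, and the request would in reality travel over the network 190; the names and data layout are illustrative assumptions.

```python
# Hypothetical store of television information on the server side,
# keyed by broadcast channel number.
SERVER_TV_INFO = {
    4: {"subtitles": "on", "epg": "News 18:00-19:00"},
    8: {"subtitles": "off", "epg": "Drama 21:00-22:00"},
}

def request_tv_info(channel):
    # The game device asks the server for television information on the
    # channel the user entered on the terminal device; an empty result
    # means the server has nothing stored for that channel.
    return SERVER_TV_INFO.get(channel, {})

info = request_tv_info(8)   # user reports watching channel 8
```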
Further, the server need not be a single device; a server may be provided for each channel or each program of the television broadcast, and the game device 3 may communicate with each server. When the video and sound of a television broadcast are being output on the television 2, the game device 3 causes the user to input, using the terminal device 7, the channel of the television broadcast being viewed. The game device 3 then requests the server via the network to transmit the television information corresponding to the input channel. In response to this, the server transmits data of the television information corresponding to that channel. Upon receiving the data transmitted from the server, the game device 3 outputs the received data to the terminal device 7. The terminal device 7 displays the image and text data out of the received data on the LCD 51, and outputs the sound data from the speaker. In this way, the user can enjoy information related to the television program currently being watched, and the like, using the terminal device 7. As described above, by the game system 1 communicating with an external device (server) via the network, information linked to the television broadcast can be provided to the user via the terminal device 7. In particular, in the present embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at a free position, which is highly convenient. As described above, in the present embodiment, the user can use the terminal device 7 in various uses and forms in addition to playing games.

[8. Modifications] The embodiment described above is one example of carrying out the present invention; in other embodiments, the present invention can also be carried out with, for example, the configurations described below.
(Modification having a plurality of terminal devices) In the above embodiment, the game system 1 is configured to have one terminal device, but the game system 1 may be configured to have a plurality of terminal devices. That is, the game device 3 may be capable of wireless communication with each of a plurality of terminal devices, transmitting game image data, game sound data, and control data to each terminal device, and receiving operation data, camera image data, and microphone sound data from each terminal device. When performing wireless communication with each of the plurality of terminal devices, the game device 3 may carry out the wireless communication with the terminal devices in a time-division manner, or may divide the frequency band among them. When there are a plurality of terminal devices as described above, a wider variety of games can be played using the game system. For example, when the game system 1 has two terminal devices, the game system 1 has three display devices, so that game images for each of three players can be generated and displayed on the respective display devices. Further, when the game system 1 has two terminal devices, two players can simultaneously play a game in which a controller and a terminal device are used as one set (for example, the fifth game example described above). Furthermore, when the game processing of the above-described step S27 is performed based on the marker coordinate data output from two controllers, each of the two players can perform the game operation of pointing the controller toward a marker (the pointing device 6 or the indicator portion 55). That is, one player performs the game operation with the controller pointed toward the pointing device 6, and the other player performs the game operation with the controller pointed toward the indicator portion 55.
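The time-division communication mentioned above can be realized, for example, by a simple round-robin schedule in which the game device talks to exactly one terminal device per transmission slot. This scheme and the names are assumptions for illustration; the patent does not fix a particular scheduling method.

```python
def terminal_for_slot(slot, num_terminals):
    # Round-robin: slot 0 -> terminal 0, slot 1 -> terminal 1, ...,
    # wrapping around so every terminal device gets a regular turn.
    return slot % num_terminals

# With two terminal devices, transmission slots alternate between them.
schedule = [terminal_for_slot(s, 2) for s in range(4)]
```

Frequency-band division, the alternative the text mentions, would instead give each terminal its own channel so that all of them can communicate in every slot.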
(Modification regarding the functions of the terminal device) In the above embodiment, the terminal device 7 functions as a so-called thin client terminal that does not execute game processing. Here, in other embodiments, a part of the series of game processing executed by the game device 3 in the above embodiment may be executed by another device such as the terminal device 7. For example, a part of the processing (for example, the generation processing of the terminal game image) may be executed by the terminal device 7. That is, the terminal device may also function as a portable game device that performs game processing based on operations on the operation section, generates a game image in accordance with the game processing, and displays it on the display section. Further, for example, in a game system having a plurality of information processing devices (game devices) capable of communicating with each other, the plurality of information processing devices may share the execution of the game processing.

(Modification regarding the configuration of the terminal device) The terminal device of the above embodiment is one example; the shape of each operation key and the housing 50 of the terminal device, the number of components, their installation positions, and the like are merely examples, and other shapes, numbers, and installation positions are possible. For example, the terminal device may have the configuration shown below. A modification of the terminal device will be described below with reference to Figs. 32 to 35. Fig. 32 is a view showing the external configuration of a terminal device according to a modification of the above embodiment. In Fig. 32, (a) is a front view of the terminal device, (b) is a plan view, (c) is a right side view, and (d) is a bottom view. Fig. 33 is a view showing a state in which a user holds the terminal device.
In Figs. 32 and 33, constituent elements corresponding to those of the terminal device 7 of the above embodiment are denoted by the same reference numerals as in Fig. 8, but they need not necessarily be configured identically. As shown in Fig. 32, the terminal device 8 includes a housing 50 having a roughly plate-like shape that is long in the lateral direction. The housing 50 is of a size that the user can hold; therefore, the user can hold and move the terminal device 8 and change the arrangement position of the terminal device 8. The terminal device 8 has an LCD 51 on the surface of the housing 50, and the LCD 51 is provided near the center of the surface of the housing 50. Therefore, by holding the housing 50 at the portions on both sides of the LCD 51, the user can move while holding the terminal device and viewing the screen of the LCD 51, as shown in Fig. 9. Fig. 9 shows the user holding the terminal device 8 sideways (in a horizontally long orientation) by holding the housing 50 at the portions on the left and right sides of the LCD 51, but the user can also hold the terminal device 8 lengthwise (in a vertically long orientation). As shown in Fig. 32(a), the terminal device 8 has a touch panel 52 on the screen of the LCD 51 as operation means (an operation section). In this modification, the touch panel 52 is a resistive touch panel; however, the touch panel is not limited to the resistive type, and a touch panel of any type, such as a capacitive type, may be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In this modification, a touch panel having the same resolution (detection accuracy) as the resolution of the LCD 51 is used as the touch panel 52; however, the resolution of the touch panel 52 need not necessarily coincide with the resolution of the LCD 51.
Input to the touch panel 52 is usually made with a stylus; however, input is not limited to the stylus, and the user can also operate the touch panel 52 with a finger. The housing 50 may be provided with a receiving hole for accommodating the stylus used for operating the touch panel 52. Since the terminal device 8 thus has the touch panel 52, the user can operate the touch panel 52 while moving the terminal device 8. That is, the user can move the screen of the LCD 51 while making input directly on the screen (via the touch panel 52). As shown in Fig. 32, the terminal device 8 includes, as operation means (operation sections), two analog sticks 53A and 53B and a plurality of operation keys 54A to 54L. The analog sticks 53A and 53B are devices for indicating directions. Each of the analog sticks 53A and 53B is configured such that its stick portion, operated by the user's finger, can be slid or tilted in an arbitrary direction (at an arbitrary angle up, down, left, right, or oblique) with respect to the surface of the housing 50. The left analog stick 53A and the right analog stick 53B are provided on the left and right sides of the screen of the LCD 51, respectively; therefore, the user can make a direction-indicating input using an analog stick with either the left or right hand. Further, as shown in Fig. 33, the analog sticks 53A and 53B are provided at positions where the user can operate them while holding the left and right portions of the terminal device 8, so the user can easily operate the analog sticks 53A and 53B even when holding and moving the terminal device 8. The operation keys 54A to 54L are operation means for making predetermined inputs. As shown below, the operation keys 54A to 54L are provided at positions where the user can operate them while holding the left and right portions of the terminal device 8; therefore, the user can easily operate these operation means even when holding and moving the terminal device 8. As shown in Fig. 32(a), among the operation keys 54A to 54L, a cross key (direction input key) 54A and keys 54B to 54H are provided on the surface of the housing 50. That is, these keys are disposed at positions operable with the user's thumbs (see Fig. 33). The cross key 54A is provided on the left side of the LCD 51 and on the lower side of the left analog stick 53A; that is, the cross key 54A is disposed at a position operable with the user's left hand. The cross key 54A has a cross shape and is a key capable of indicating the up, down, left, and right directions. The keys 54B to 54D are provided on the lower side of the LCD 51; these three keys 54B to 54D are disposed at positions operable with either the left or right hand. Further, the four keys 54E to 54H are provided on the right side of the LCD 51 and on the lower side of the right analog stick 53B; that is, the four keys 54E to 54H are disposed at positions operable with the user's right hand. In addition, the four keys 54E to 54H are arranged so as to be in an up, down, left, and right positional relationship (with respect to the center position of the four keys 54E to 54H). Therefore, the terminal device 8 can also use the four keys 54E to 54H as keys for the user to indicate the up, down, left, and right directions.

Further, as shown in (a), (b), and (c) of Fig. 32, a first L key 54I and a first R key 54J are provided at obliquely upper portions of the housing 50 (the upper left portion and the upper right portion). Specifically, the first L key 54I is provided at the left end of the upper side surface of the plate-shaped housing 50, and is exposed on the upper side surface and the left side surface. The first R key 54J is provided at the right end of the upper side surface of the housing 50, and is exposed on the upper side surface and the right side surface. In this way, the first L key 54I is disposed at a position operable with the user's left index finger, and the first R key 54J is disposed at a position operable with the user's right index finger (refer to Fig. 9). Further, as shown in (b) and (c) of Fig. 32, a second L key 54K and a second R key 54L are disposed on foot portions 59A and 59B provided so as to protrude from the back surface of the plate-shaped housing 50 (that is, the surface opposite to the surface on which the LCD 51 is provided). Similarly to the eaves portion 59 of the above embodiment, each of the foot portions 59A and 59B is provided in a region including positions on the opposite side of the operation sections (the analog sticks 53A and 53B) provided on the left and right sides of the display section. The second L key 54K is provided slightly toward the upper side in the left portion of the back surface of the housing 50 (the left side as viewed from the front side), and the second R key 54L is provided slightly toward the upper side in the right portion of the back surface of the housing 50 (the right side as viewed from the front side). In other words, the second L key 54K is disposed at a position roughly opposite to the left analog stick 53A provided on the front surface, and the second R key 54L is disposed at a position roughly opposite to the right analog stick 53B provided on the front surface.
In this way, the second L key 54K is disposed at a position operable with the user's left middle finger, and the second R key 54L is disposed at a position operable with the user's right middle finger (refer to Fig. 9). Further, as shown in Fig. 32(c), the second L key 54K and the second R key 54L are provided on the obliquely upward facing surfaces of the foot portions 59A and 59B, and have key surfaces that face obliquely upward. Since it is assumed that the user's middle fingers move in the up-down direction when holding the terminal device 8, directing the key surfaces upward makes it easy for the user to press the second L key 54K and the second R key 54L. Furthermore, providing the foot portions on the back surface of the housing 50 makes it easy for the user to hold the housing 50, and providing keys on the foot portions makes it easy to operate the device while holding the housing 50. Regarding the terminal device 8 shown in Fig. 32, since the second L key 54K and the second R key 54L are provided on the back surface, the screen may not be completely horizontal when the terminal device 8 is placed down with the screen of the LCD 51 (the surface of the housing 50) facing upward. Therefore, in other embodiments, three or more foot portions may be provided on the back surface of the housing 50. According to this, with the screen of the LCD 51 facing upward, the terminal device 8 can be placed on a floor surface with the foot portions in contact with the floor, so that the terminal device 8 can be placed horizontally. Alternatively, a detachable foot portion may be added so that the terminal device 8 can be placed horizontally. The operation keys 54A to 54L are appropriately assigned functions in accordance with the game program. For example, the cross key 54A and the keys 54E to 54H may be used for direction-indicating operations or selection operations, and the keys 54B to 54E may be used for decision operations or cancel operations.
Although not shown, the terminal device 8 has a power button for turning on/off the power of the terminal device 8. The terminal device 8 may also have a button for turning on/off the screen display of the LCD 51, a button for performing a connection setting (pairing) with the game device 3, and a button for adjusting the volume of the speaker (the speaker 77 shown in Fig. 10). As shown in (a) of Fig. 32, the terminal device 8 has, on the front surface of the outer cover 50, a marker portion (the marker portion 55 shown in Fig. 10) composed of a marker 55A and a marker 55B. The marker portion 55 is provided on the upper side of the LCD 51. Each of the markers 55A and 55B is composed of one or more infrared LEDs, like the markers of the marker device 6. The marker portion 55, like the above-described marker device 6, is used to cause the game device 3 to calculate the movement of the controller 5 and the like. Further, the game device 3 can control the lighting of each of the infrared LEDs provided in the marker portion 55. The terminal device 8 has a camera 56 as imaging means. The camera 56 includes an imaging element having a predetermined resolution (for example, a CCD image sensor or a CMOS image sensor) and a lens. As shown in Fig. 32, the camera 56 is provided on the front surface of the outer cover 50. Therefore, the camera 56 can capture the face of the user holding the terminal device 8, for example, the user playing the game while looking at the LCD 51. The terminal device 8 has a microphone (the microphone 79 shown in Fig. 10) as sound input means. A microphone hole 50c is provided in the front surface of the outer cover 50. The microphone 79 is provided inside the outer cover 50 behind the microphone hole 50c, and detects the user's voice and other sounds around the terminal device 8. The terminal device 8 has a speaker (the speaker 77 shown in Fig. 10) as sound output means.
As shown in (d) of Fig. 32, speaker holes 57 are provided in the lower side surface of the outer cover 50, and the output sound of the speaker 77 is output from the speaker holes 57. In the present modification, the terminal device 8 has two speakers, and speaker holes 57 are provided at each of the positions of the left speaker and the right speaker. Further, the terminal device 8 has an expansion connector 58 for connecting another device to the terminal device 8. In the present modification, as shown in Fig. 32, the expansion connector 58 is provided in the lower side surface of the outer cover 50. Any device may be connected to the expansion connector 58, for example, a controller used in a specific game (a gun-type controller, etc.) or an input device such as a keyboard. If there is no need to connect another device, the expansion connector 58 may not be provided. For the terminal device 8 shown in Fig. 32, the shapes of the operation keys and the outer cover, the number of components, the installation positions, and the like are merely examples, and other shapes, numbers, and installation positions may be used. As described above, in the above-described modification, the two leg portions 59A and 59B provided at left and right positions on the back surface of the outer cover 50 are provided as projecting portions. In this case, similarly to the above-described embodiment, the user can easily hold the terminal device 8 with the ring finger or the middle finger hooked on the lower surface of the projecting portions (see Fig. 33). Further, similarly to the above embodiment, since the second L key 54K and the second R key 54L are provided on the upper surfaces of the projecting portions, the user can easily operate these keys in the above state. In the above-described embodiment and modification, the projecting portion is preferably provided on the back side of the outer cover so as to project at least at left and right positions above the center of the outer cover.
According to this, when the user grips the left and right sides of the outer cover, the projecting portion can be hooked on the fingers, and the terminal device can be easily held. Further, since the projecting portion is provided on the upper side, the user can also support the outer cover with the palm (see Fig. 10 and the like), and can grip the operation device firmly. The projecting portion does not have to be provided above the center of the outer cover. For example, when operation portions are provided on the left and right sides of the display portion, the projecting portion may be provided at a position where it can be hooked on any finger other than the thumbs in a state where the user holds the outer cover so that the operation portions can each be operated with a thumb. Thereby, the user can easily hold the terminal device by hooking a finger on the projecting portion. Fig. 34 and Fig. 35 show the external configuration of a terminal device according to another modification of the above embodiment. Fig. 34 is a right side view of the terminal device, and Fig. 35 is a plan view. The terminal device 9 shown in Figs. 34 and 35 is the same as the terminal device 7 in the above embodiment except that the convex portions 230a and 230b are provided. The configuration of the terminal device 9 in the present modification will be described below, focusing on the differences from the above-described embodiment. The convex portions 230a and 230b each have a convex cross section and are provided on the left and right sides of the back surface of the outer cover 50, respectively. Here, the convex portion 230a is provided on the left side of the outer cover 50 (the left side when viewed from the front side), and the convex portion 230b is provided on the right side of the outer cover 50 (the right side when viewed from the front side). As shown in Fig. 35, the convex portions 230a and 230b are provided on the left and right sides (both ends) of the outer cover 50.
In addition, each of the convex portions 230a and 230b is disposed below the projecting portion (the eaves portion 59), and is provided at an interval from the projecting portion. That is, in the outer cover 50, the portion between the convex portions 230a and 230b and the projecting portion is configured to be thinner than either of them. Each of the convex portions 230a and 230b extends in the vertical direction and has a convex cross section in the plane perpendicular to the vertical direction. In the present modification, the user can hold the terminal device 9 firmly by gripping the convex portions 230a and 230b with the little finger (and the ring finger). That is, the convex portions 230a and 230b have the function of grip portions. The convex portion (grip portion) may have any shape, but when formed so as to extend in the vertical direction, the terminal device 9 is easy to hold. The height of each of the convex portions 230a and 230b may be any height, and they may be formed lower than the projecting portion. According to this, in a state where the terminal device 9 is placed with the face of the LCD 51 facing upward, the lower side of the screen is lower than the upper side, so that the terminal device 9 can be placed in an easily viewable state. Further, since the convex portions 230a and 230b are provided at an interval from the projecting portion, the user can hold the terminal device 9 with the fingers resting against the lower surface of the projecting portion, and the convex portions do not obstruct those fingers. As described above, according to the above modification, by providing the convex portions below the projecting portion, the user can hold the terminal device firmly. In other embodiments, the projecting portion may not be provided on the back surface of the outer cover 50.
In this case, the user can hold the outer cover 50 firmly by means of the convex portions (grip portions). The surface of the convex portions (grip portions) may be made of a non-slip material in order to enhance the gripping function. Even when no convex portions are provided, the material of the back surface of the outer cover may be made non-slip.
(Modifications of the apparatus to which the present configuration is applied)
In the above embodiment, a terminal device used together with a stationary game device was described as an example. However, the configuration of the operation device described in the present specification can be applied to any device that a user holds and uses. For example, the operation device may be implemented as an information terminal such as a portable game machine, a mobile phone, a smartphone, or an e-book reader.
The present invention has been described in detail above, but the foregoing description is in all respects illustrative and is not intended to limit the scope of the invention. Various modifications and variations are of course possible without departing from the scope of the invention.
(Industrial Applicability)
As described above, the present invention can be applied to, for example, an operation device (terminal device) of a game system, for the purpose of allowing a user to hold it easily.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is an external view of the game system 1.
Fig. 2 is a block diagram showing the internal structure of the game device 3.
Fig. 3 is a perspective view showing the external configuration of the controller 5.
Fig. 4 is a perspective view showing the external configuration of the controller 5.
Fig. 5 is a view showing the internal configuration of the controller 5.
Fig. 6 is a view showing the internal configuration of the controller 5.
Fig. 7 is a block diagram showing the configuration of the controller 5.
Fig. 8 is a view showing the external configuration of the terminal device 7.
Fig. 9 is a view showing the external configuration of the terminal device 7.
Fig. 10 is a view showing a state in which the user holds the terminal device 7 laterally.
Fig. 11 is a view showing a state in which the user holds the terminal device 7 laterally.
Fig. 12 is a view showing a state in which the user holds the terminal device 7 longitudinally.
Fig. 13 is a view showing a state in which the user holds the terminal device 7 longitudinally.
Fig. 14 is a block diagram showing the internal configuration of the terminal device 7.
Fig. 15 is a view showing an example in which an attachment device (input device 200) is attached to the terminal device 7.
Fig. 16 is a view showing an example in which an attachment device (input device 200) is attached to the terminal device 7.
Fig. 17 is a view showing another example of the input device.
Fig. 18 is a view showing a state in which the input device 220 shown in Fig. 17 is attached to the terminal device 7.
Fig. 19 is a view showing a state in which the input device 220 shown in Fig. 17 is attached to the terminal device 7.
Fig. 20 is a view showing another example in which an attachment device (bracket 210) is connected to the terminal device 7.
Fig. 21 is a diagram showing various data used in the game processing.
Fig. 22 is a main flowchart showing the flow of the game processing executed in the game device 3.
Fig. 23 is a flowchart showing the detailed flow of the game control process.
Fig. 24 is a view showing the screen of the television 2 and the terminal device 7 in the first game example.

Fig. 25 is a view showing the screen of the television 2 and the terminal device 7 in the second game example.
Fig. 26 is a view showing an example of a television game image displayed on the television 2 in the third game example.
Fig. 27 is a view showing an example of a terminal game image displayed on the terminal device 7 in the third game example.
Fig. 28 is a view showing an example of a television game image displayed on the television 2 in the fourth game example.
Fig. 29 is a view showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example.
Fig. 30 is a view showing how the game system 1 is used in the fifth game example.
Fig. 31 is a view showing the connection relationship between the devices included in the game system 1 when connected to an external device via a network.
Fig. 32 is a view showing the external configuration of a terminal device according to a modification of the above embodiment.
Fig. 33 is a view showing a state in which the user holds the terminal device shown in Fig. 32.
Fig. 34 is a view showing the external configuration of a terminal device according to another modification of the above embodiment.
Fig. 35 is a view showing the external configuration of a terminal device according to another modification of the above embodiment.
[Main component symbol description]

1 Game system
2 Television
2a Speaker
3 Game device
4 Optical disc
5 Controller
6 Marker device
6L Marker
6R Marker
7 Terminal device
8 Terminal device
9 Terminal device
10 CPU
11 System LSI
11a Input/output processor
11b GPU
11c DSP
11d VRAM
11e Internal main memory
12 External main memory
13 ROM/RTC
14 Disc drive
15 AV-IC
16 AV connector
17 Flash memory
18 Network communication module
19 Controller communication module
20 Expansion connector
21 Memory card connector
22 Antenna
23 Antenna
24 Power button
25 Reset button
26 Eject button

27 Codec LSI
28 Terminal communication module
29 Antenna
30 Substrate
31 Housing
31a Sound emission holes
32 Operation section
32a Cross key
32b 1st key
32c 2nd key
32d A key
32e Minus key
32f Home key
32g Plus key
32h Power key
32i B key
33 Connector
33a Locking hole
34a LED
34b LED
34c LED
34d LED
35 Imaging information computing unit
35a Light incident surface
36 Communication section
37 Acceleration sensor
38 Infrared filter
39 Lens
40 Imaging element
41 Image processing circuit
42 Microcomputer
43 Memory
44 Wireless module
45 Antenna
46 Vibrator
47 Speaker
48 Gyro sensor
50 Outer cover
50a Locking hole
50b Locking hole
50c Microphone hole
51 LCD
52 Touch panel
53 Analog rocker
53A Analog rocker
53B Analog rocker
54 Operation keys
54A to 54M Operation keys (buttons)

55 Marker portion
55A Marker
55B Marker
56 Camera
57 Speaker holes
58 Expansion connector
59 Eaves portion
59A Leg portion
59B Leg portion
62 Detector
63 Window
64 Rotary sensor
65a Hole
65b Hole
66 Charging terminal
67 Battery cover
69 Microphone
71 Touch panel controller
72 Magnetic sensor
73 Acceleration sensor
74 Gyro sensor
75 User interface controller
76 Codec LSI
77 Speaker
78 Sound IC
79 Microphone
80 Wireless module
81 Antenna
82 Infrared communication module
83 Flash memory
84 Power supply IC
85 Battery
86 Charger
87 CPU
88 Internal memory
90 Game program
91 Received data
92 Controller operation data
93 1st operation key data
94 1st acceleration data
95 1st angular velocity data
96 Marker coordinate data
97 Terminal operation data
98 2nd operation key data

99 Joystick data
100 Touch position data
101 2nd acceleration data
102 2nd angular velocity data
103 Orientation data
104 Camera image data
105 Microphone sound data
106 Processing data
107 Control data
108 Controller attitude data
109 Terminal attitude data
110 Image recognition data
111 Sound recognition data
121 Flight recording
122 Control surface
123 Target
124 Stylus
131 Cannon
132 Cannonball
133 Target
141 Batter (batter object)
142 Pitcher (pitcher object)
143 Cursor
151 Airplane (airplane object)
152 Cannon
153 Target
154 Crosshair
160 Player
161 Character
162 Golf club
162a Head
163 Ball
164 Image (head image)
190 Network
191 External device
200 Input device
200a 1st grip portion
200b 2nd grip portion
201 1st key
202 2nd key
203 3rd key
204 Rocker
205 Support portion
205a Claw portion
205b Claw portion
205c Claw portion
206 Connection member
207 4th key
208 Window portion
209 Connection portion
210 Bracket
211 Support member
211a Wall portion
211b Groove portion
212 Charging terminal
213a Guide member
213b Guide member
220 Input device
230 Convex portion
230a Convex portion
230b Convex portion

Claims

VII. Patent application scope:
1. An operation device for a user to operate, comprising: an outer cover having a substantially plate shape; a display portion provided on a surface side of the outer cover; and a protruding portion provided on a back side of the outer cover so as to project at least at left and right positions above the center of the outer cover.
2. The operation device according to claim 1, further comprising a first operation portion and a second operation portion provided above the center of the outer cover, on the left and right sides of the display portion, respectively.
3. The operation device according to claim 2, wherein the protruding portion is provided in a region including positions on the opposite side of the first operation portion and the second operation portion.
4. The operation device according to claim 2, further comprising: a fifth operation portion disposed on the surface on the front side of the outer cover, below the first operation portion; and a sixth operation portion disposed on the surface on the front side of the outer cover, below the second operation portion.
5. An operation device comprising: an outer cover having a substantially plate shape; a display portion provided on a surface side of the outer cover; a first operation portion and a second operation portion provided respectively on the left and right sides of the display portion; and a protruding portion provided on the back side of the outer cover at a position where a finger other than the thumbs can be hooked when the user holds the outer cover so that the first operation portion and the second operation portion can be operated with the thumbs of both hands.
6. The operation device according to claim 5, wherein the protruding portion is provided in a region including positions on the opposite side of the first operation portion and the second operation portion.
7. The operation device according to claim 5, further comprising: a fifth operation portion disposed on the surface on the front side of the outer cover, below the first operation portion; and a sixth operation portion disposed on the surface on the front side of the outer cover, below the second operation portion.
8. An operation device for a user to operate, comprising: an outer cover having a substantially plate shape; a display portion provided on a surface side of the outer cover; a protruding portion provided on the back side of the outer cover so as to project at least at left and right positions; and an operation portion provided on the upper surface of the protruding portion.
9. An operation device for a user to operate, comprising: an outer cover having a substantially plate shape; a display portion provided on a surface side of the outer cover; and grip portions having a convex cross section, provided on the left and right sides of the back surface of the outer cover and extending in the vertical direction.
10. The operation device according to claim 9, further comprising a protruding portion provided on the back surface of the outer cover so as to project, above the grip portions, at least at left and right positions.
11. An information processing device, being a flat-type information processing device, comprising: an outer cover having a substantially plate shape; a display portion provided on a surface side of the outer cover; and a protruding portion provided on the back side of the outer cover so as to project at least at left and right positions above the center of the outer cover.
12. The operation device according to claim 1 or 5, further comprising a third operation portion and a fourth operation portion provided respectively on the left and right of the upper surface of the protruding portion.
The operation device according to claim 1 or 5, wherein the protrusion has a dome shape extending left and right. The operating device according to claim 1 or 5, wherein a first locking hole that can be locked by an additional device different from the operating device is provided on the lower surface of the protruding portion. 15. The operation device according to claim 14, wherein a surface of the lower side of the outer cover is provided with a second locking hole that the attachment device can lock. The operating device according to claim 1 or 5, wherein a convex portion having a convex cross section is provided on a right and left sides of the back surface of the outer cover. 17. The operating device according to claim 16, wherein the protruding portion and the protruding portion are provided at intervals. 18. The operating device according to claim 1 or 5, further comprising: a grip portion provided on the left and right sides of the back surface of the outer cover. 19. The operating device according to the first, fifth, eighth or ninth aspect of the patent application, wherein the third operating portion of the upper side of the outer cover is provided on the left and right sides of the outer cover 8 operation department. 20. The operating device according to claim 1, 5, 8 or 9, wherein the touch panel provided on the screen of the display unit is provided. 21. The operating device of claim 1, wherein the inertial sensor is provided inside the outer cover. 22. The operating device according to claim 1, wherein the communication unit transmits the operation data showing the operation performed by the own machine to the game device in a wireless manner. And receiving image data transmitted from the game device; and a display control unit that displays the received image data on the display unit. 23. 
The operating device according to claim 1, wherein the game processing unit executes the game processing according to the operation of the own machine; and the display control unit performs the game processing according to the foregoing A game image is generated and displayed on the display unit. The operating device according to any one of claims 1, 5, 8 or 9, wherein the display unit has five or more pairs of faces. 4 323330
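Claim 22 above describes a controller that streams its operation data (key, stick, and touch inputs) to a game device and displays the image data it receives back. As a hedged sketch of what one frame of that round trip could look like (the message structure, field names, and JSON encoding are illustrative assumptions; the patent specifies no wire format):

```python
# Hypothetical sketch of the claim-22 data flow: the operation device packs
# one frame of operation data for wireless transmission, and the game device
# unpacks it. Field names and the JSON encoding are assumptions.
import json

def encode_operation_data(keys_pressed, stick_xy, touch_xy):
    """Operation-device side: pack one frame of operation data."""
    return json.dumps({
        "keys": sorted(keys_pressed),   # e.g. ["54A", "54K"]
        "stick": stick_xy,              # analog rocker position
        "touch": touch_xy,              # touch panel position, or None
    }).encode("utf-8")

def decode_operation_data(payload):
    """Game-device side: recover the operation data frame."""
    return json.loads(payload.decode("utf-8"))
```

In the claimed architecture the game device would process such frames, render a game image, and send compressed image data back for the display control unit to show on the LCD.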
TW100126152A 2010-11-01 2011-07-25 Controller device and information processing device TWI442963B (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
JP2010245299A JP4798809B1 (en) 2010-11-01 2010-11-01 Display device, game system, and game processing method
JP2010245298 2010-11-01
JP2011092506 2011-04-18
JP2011092612A JP6103677B2 (en) 2010-11-01 2011-04-19 Game system, operation device, and game processing method
JP2011102834A JP5837325B2 (en) 2010-11-01 2011-05-02 Operating device and operating system
JP2011103704A JP6005907B2 (en) 2010-11-01 2011-05-06 Operating device and operating system
JP2011103705 2011-05-06
JP2011103706A JP6005908B2 (en) 2010-11-01 2011-05-06 Equipment support system and support device
JP2011118488A JP5936315B2 (en) 2010-11-01 2011-05-26 Information processing system and information processing apparatus

Publications (2)

Publication Number Publication Date
TW201220109A true TW201220109A (en) 2012-05-16
TWI442963B TWI442963B (en) 2014-07-01

Family

ID=46518614

Family Applications (2)

Application Number Title Priority Date Filing Date
TW100126151A TWI440496B (en) 2010-11-01 2011-07-25 Controller device and controller system
TW100126152A TWI442963B (en) 2010-11-01 2011-07-25 Controller device and information processing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
TW100126151A TWI440496B (en) 2010-11-01 2011-07-25 Controller device and controller system

Country Status (3)

Country Link
CN (7) CN202355829U (en)
AU (2) AU2011213765B2 (en)
TW (2) TWI440496B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI502355B (en) * 2012-06-01 2015-10-01 Nvidia Corp Methodology for using smartphone and mobile computer in a mobile compute environment

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8913009B2 (en) 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
AU2011204815B2 (en) 2010-02-03 2013-05-30 Nintendo Co., Ltd. Game system, controller device, and game process method
US8339364B2 (en) 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US8814686B2 (en) 2010-02-03 2014-08-26 Nintendo Co., Ltd. Display device, game system, and game method
JP6243586B2 (en) 2010-08-06 2017-12-06 任天堂株式会社 Game system, game device, game program, and game processing method
US10150033B2 (en) 2010-08-20 2018-12-11 Nintendo Co., Ltd. Position calculation system, position calculation device, storage medium storing position calculation program, and position calculation method
JP5840385B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 Game system, game device, game program, and game processing method
JP5840386B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 Game system, game device, game program, and game processing method
KR101364826B1 (en) 2010-11-01 2014-02-20 닌텐도가부시키가이샤 Operating apparatus and operating system
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system
JP5689014B2 (en) 2011-04-07 2015-03-25 任天堂株式会社 Input system, information processing apparatus, information processing program, and three-dimensional position calculation method
TWI487554B (en) * 2013-02-06 2015-06-11 Univ Southern Taiwan Sci & Tec Game machine control method
CN105302232A (en) 2014-06-04 2016-02-03 振桦电子股份有限公司 Tablet computer having detachable handle
JP6341568B2 (en) * 2014-08-05 2018-06-13 アルプス電気株式会社 Coordinate input device
TWI645314B (en) * 2016-10-06 2018-12-21 宏達國際電子股份有限公司 System and method for detecting hand gesture
US20180188816A1 (en) * 2017-01-04 2018-07-05 Htc Corporation Controller for finger gesture recognition and method for recognizing finger gesture
CN108031111A (en) * 2017-12-29 2018-05-15 安徽科创智慧知识产权服务有限公司 Have wireless and wired connection handle system concurrently

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69731903T2 (en) * 1996-03-05 2005-10-06 Sega Enterprises, Ltd. Controller and expansion unit for it
US6966837B1 (en) * 2001-05-10 2005-11-22 Best Robert M Linked portable and video game systems
AU2002354677A1 (en) * 2001-07-12 2003-01-29 Gary L. Friedman Portable, hand-held electronic input device and combination with a personal digital device
US6773349B2 (en) * 2002-07-31 2004-08-10 Intec, Inc. Video game controller with integrated video display
US20060252537A1 (en) * 2005-04-21 2006-11-09 Wen-An Wu Portable wireless control apparatus
JP4778267B2 (en) * 2005-05-16 2011-09-21 任天堂株式会社 Game machine operating device and portable game machine
TWM278452U (en) * 2005-06-03 2005-10-21 Weistech Technology Co Ltd Game controlling handle having a display device
GB2493606B (en) * 2008-03-07 2013-03-27 Milwaukee Electric Tool Corp Visual inspection device
US8384680B2 (en) * 2008-12-23 2013-02-26 Research In Motion Limited Portable electronic device and method of control
CN201572520U (en) * 2009-12-23 2010-09-08 周建正 Three-in-one support for game consoles
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system


Also Published As

Publication number Publication date
CN102600614A (en) 2012-07-25
CN202398092U (en) 2012-08-29
CN102600612A (en) 2012-07-25
AU2011213764B2 (en) 2013-10-24
CN102600611B (en) 2015-03-11
TWI442963B (en) 2014-07-01
AU2011213764A1 (en) 2012-05-17
CN102600612B (en) 2015-12-02
CN202355827U (en) 2012-08-01
CN202355829U (en) 2012-08-01
TWI440496B (en) 2014-06-11
CN102600614B (en) 2015-11-25
TW201219093A (en) 2012-05-16
AU2011213765B2 (en) 2013-07-11
CN102600611A (en) 2012-07-25
AU2011213765A1 (en) 2012-05-17
CN202398095U (en) 2012-08-29
