TWI442963B - Controller device and information processing device

Info

Publication number: TWI442963B
Application number: TW100126152A
Authority: TW (Taiwan)
Prior art keywords: game, operation, terminal device, device, provided
Other languages: Chinese (zh)
Other versions: TW201220109A (en)
Inventors: Ken-Ichirou Ashida, Yositomo Gotou, Takanori Okamura, Junji Takamoto, Masato Ibuki, Shinji Yamamoto, Hitoshi Tsuchiya, Fumiyoshi Suetake, Akiko Suga, Naoya Yamamoto, Daisuke Kumazaki
Original assignee: Nintendo Co Ltd
Priority applications: JP2010245298; JP2010245299A (granted as JP4798809B1); JP2011092506; JP2011092612A (granted as JP6103677B2); JP2011102834A (granted as JP5837325B2); JP2011103706A (granted as JP6005908B2); JP2011103704A (granted as JP6005907B2); JP2011103705; JP2011118488A (granted as JP5936315B2)
Application filed by Nintendo Co Ltd
Published as TW201220109A; granted as TWI442963B

Description

Operating device and information processing device

The present invention relates to an operating device that can be gripped by a player.

Hand-held operating devices have existed for some time. For example, the portable game device described in the specification of Japanese Patent No. 3,703,473 is of a folding type, with operation keys provided on the lower housing. With this game device, the user can perform game operations using the operation keys provided on both sides of the screen while viewing the screen, and can easily perform game operations while holding the device.

In recent years, the screens and other components of portable terminal devices (operating devices) have grown larger, and the devices themselves have become correspondingly larger. When a device that the user holds in the hand becomes large, it may become difficult to hold.

Accordingly, it is an object of the present invention to provide an operating device that can be easily held by a user.

In order to solve the above problems, the present invention adopts the following configurations (1) to (21).

(1)

An example of the present invention is an operating device for a user to operate. The operating device includes a cover having a substantially plate shape, a display portion, and a protruding portion. The display portion is provided on the surface side of the cover. The protruding portion is provided on the back side of the cover, above the center of the cover, at least at positions on the left and right sides.

The "operating portion" may be any device that can be operated by the user. For example, a rocker (analog rocker), a button (key), a touch panel, a touch pad (touch pad) of an embodiment to be described later. )Wait.

The above-mentioned "positions on the left and right sides" mean that the protrusions are provided on the left and right sides of the center of the cover in the left-right direction, and the protrusions may be provided at the left and right ends, or closer to the left and right ends. The positions of the centers are respectively provided with protrusions.

According to the configuration of (1) above, since the protruding portion is provided on the back side of the cover, when the user holds the cover on the left and right of the display portion, the fingers can be hooked on the protruding portion, and the operating device can be held easily. Further, since the protruding portion is provided on the upper side of the cover, when the user holds the cover with the index finger, middle finger, or ring finger resting against the lower surface of the protruding portion, the cover can also be supported by the palm (see FIGS. 10 and 11), and the operating device can be held firmly. Therefore, according to the configuration of (1) above, it is possible to provide an operating device that the user can hold easily.

(2)

The operation device may include a first operation portion and a second operation portion provided to the left and right of the display portion, above the center of the cover.

According to the configuration of (2) above, since the operation portions are provided to the left and right of the display portion, the user can easily operate them with the thumbs, for example, while holding the cover on the left and right of the display portion. Therefore, according to the configuration of (2) above, it is possible to provide an operating device that the user can hold easily and operate easily.

(3)

Another example of the present invention is an operating device including a cover having a substantially plate shape, a display portion, a first operation portion, a second operation portion, and a protruding portion. The display portion is provided on the surface side of the cover. The first operation portion and the second operation portion are provided to the left and right of the display portion, respectively. The protruding portion is provided on the back side of the cover at a position where fingers other than the thumbs can be hooked on it when the user grips the cover so as to be able to operate the first operation portion and the second operation portion with the thumbs of both hands.

According to the configuration of (3) above, since the protruding portion is provided on the back side of the cover, when the user holds the cover on the left and right of the display portion, fingers other than the thumbs can be hooked on the protruding portion, and the operating device can be held easily (see FIGS. 10 and 11). Further, since the operation portions are provided to the left and right of the display portion, the user can easily operate them with the thumbs while holding the cover. Therefore, according to the configuration of (3) above, it is possible to provide an operating device that the user can hold easily and operate easily.

(4)

The protruding portion may be provided in a region including positions on the opposite side of the first operation portion and the second operation portion.

The "opposite side position" is not limited to the state in which the position of the operation portion and the protrusion portion are coincident, but is also included in the cover when the area on which the operation portion is provided is projected on the back side of the surface of the cover. The back side is provided with a region of the protrusion and a state partially overlapping the projected region.

According to the configuration of (4) above, when operating each operation portion, the user can support the terminal device 7 by holding the protruding portion with the index finger, middle finger, or ring finger (see FIGS. 10 and 11). Thereby, the terminal device 7 can be held easily, and each operation portion can be operated easily.

(5)

The operation device may include a third operation portion and a fourth operation portion provided to the left and right on the upper surface of the protruding portion.

According to the configuration of (5) above, the user can operate the third operation portion and the fourth operation portion with, for example, the index fingers or middle fingers while holding the cover on the left and right of the display portion. That is, more operations can be performed in that state, and an operating device with better operability can be provided. Further, the user can hold the operating device so as to sandwich it from above and below, making it even easier to hold.

(6)

The protruding portion may have an eaves-like shape extending in the left-right direction.

According to the configuration of (6) above, the user can hold the operating device with fingers supporting the protruding portion along its lower surface, making the device easier to hold. Further, since the protruding portion extends in the left-right direction, when the user holds the operating device with the protruding portion running vertically, fingers other than the thumb can be hooked on the protruding portion regardless of where along the side the device is gripped. Therefore, even when holding the operating device with the protruding portion vertical, the user can grip it firmly.

(7)

A first locking hole to which an additional device separate from the operating device can be locked may be provided on the lower surface of the protruding portion.

According to the configuration of (7) above, the operating device and the additional device can be connected firmly using the first locking hole. When the configurations of (6) and (7) above are combined, the first locking hole can be provided near the center of the operating device in the left-right direction, so the additional device can be connected stably while the left-right balance is kept even.

(8)

A second locking hole to which the additional device can be locked may be provided on the lower surface of the cover.

According to the configuration of the above (8), since the operation device and the attachment device can be connected by using the first locking hole and the second locking hole provided at different positions, the connection can be made stronger.

(9)

The operation device may be provided with convex portions having a convex cross section on the left and right sides of the back surface of the cover, below the protruding portion.

According to the configuration of the above (9), the user can hold the cover by hooking the finger (for example, the ring finger or the little finger) to the convex portion, so that the operation device can be gripped tightly.

(10)

The convex portion and the protruding portion may be provided with an interval between them.

According to the configuration of (10) above, the user can support the protruding portion with the middle finger or ring finger without the convex portion getting in the way, and can hook another finger on the convex portion to grip the operating device. This makes the operating device even easier to hold.

(11)

The operation device may include a grip portion provided on the right and left sides on the back surface of the cover.

According to the configuration of the above (11), the user can hook the finger (for example, the ring finger or the little finger) to the grip portion to grip the cover, so that the operating device can be gripped tightly.

(12)

The operation device may include a fifth operation portion and a sixth operation portion. The fifth operation portion is disposed on the surface side of the cover, below the first operation portion. The sixth operation portion is disposed on the surface side of the cover, below the second operation portion.

According to the configuration of the above (12), an operation device can be used to perform a wider variety of operations. Further, even when the fifth operation portion and the sixth operation portion are operated, the user can hold the operation device tightly, so that an operation device excellent in operability can be provided.

(13)

Another example of the present invention is an operating device including a cover having a substantially plate shape, a display portion, a protruding portion, and an operation portion. The display portion is provided on the surface side of the cover. The protruding portion is provided on the back side of the cover, at least at positions on the left and right sides. The operation portion is provided on the upper surface of the protruding portion.

According to the configuration of (13) above, since the protruding portion is provided on the back side of the cover, when the user holds the cover on the left and right of the display portion, fingers can be hooked on the protruding portion, and the operating device can be held easily (see FIGS. 10 and 11). Further, since the operation portion is provided on the upper surface of the protruding portion, it can be operated easily while the fingers are hooked on the protruding portion. At this time, the user can hold the operating device so as to sandwich the protruding portion from above and below, making it even easier to grip. As described above, according to the configuration of (13) above, it is possible to provide an operating device that the user can hold easily and operate easily.

(14)

Another example of the present invention is an operating device for a user to operate. The operating device includes a cover having a substantially plate shape, a display portion, and grip portions. The display portion is provided on the surface side of the cover. The grip portions are provided on the left and right sides of the back surface of the cover so as to extend in the up-down direction, and have a convex cross section.

According to the configuration of the above (14), the user can hold the cover by hooking the finger (for example, the ring finger or the little finger) to the grip portion, so that the operation device can be gripped tightly. Therefore, according to the configuration of the above (14), an operation device which can be easily held by the user can be provided.

(15)

The operation device may include a protruding portion provided on the back side of the cover, above the grip portion, at least at positions on the left and right sides.

According to the configuration of (15) above, since the protruding portion is provided on the back side of the cover, when the user holds the cover on the left and right of the display portion, fingers can be hooked on the protruding portion, so the operating device can be held easily and gripped firmly.

(16)

The operation device may include a seventh operation portion and an eighth operation portion provided to the left and right on the upper surface of the cover.

According to the configuration of (16) above, more varied operations can be performed with the operating device. Further, since the operation portions are disposed on the upper surface of the cover, the user can grip the operating device firmly by wrapping the hand around the cover from its front, top, and back sides.

(17)

The operation device may include a touch panel provided on a screen of the display unit.

According to the configuration of (17) above, the user can operate on the image displayed on the display portion more intuitively and easily using the touch panel. Further, when the operating device is placed with the display portion facing upward, the protruding portion leaves it slightly inclined, which makes the touch panel easier to operate while the operating device is set down.

(18)

The operating device can be equipped with an inertial sensor inside the housing.

According to the configuration of (18) above, operations can be performed by swinging or moving the operating device itself, and the user can operate it more intuitively and easily. Further, since this configuration assumes that the operating device is moved during use, firmly connecting the operating device and the additional device becomes very important when an additional device is connected. Therefore, in the configuration of (18) above, the configuration of (7) or (8) above is particularly effective for connecting the operating device and the additional device firmly.

(19)

The operation device may include a communication unit and a display control unit. The communication unit wirelessly transmits operation data representing operations performed on the device itself to the game device, and receives image data transmitted from the game device. The display control unit displays the received image data on the display unit.

According to the configuration of the above (19), the user can perform the game operation using the operation device that is easy to grip and has good operability. Further, since the image transmitted from the game device is displayed on the display unit, the user can perform the game operation while viewing the image displayed on the display unit of the operation device.

(20)

The operation device may include a game processing unit and a display control unit. The game processing unit executes game processing in accordance with an operation on its own machine. The display control unit generates a game image based on the result of the game processing and displays it on the display unit.

According to the configuration of the above (20), the portable game device can be easily held, and the operability is good.

(21)

The display unit may have a screen of 5 inches or more.

According to the configuration of (21) above, an image that is easy to view and visually impressive can be displayed on the large screen. When a display unit with such a large screen is used, the operating device itself is inevitably large, so the configurations of (1) to (20) above, which make the device easy for the user to grip, are particularly effective.

Further, another example of the present invention may be a tablet-type information processing device including the components (the cover, display portion, protruding portion, and the like) of each of the configurations (1) to (21) above.

According to the present invention, the display portion is provided on the surface side of the cover, and the protruding portion is provided on the back side of the cover, above the center of the cover and at least at positions on the left and right sides, whereby the user can hold the operating device easily.

The above and other objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.

[1. Overall composition of the game system]

The game system 1 according to an embodiment of the present invention will be described below with reference to the drawings. FIG. 1 is an external view of the game system 1. As shown in FIG. 1, the game system 1 includes a stationary display device typified by a television receiver or the like (hereinafter referred to as "television") 2, a stationary game device 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. The game system 1 executes game processing in the game device 3 in accordance with game operations using the controller 5, and displays the game image obtained by the game processing on the television 2 and/or the terminal device 7.

In the game device 3, an optical disc 4, which is an example of an exchangeable information storage medium, is detachably inserted. The optical disc 4 stores an information processing program (typically a game program) to be executed in the game device 3. An insertion slot for the optical disc 4 is provided on the front of the game device 3. The game device 3 executes game processing by reading and executing the information processing program stored on the optical disc 4 inserted into the insertion slot.

The television 2 is connected to the game device 3 via a connection cable. The television 2 displays the game image obtained by the game processing executed in the game device 3. The television 2 has a speaker 2a (FIG. 2), and the speaker 2a outputs the game sound obtained as a result of the game processing. In other embodiments, the game device 3 and the stationary display device may be integrated. Further, communication between the game device 3 and the television 2 may be wireless.

A marker device 6 is provided around the screen of the television 2 (above the screen in FIG. 1). As will be described in detail later, the user (player) can perform game operations by moving the controller 5, and the marker device 6 is used by the game device 3 to calculate the motion, position, posture, and the like of the controller 5. The marker device 6 has two markers 6R and 6L at its two ends. The marker 6R (and likewise the marker 6L) specifically comprises one or more infrared LEDs (light emitting diodes), and outputs infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, and the game device 3 can control the lighting of each infrared LED provided in the marker device 6. The marker device 6 is portable, and the user can set it up at any position. FIG. 1 shows a form in which the marker device 6 is placed on top of the television 2, but the position and orientation in which the marker device 6 is set can be arbitrary.

The controller 5 transmits operation data representing the content of operations performed on it to the game device 3. The controller 5 and the game device 3 can communicate by wireless communication. In the present embodiment, wireless communication between the controller 5 and the game device 3 uses, for example, Bluetooth (registered trademark) technology. In other embodiments, the controller 5 and the game device 3 may be connected by wire. Further, in the present embodiment, the game system 1 includes one controller 5, but the game device 3 can communicate with a plurality of controllers 5, and a plurality of users can play the game by using a predetermined number of controllers simultaneously. The detailed configuration of the controller 5 will be described later.

The terminal device 7 is of a size that can be gripped by the user, and the user can move the terminal device 7 by hand or place it at any position. The terminal device 7, whose detailed configuration will be described later, includes an LCD (Liquid Crystal Display) 51 as display means, and input means (a touch panel 52 and a gyro sensor 74, described later). The terminal device 7 and the game device 3 can communicate wirelessly (or by wire). The terminal device 7 receives data of images (for example, game images) generated in the game device 3 from the game device 3, and displays the images on the LCD 51. In the present embodiment, an LCD is used as the display device, but the terminal device 7 may have any other display device, such as one using EL (Electro Luminescence). Further, the terminal device 7 transmits operation data representing the content of operations performed on it to the game device 3.

[2. Internal structure of game device 3]

Next, the internal configuration of the game device 3 will be described with reference to Fig. 2 . Fig. 2 is a block diagram showing the internal structure of the game device 3. The game device 3 includes a CPU (Central Processing Unit) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, an optical disk drive 14, and an AV-IC 15.

The CPU 10 performs game processing by executing the game program stored on the optical disc 4, and functions as a game processor. The CPU 10 is connected to the system LSI 11. In addition to the CPU 10, the external main memory 12, the ROM/RTC 13, the optical disc drive 14, and the AV-IC 15 are connected to the system LSI 11. The system LSI 11 performs processing such as controlling data transfer between the components connected to it, generating images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 will be described later. The volatile external main memory 12 stores programs such as the game program read from the optical disc 4 or a game program read from flash memory, and various data, and is used as a work area or buffer area for the CPU 10. The ROM/RTC 13 has a ROM (a so-called boot ROM) in which a program for starting up the game device 3 is written, and a clock circuit (RTC: Real Time Clock) for counting time. The optical disc drive 14 reads program data, texture data, and the like from the optical disc 4, and writes the read data into the internal main memory 11e, described later, or the external main memory 12.

The system LSI 11 is provided with an input/output processor (I/O processor) 11a, a GPU (Graphics Processor Unit) 11b, a DSP (Digital Signal Processor) 11c, a VRAM (Video RAM) 11d, and the internal main memory 11e. Although not shown in the drawings, these components 11a to 11e are connected to one another by an internal bus.

The GPU 11b forms part of the drawing means and generates images in accordance with graphics commands (drawing instructions) from the CPU 10. The VRAM 11d stores data (polygon data, texture data, and the like) needed for the GPU 11b to execute graphics commands. When an image is generated, the GPU 11b creates the image data using the data stored in the VRAM 11d. In the present embodiment, the game device 3 generates both the game image displayed on the television 2 and the game image displayed on the terminal device 7. Hereinafter, the game image displayed on the television 2 is referred to as the "game image for television", and the game image displayed on the terminal device 7 as the "game image for the terminal".

The DSP 11c functions as an audio processor, and generates sound data using sound data and sound waveform (tone) data stored in the internal main memory 11e or the external main memory 12. In the present embodiment, as with the game images, both the game sound output from the speaker of the television 2 and the game sound output from the speaker of the terminal device 7 are generated. Hereinafter, the game sound output from the television 2 is referred to as the "game sound for television", and the game sound output from the terminal device 7 as the "game sound for the terminal".

Of the images and sounds generated in the game device 3 as described above, the image data and sound data to be output to the television 2 are read by the AV-IC 15. The AV-IC 15 outputs the read image data to the television 2 via the AV connector 16, and outputs the read sound data to the speaker 2a built into the television 2. Thereby, the image is displayed on the television 2, and the sound is output from the speaker 2a.

Of the images and sounds generated in the game device 3, the image data and sound data to be output to the terminal device 7 are transmitted to the terminal device 7 by the input/output processor 11a and other components. The method of transmitting data to the terminal device 7 by the input/output processor 11a and other components will be described later.

The input/output processor 11a exchanges data with the components connected to it, and downloads data from external devices. The input/output processor 11a is connected to the flash memory 17, the network communication module 18, the controller communication module 19, the expansion connector 20, the memory card connector 21, and the codec LSI 27. The network communication module 18 is connected to the antenna 22. The controller communication module 19 is connected to the antenna 23. The codec LSI 27 is connected to the terminal communication module 28, and the terminal communication module 28 is connected to the antenna 29.

The game device 3 can connect to a network such as the Internet and communicate with external information processing devices (for example, other game devices or various servers). That is, the input/output processor 11a can connect to a network such as the Internet via the network communication module 18 and the antenna 22, and communicate with external information processing devices connected to the network. The input/output processor 11a periodically accesses the flash memory 17 to detect whether there is data that needs to be transmitted to the network; when such data exists, it is transmitted to the network via the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from external information processing devices and data downloaded from a download server via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. By executing the game program, the CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. In the flash memory 17, in addition to data exchanged between the game device 3 and external information processing devices, save data of games played using the game device 3 (result data or in-progress data of games) may be stored. A game program may also be stored in the flash memory 17.

Further, the game device 3 can receive operation data from the controller 5. That is, the input/output processor 11a receives the operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19, and stores (temporarily stores) it in a buffer area of the internal main memory 11e or the external main memory 12.

Further, the game device 3 can exchange data such as images and sounds with the terminal device 7. When transmitting a game image (the game image for the terminal) to the terminal device 7, the input/output processor 11a outputs the game image data generated by the GPU 11b to the codec LSI 27. The codec LSI 27 performs predetermined compression processing on the image data from the input/output processor 11a. The terminal communication module 28 communicates wirelessly with the terminal device 7; thus, the image data compressed by the codec LSI 27 is transmitted by the terminal communication module 28 to the terminal device 7 via the antenna 29. In the present embodiment, the image data transmitted from the game device 3 to the terminal device 7 is used in a game, and if display of the image in the game is delayed, the operability of the game is adversely affected. Therefore, it is preferable that the transfer of image data from the game device 3 to the terminal device 7 avoids delay as much as possible. Accordingly, in the present embodiment, the codec LSI 27 compresses the image data using a high-efficiency compression technique such as the H.264 standard. Other compression techniques may be used, and when the communication speed is sufficient, the image data may be transmitted uncompressed. The terminal communication module 28 is, for example, a communication module that has received Wi-Fi certification, and may communicate wirelessly with the terminal device 7 at high speed using, for example, the MIMO (Multiple Input Multiple Output) technology adopted by the IEEE 802.11n standard, or may use another communication method.
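
To make the data path above concrete, the following sketch mirrors it in C under stated assumptions: capture_game_frame, h264_encode, and wifi_send are hypothetical stand-ins (stubbed so the sketch compiles) for the GPU 11b output, the codec LSI 27, and the terminal communication module 28; nothing here is the device's actual firmware.

```c
/* Minimal sketch of the game-device-side streaming path described above.
 * All helper functions are hypothetical placeholders, stubbed so the
 * sketch compiles. */
#include <stddef.h>
#include <stdint.h>

typedef struct { const uint8_t *data; size_t size; } Buffer;

static Buffer capture_game_frame(void)  /* stand-in for the GPU 11b output */
{
    static uint8_t px[4];
    Buffer b = { px, sizeof px };
    return b;
}

static Buffer h264_encode(Buffer frame) /* stand-in for the codec LSI 27 */
{
    return frame; /* a real codec would compress here */
}

static void wifi_send(Buffer packet)    /* stand-in for module 28 / antenna 29 */
{
    (void)packet;
}

int main(void)
{
    for (int i = 0; i < 3; i++) {            /* one pass per terminal-bound frame */
        Buffer frame  = capture_game_frame();
        Buffer packet = h264_encode(frame);  /* compress to keep latency low */
        wifi_send(packet);                   /* uncompressed is possible on a fast link */
    }
    return 0;
}
```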

Further, the game device 3 transmits sound data to the terminal device 7 in addition to the image data. That is, the input/output processor 11a outputs the sound data generated by the DSP 11c to the terminal communication module 28 via the codec LSI 27. The codec LSI 27 performs compression processing on the sound data as it does on the image data. Any method may be used to compress the sound data, but a method with a high compression ratio and little sound degradation is preferable. In other embodiments, the sound data may be transmitted uncompressed. The terminal communication module 28 transmits the compressed image data and sound data to the terminal device 7 via the antenna 29.

Further, in addition to the image data and the sound data, the game device 3 transmits various control data to the terminal device 7 as necessary. The control data are data representing control instructions to the components of the terminal device 7, for example, an instruction to control the lighting of the marker section (the marker section 55 shown in FIG. 14) or an instruction to control imaging by the camera (the camera 56 shown in FIG. 14). The input/output processor 11a transmits control data to the terminal device 7 in accordance with instructions from the CPU 10. In the present embodiment, the codec LSI 27 does not compress this control data, but in other embodiments compression may be applied. The data transmitted from the game device 3 to the terminal device 7 may or may not be encrypted, as necessary.

Further, the game device 3 can receive various data from the terminal device 7. In the present embodiment, as detailed later, the terminal device 7 transmits operation data, image data, and sound data. The data transmitted from the terminal device 7 are received by the terminal communication module 28 via the antenna 29. Here, the image data and sound data from the terminal device 7 have been subjected to the same kind of compression processing as that applied to the image data and sound data sent from the game device 3 to the terminal device 7. Therefore, these image data and sound data are sent from the terminal communication module 28 to the codec LSI 27, decompressed by the codec LSI 27, and output to the input/output processor 11a. On the other hand, the operation data from the terminal device 7 is smaller in volume than images or sounds, so compression need not be applied to it; it likewise may or may not be encrypted, as necessary. The operation data is therefore received by the terminal communication module 28 and then output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily stores) the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.

Further, the game device 3 can connect to other devices and to external storage media. That is, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI. A medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector can be connected to the expansion connector 20; a wired communication connector enables communication with a network in place of the network communication module 18. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card. For example, the input/output processor 11a can access an external storage medium via the expansion connector 20 or the memory card connector 21 to save data to it or read data from it.

The game device 3 is provided with a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned on, power is supplied from an external power source to each component of the game device 3 via an AC adapter (not shown). When the reset button 25 is pressed, the system LSI 11 restarts the boot program of the game device 3. The eject button 26 is connected to the optical disc drive 14. When the eject button 26 is pressed, the optical disc 4 is ejected from the optical disc drive 14.

In other embodiments, some of the components of the game device 3 may be configured as an expansion device separate from the game device 3. In that case, the expansion device can be connected to the game device 3 via the expansion connector 20, for example. Specifically, the expansion device may include the components of the codec LSI 27, the terminal communication module 28, and the antenna 29, and be attachable to and detachable from the expansion connector 20. Thereby, a game device that does not have these components can be made able to communicate with the terminal device 7 by connecting the expansion device to it.

[3. Configuration of controller 5]

Next, the controller 5 will be described with reference to FIGS. 3 to 7. FIGS. 3 and 4 are perspective views showing the appearance of the controller 5: FIG. 3 is seen from the upper rear side of the controller 5, and FIG. 4 from the lower front side.

In FIGS. 3 and 4, the controller 5 has an outer cover 31 formed, for example, by plastic molding. The outer cover 31 has a substantially rectangular parallelepiped shape whose longitudinal direction is the front-rear direction (the Z-axis direction shown in FIG. 3), and as a whole is of a size that can be gripped by one hand of an adult or a child. The user can perform game operations by pressing the keys provided on the controller 5, and by moving the controller 5 itself to change its position or posture (inclination).

A plurality of operation keys are provided on the outer cover 31. As shown in FIG. 3, a cross key 32a, a 1 key 32b, a 2 key 32c, an A key 32d, a minus key 32e, a home key 32f, a plus key 32g, and a power key 32h are provided on the upper surface of the outer cover 31. In the present specification, the upper surface of the outer cover 31 on which these keys 32a to 32h are provided is called the "key surface". On the other hand, as shown in FIG. 4, a recess is formed in the lower surface of the outer cover 31, and a B key 32i is provided on the rearward inclined surface of the recess. The operation keys 32a to 32i are assigned functions as appropriate by the information processing program executed by the game device 3. The power key 32h is for remotely turning the power of the main body of the game device 3 on and off. The top surfaces of the home key 32f and the power key 32h are recessed into the upper surface of the outer cover 31. This prevents the user from pressing the home key 32f or the power key 32h by mistake.

A connector 33 is provided on the rear surface of the outer cover 31. The connector 33 is used for connecting other devices (for example, other sensor units or controllers) to the controller 5. On both sides of the connector 33 on the rear surface of the outer cover 31, locking holes 33a are provided to keep such other devices from detaching easily.

A plurality of LEDs 34a to 34d (four in FIG. 3) are provided at the rear of the upper surface of the outer cover 31. Here, the controller 5 is assigned a controller type (number) to distinguish it from other controllers 5. The LEDs 34a to 34d are used to notify the user of the controller type currently set for the controller 5, or of the remaining battery charge of the controller 5. Specifically, when game operations are performed with the controller 5, one of the LEDs 34a to 34d is lit according to the controller type.

Further, the controller 5 includes an imaging information computing unit 35 (FIG. 6). As shown in FIG. 4, a light incident surface 35a of the imaging information computing unit 35 is provided on the front surface of the cover 31. The light incident surface 35a is made of a material that allows at least infrared rays from the markers 6R and 6L to penetrate.

A sound emission hole 31a for emitting sound from a speaker 47 (FIG. 5) built into the controller 5 is formed in the upper surface of the outer cover 31 between the 1 key 32b and the home key 32f.

Next, the internal structure of the controller 5 will be described with reference to FIGS. 5 and 6. 5 and 6 are views showing the internal structure of the controller 5. Fig. 5 is a perspective view showing a state in which the upper casing (part of the outer cover 31) of the controller 5 is removed. Fig. 6 is a perspective view showing a state in which the lower casing (part of the outer casing 31) of the controller 5 is removed. The perspective view shown in Fig. 6 is a perspective view of the substrate 30 shown in Fig. 5 as viewed from the back.

In FIG. 5, a substrate 30 is fixed inside the outer cover 31, and the operation keys 32a to 32h, the LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, the speaker 47, and the like are provided on the upper main surface of the substrate 30. These are connected to a microcomputer 42 (see FIG. 6) by wiring (not shown) formed on the substrate 30 and the like. In the present embodiment, the acceleration sensor 37 is disposed at a position offset from the center of the controller 5 in the X-axis direction. This makes the motion of the controller 5 easier to calculate when the controller 5 is rotated about the Z axis. The acceleration sensor 37 is also disposed forward of the center of the controller 5 in the longitudinal direction (Z-axis direction). Further, a wireless module 44 (FIG. 6) and the antenna 45 give the controller 5 the function of a wireless controller.

On the other hand, in FIG. 6, the imaging information calculation unit 35 is provided on the front end edge of the lower main surface of the substrate 30. The imaging information computing unit 35 is provided with an infrared filter 38, a lens 39, an imaging element 40, and an image processing circuit 41 in this order from the front of the controller 5. These members 38 to 41 are attached to the lower main faces of the substrate 30, respectively.

Further, the microcomputer 42 and a vibrator 46 are provided on the lower main surface of the substrate 30. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 by wiring formed on the substrate 30 and the like. The vibrator 46 operates on instructions from the microcomputer 42, whereby the controller 5 vibrates. The vibration is transmitted to the hand of the user holding the controller 5, realizing a so-called vibration-enabled game. In the present embodiment, the vibrator 46 is disposed somewhat toward the front in the outer cover 31; by placing the vibrator 46 closer to the end than to the center of the controller 5, its vibration can shake the entire controller 5 strongly. The connector 33 is mounted on the rear edge of the lower main surface of the substrate 30. In addition to what is shown in FIGS. 5 and 6, the controller 5 includes a crystal oscillator that generates the basic clock of the microcomputer 42, an amplifier that outputs audio signals to the speaker 47, and the like.

The shape of the controller 5 shown in FIGS. 3 to 6, the shapes of the operation keys, and the numbers and mounting positions of the acceleration sensors and vibrators are merely examples; other shapes, numbers, and mounting positions may be used. Further, in the present embodiment, the imaging direction of the imaging means is the Z-axis positive direction, but the imaging direction may be any direction. That is, the position of the imaging information computing unit 35 in the controller 5 (the light incident surface 35a of the imaging information computing unit 35) need not be on the front surface of the outer cover 31, and may be provided on another surface as long as light can be taken in from outside the outer cover 31.

FIG. 7 is a block diagram showing the configuration of the controller 5. The controller 5 includes the operation unit 32 (the operation keys 32a to 32i), the imaging information computing unit 35, a communication unit 36, the acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits data representing the content of operations performed on it to the game device 3 as operation data. Hereinafter, the operation data transmitted by the controller 5 is called "controller operation data", and the operation data transmitted by the terminal device 7 is called "terminal operation data".

The operation unit 32 includes the operation keys 32a to 32i described above, and outputs operation key data representing the input state of each of the operation keys 32a to 32i (whether each key is pressed) to the microcomputer 42 of the communication unit 36.

The imaging information computing unit 35 is a system for analyzing the image data captured by the imaging means, determining a high-luminance region therein, and calculating the position of the center of gravity, the size, and the like of that region. Since the imaging information computing unit 35 has a sampling period of, for example, up to about 200 frames per second, it can follow even relatively fast motion of the controller 5.

The imaging information computing unit 35 includes an infrared filter 38, a lens 39, an imaging element 40, and an image processing circuit 41. The infrared filter 38 passes only the infrared portion of the light entering from the front of the controller 5. The lens 39 condenses the infrared light that has passed through the infrared filter 38 onto the imaging element 40. The imaging element 40 is a solid-state imaging element such as a CMOS sensor or a CCD sensor; it receives the infrared light condensed by the lens 39 and outputs an image signal. Here, the imaging targets, namely the marker section 55 of the terminal device 7 and the marker device 6, are composed of markers that output infrared light. Therefore, by providing the infrared filter 38, the imaging element 40 generates image data from only the infrared light that has passed through the infrared filter 38, so the imaging targets (the marker section 55 and/or the marker device 6) can be imaged accurately. Hereinafter, the image captured by the imaging element 40 is called the captured image. The image data generated by the imaging element 40 is processed by the image processing circuit 41. The image processing circuit 41 calculates the position of the imaging target within the captured image, and outputs coordinates representing the calculated position to the microcomputer 42 of the communication unit 36. Data of these coordinates is transmitted to the game device 3 by the microcomputer 42 as operation data. Hereinafter, these coordinates are called "marker coordinates". Since the marker coordinates change according to the orientation (tilt angle) and position of the controller 5 itself, the game device 3 can use the marker coordinates to calculate the orientation and position of the controller 5.
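
The centroid-and-size computation the image processing circuit 41 performs can be illustrated with a minimal sketch. The threshold value, image size, and pixel values below are invented for illustration; only the idea (threshold the infrared image, then take the centroid and pixel count of the bright region as the marker coordinates and size) comes from the passage above.

```c
/* A minimal sketch (not the actual circuit logic) of computing the
 * centroid and size of the high-luminance (infrared marker) region
 * in a captured grayscale image. */
#include <stdio.h>

#define W 8
#define H 4
#define THRESHOLD 200  /* assumed luminance cutoff for "marker" pixels */

int main(void)
{
    /* Tiny synthetic captured image: one bright blob (the marker). */
    unsigned char img[H][W] = {
        {0,   0,   0,   0, 0, 0, 0, 0},
        {0, 250, 255,   0, 0, 0, 0, 0},
        {0, 240, 245,   0, 0, 0, 0, 0},
        {0,   0,   0,   0, 0, 0, 0, 0},
    };
    long sx = 0, sy = 0, n = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (img[y][x] >= THRESHOLD) { sx += x; sy += y; n++; }
    if (n > 0)  /* marker coordinates = centroid of the bright region */
        printf("marker at (%.1f, %.1f), size %ld px\n",
               (double)sx / n, (double)sy / n, n);
    return 0;
}
```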

In other embodiments, the controller 5 may be configured not to include the image processing circuit 41, and the captured image itself may be transmitted from the controller 5 to the game device 3. At this time, the game device 3 may have a circuit or a program having the same function as the image processing circuit 41, and calculate the marker coordinates.

The acceleration sensor 37 detects the acceleration of the controller 5 (including the gravitational acceleration), that is, detects the force applied to the controller 5 (including gravity). The acceleration sensor 37 detects the value of the acceleration (linear acceleration) in the linear direction along the direction of the sensing axis in the acceleration applied to the detecting portion of the acceleration sensor 37. For example, in the case of a multi-axis acceleration sensor of two or more axes, the acceleration of the component along each axis is detected as the acceleration applied to the detecting portion of the acceleration sensor. The acceleration sensor 37 is, for example, a capacitive MEMS (Micro Electro Mechanical System) type acceleration sensor, but other types of acceleration sensors may be used.

In the present embodiment, the acceleration sensor 37 detects linear acceleration along three axes: the up-down direction (the Y-axis direction shown in FIG. 3), the left-right direction (the X-axis direction shown in FIG. 3), and the front-rear direction (the Z-axis direction shown in FIG. 3). Since the acceleration sensor 37 detects acceleration in the linear direction along each axis, its output represents the value of the linear acceleration on each of the three axes. That is, the detected acceleration is represented as a three-dimensional vector in an XYZ coordinate system (controller coordinate system) defined with respect to the controller 5.

The data (acceleration data) showing the acceleration detected by the acceleration sensor 37 is output to the communication unit 36. Since the acceleration detected by the acceleration sensor 37 changes in accordance with the orientation (tilt angle) or the motion of the controller 5 itself, the game device 3 can calculate the orientation or motion of the controller 5 using the acquired acceleration data. In the present embodiment, the game device 3 calculates the posture, the tilt angle, and the like of the controller 5 based on the acquired acceleration data.

A person skilled in the art will readily understand from the description of this specification that a computer such as the processor of the game device 3 (for example, the CPU 10) or the processor of the controller 5 (for example, the microcomputer 42) can perform processing based on the acceleration signals output from the acceleration sensor 37 (and likewise from the acceleration sensor 73 described later), and thereby estimate or calculate (determine) further information about the controller 5. For example, when processing is performed on the computer side on the premise that the controller 5 carrying the acceleration sensor 37 is in a static state (that is, on the premise that the acceleration detected by the acceleration sensor consists only of gravitational acceleration), then as long as the controller 5 is in fact substantially at rest, it can be known from the detected acceleration whether, and how much, the posture of the controller 5 is inclined with respect to the direction of gravity. Specifically, taking the state in which the detection axis of the acceleration sensor 37 points vertically downward as a reference, whether the controller 5 is inclined relative to the reference can be determined from whether 1G (gravitational acceleration) is applied along that axis, and how much it is inclined can be determined from the magnitude of the detected acceleration. Further, with the multi-axis acceleration sensor 37, the degree of inclination of the controller 5 with respect to the direction of gravity can be known in more detail by processing the acceleration signals of the respective axes. In this case, the processor may calculate the tilt angle of the controller 5 from the output of the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle. Thus, by using the acceleration sensor 37 in combination with a processor, the tilt angle or posture of the controller 5 can be determined.
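
As a worked illustration of the static case described above, the following sketch applies the standard accelerometer tilt formulas to a made-up three-axis reading in units of g. The formulas and example values are assumptions for illustration; the patent does not prescribe a particular computation.

```c
/* Hedged sketch: tilt relative to the gravity direction from a static
 * 3-axis accelerometer reading (axes as in FIG. 3: X right, Y up,
 * Z forward/imaging direction). */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double RAD2DEG = 180.0 / 3.14159265358979323846;

    /* Example static reading in units of g; values are made up. */
    double ax = 0.00, ay = 0.94, az = 0.34;

    /* Standard tilt formulas (an assumption, not the patent's method). */
    double pitch = atan2(az, ay) * RAD2DEG;                    /* about X */
    double roll  = atan2(-ax, sqrt(ay * ay + az * az)) * RAD2DEG; /* about Z */

    printf("pitch %.1f deg, roll %.1f deg\n", pitch, roll);
    return 0;
}
```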

On the other hand, when the controller 5 is in a dynamic state (when the controller 5 is being moved), the acceleration sensor 37 detects the acceleration due to the motion of the controller 5 in addition to the gravitational acceleration, so the direction of motion of the controller 5 can be known by removing the gravitational acceleration component from the detected acceleration by predetermined processing. Conversely, even when the controller 5 is in a dynamic state, the inclination of the controller 5 with respect to the direction of gravity can be known by removing the component of acceleration corresponding to the motion of the acceleration sensor from the detected acceleration by predetermined processing. In other embodiments, the acceleration sensor 37 may include an embedded processing device or another type of dedicated processing device that performs predetermined processing on the acceleration signals detected by the built-in acceleration detecting means before outputting them to the microcomputer 42. The embedded or dedicated processing device may, for example, convert the acceleration signal into a tilt angle (or another suitable parameter) when the acceleration sensor 37 is used to detect static acceleration (for example, gravitational acceleration).
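
One common form of the "predetermined processing" mentioned above, offered here as an assumption rather than the patent's method, is to track the slowly varying gravity component with a low-pass filter and subtract it from each reading, leaving the acceleration due to motion:

```c
/* Hedged sketch: isolate the motion component of an accelerometer
 * reading by low-pass filtering a gravity estimate and subtracting it.
 * Filter constant and example values are illustrative assumptions. */
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 gravity = {0.0, 1.0, 0.0};   /* running gravity estimate, in g */

Vec3 linear_acceleration(Vec3 raw)
{
    const double alpha = 0.9;            /* filter constant, tune per device */
    gravity.x = alpha * gravity.x + (1.0 - alpha) * raw.x;
    gravity.y = alpha * gravity.y + (1.0 - alpha) * raw.y;
    gravity.z = alpha * gravity.z + (1.0 - alpha) * raw.z;
    Vec3 lin = { raw.x - gravity.x, raw.y - gravity.y, raw.z - gravity.z };
    return lin;
}

int main(void)
{
    Vec3 sample = {0.2, 1.0, 0.0};       /* a swing toward the +X side */
    Vec3 lin = linear_acceleration(sample);
    printf("motion component: (%.2f, %.2f, %.2f) g\n", lin.x, lin.y, lin.z);
    return 0;
}
```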

The gyro sensor 48 detects angular velocities about three axes (in the present embodiment, the XYZ axes). In the present specification, taking the imaging direction of the controller 5 (the Z-axis positive direction) as a reference, the rotation direction about the X axis is called the pitch direction, the rotation direction about the Y axis the yaw direction, and the rotation direction about the Z axis the roll direction. As long as the gyro sensor 48 can detect the angular velocities about the three axes, any number and combination of gyro sensors may be used. For example, the gyro sensor 48 may be a three-axis gyro sensor, or may combine a two-axis gyro sensor and a one-axis gyro sensor to detect the angular velocities about the three axes. Data representing the angular velocities detected by the gyro sensor 48 is output to the communication unit 36. Alternatively, the gyro sensor 48 may detect angular velocities about one or two axes.
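
To connect the axis names to the data the gyro sensor 48 reports, here is a minimal sketch that accumulates the three angular velocities into pitch/yaw/roll angles by plain Euler integration. The integration scheme and sample rate are illustrative assumptions; real posture estimation would also correct for drift.

```c
/* Hedged sketch: accumulate gyro readings into an orientation estimate.
 * Plain Euler integration drifts over time, which is one reason the
 * system also has the acceleration sensor and the markers. */
#include <stdio.h>

typedef struct { double pitch, yaw, roll; } Attitude;

static void integrate_gyro(Attitude *a,
                           double wx, double wy, double wz, /* rad/s, X/Y/Z */
                           double dt)                       /* seconds */
{
    a->pitch += wx * dt;   /* rotation about the X axis: pitch direction */
    a->yaw   += wy * dt;   /* rotation about the Y axis: yaw direction   */
    a->roll  += wz * dt;   /* rotation about the Z axis: roll direction  */
}

int main(void)
{
    Attitude a = {0, 0, 0};
    /* Twenty samples of a steady 1 rad/s yaw turn at a 200 Hz sample rate. */
    for (int i = 0; i < 20; i++)
        integrate_gyro(&a, 0.0, 1.0, 0.0, 1.0 / 200.0);
    printf("yaw after 0.1 s: %.3f rad\n", a.yaw);  /* ~0.100 rad */
    return 0;
}
```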

The communication unit 36 includes a microcomputer 42, a memory 43, a wireless module 44, and an antenna 45. The microcomputer 42 controls the wireless module 44 so as to wirelessly transmit the data acquired by the microcomputer 42 to the game device 3 while using the memory 43 as a memory area during processing.

The data output from the operation unit 32, the imaging information computing unit 35, the acceleration sensor 37, and the gyro sensor 48 to the microcomputer 42 is temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when the transmission timing for the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44. The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data using, for example, Bluetooth (registered trademark) technology, and radiates the resulting weak radio signal from the antenna 45. That is, the operation data is modulated by the wireless module 44 into a weak radio signal and transmitted from the controller 5. The weak radio signal is received by the controller communication module 19 on the game device 3 side. The game device 3 can acquire the operation data by demodulating and decoding the received weak radio signal. The CPU 10 of the game device 3 then performs game processing using the operation data acquired from the controller 5. Wireless transmission from the communication unit 36 to the controller communication module 19 is performed successively at predetermined intervals. Since game processing is generally performed in units of 1/60 second (one frame time), it is preferable to transmit at intervals no longer than this; the communication unit 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 once every 1/200 second, for example.
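
The timing described above works out to roughly 200/60, or about 3.3 fresh operation-data reports per 1/60-second game frame. The sketch below is a hypothetical rendering of the communication unit 36's loop, with stand-in functions (stubbed so it compiles) in place of the microcomputer 42's data gathering and the wireless module 44; it is not the controller's actual firmware.

```c
/* Hedged sketch of the communication unit 36's transmit loop. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint16_t keys;        /* operation key data          */
    int16_t  accel[3];    /* acceleration data (X, Y, Z) */
    int16_t  gyro[3];     /* angular velocity data       */
    int16_t  marker[2];   /* marker coordinate data      */
} OperationData;

static OperationData collect_inputs(void) { OperationData d = {0}; return d; }
static void bt_broadcast(const OperationData *d) { (void)d; /* wireless module 44 */ }
static void sleep_us(long us) { (void)us; /* wait out the 5000 us tick */ }

int main(void)
{
    for (int i = 0; i < 10; i++) {           /* forever, in real firmware */
        OperationData d = collect_inputs();  /* buffered via memory 43 */
        bt_broadcast(&d);                    /* weak radio signal, antenna 45 */
        sleep_us(5000);                      /* 1/200 s < one 1/60 s frame */
    }
    puts("sent 10 reports (~3 game frames)");
    return 0;
}
```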

As described above, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation key data as operation data representing operations performed on it. The game device 3 executes game processing using this operation data as game input. Therefore, with the controller 5 described above, the user can perform game operations of moving the controller 5 itself, in addition to the conventional game operation of pressing operation keys. For example, the user can tilt the controller 5 to an arbitrary posture, point at an arbitrary position on the screen with the controller 5, and move the controller 5 itself.
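
Putting the two preceding paragraphs together, a minimal sketch of the transmitting side might look as follows; the field layout, buffer sizes, and helper functions are invented for illustration and do not reflect the actual data format.

```c
#include <stdint.h>

/* The four kinds of controller operation data named above, packed into
 * one record for transmission. The layout is illustrative only. */
typedef struct {
    uint16_t marker_x[2], marker_y[2]; /* marker coordinate data      */
    int16_t  accel[3];                 /* acceleration data (XYZ)     */
    int16_t  gyro[3];                  /* angular velocity data (XYZ) */
    uint16_t keys;                     /* operation key bit field     */
} ControllerOperationData;

/* Hypothetical platform hooks standing in for memory 43, the wireless
 * module 44, and a timer. */
extern ControllerOperationData read_buffered_data(void);
extern void bt_send(const void *p, unsigned n);
extern void wait_ms(unsigned ms);

/* Send every 1/200 s (5 ms), so that each 1/60 s game frame on the
 * game device 3 sees several fresh reports. */
void communication_unit_task(void)
{
    for (;;) {
        ControllerOperationData d = read_buffered_data();
        bt_send(&d, sizeof d);
        wait_ms(5);
    }
}
```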

In the present embodiment, the controller 5 has no display means for displaying a game image, but it may have a display means for displaying, for example, the remaining battery level.

[4. Configuration of Terminal Device 7]

Next, the configuration of the terminal device 7 will be described with reference to Figs. 8 to 13. Fig. 8 shows the external configuration of the terminal device 7, in which (a) is a front view, (b) a plan view, (c) a right side view, and (d) a bottom view. Fig. 9 is a rear view of the terminal device 7. Figs. 10 and 11 show a user holding the terminal device 7 horizontally, and Figs. 12 and 13 show a user holding the terminal device 7 vertically.

As shown in Fig. 8, the terminal device 7 includes an outer cover 50 having a substantially horizontally long, rectangular plate shape. That is, the terminal device 7 can be said to be a tablet-type information processing device. The outer cover 50 may be rounded as a whole, or may have partly curved surfaces, partial protrusions, or the like. Since the outer cover 50 is of a size that the user can grip, the user can hold and move the terminal device 7 and change its placement. The length of the terminal device 7 in the longitudinal (z-axis) direction is preferably 100 to 150 [mm], and is 133.5 [mm] in the present embodiment. The length in the lateral (x-axis) direction is preferably 200 to 250 [mm], and is 228.26 [mm] in the present embodiment. The thickness (length in the y-axis direction) is preferably about 15 to 30 [mm] in the plate-like portion and about 30 to 50 [mm] including the thickest portion; in the present embodiment it is 23.6 [mm] (40.26 [mm] at the thickest portion). The weight of the terminal device 7 is approximately 400 to 600 [g], and is 530 [g] in the present embodiment. As described in detail later, even though the terminal device 7 is a relatively large operating device as described above, it is configured to be easy for the user to grip and operate.

The terminal device 7 has an LCD 51 on the front surface of the outer cover 50. The screen of the LCD 51 is preferably 5 inches or larger, and here is 6.2 inches. The terminal device 7 of the present embodiment is easy to grip and operate even though it is provided with such a large LCD. In other embodiments, a smaller LCD 51 may be provided and the terminal device 7 made relatively small. The LCD 51 is disposed near the center of the surface of the outer cover 50. Therefore, as shown in Figs. 10 and 11, the user can hold and move the terminal device 7 while viewing the screen of the LCD 51 by gripping the outer cover 50 on both sides of the LCD 51. Figs. 10 and 11 show an example in which the user holds the terminal device 7 horizontally (with its long side sideways) by gripping the outer cover 50 on the left and right of the LCD 51, but as shown in Figs. 12 and 13, the user can also hold the terminal device 7 vertically (with its long side upright).

As shown in Fig. 8(a), the terminal device 7 has a touch panel 52 on the screen of the LCD 51 as an operation means. In the present embodiment, the touch panel 52 is a resistive touch panel; however, it is not limited to the resistive type, and a touch panel of any type, such as a capacitive type, may be used. The touch panel 52 may be of the single-touch or multi-touch type. In the present embodiment, the touch panel 52 has the same resolution (detection accuracy) as the LCD 51, but the two resolutions do not have to match. Input to the touch panel 52 is usually performed with a stylus pen 60, but is not limited to the stylus pen 60; the user can also operate the touch panel 52 with a finger. The outer cover 50 may be provided with an accommodation hole 60a for housing the stylus pen 60 used to operate the touch panel 52 (see Fig. 8(b)). Here, the accommodation hole 60a is provided in the upper surface of the outer cover 50 so that the stylus pen 60 does not fall out, but it may instead be provided in a side surface or the lower surface. Since the terminal device 7 has the touch panel 52, the user can operate the touch panel 52 while moving the terminal device 7. That is, the user can make input directly to the screen of the LCD 51 (via the touch panel 52) while moving the screen.
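
When the two resolutions differ, a simple proportional scaling suffices to map a raw panel reading to LCD pixels, as in the following sketch (the types and names are illustrative assumptions):

```c
/* Scale a raw panel coordinate to LCD pixels when the panel's
 * detection grid and the LCD 51's resolution differ. */
typedef struct { int x, y; } Point;

Point touch_to_lcd(Point raw, Point panel_res, Point lcd_res)
{
    Point p;
    p.x = (int)((long)raw.x * lcd_res.x / panel_res.x);
    p.y = (int)((long)raw.y * lcd_res.y / panel_res.y);
    return p;
}
```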

As shown in Fig. 8, the terminal device 7 includes two analog sticks 53A and 53B and operation keys (keys) 54A to 54M as operation means (operation units). Each analog stick 53A and 53B is a device for indicating direction. Each is configured so that its movable member (stick portion), operated by the user's finger, can slide in any direction (at any angle: up, down, left, right, or oblique) with respect to the surface of the outer cover 50; this type is also called a slide-pad direction input device. The movable members of the analog sticks 53A and 53B may instead be of a type that tilts in any direction with respect to the surface of the outer cover 50. In the present embodiment, analog sticks of the slide type are used, so the user can operate the analog sticks 53A and 53B without moving the thumb greatly, and can operate them while gripping the outer cover 50 firmly. When the tilting type is used for the analog sticks 53A and 53B, the degree of input (degree of tilt) is easier for the user to grasp, making detailed operations easier to perform.

The left analog stick 53A and the right analog stick 53B are disposed on the left and right sides of the screen of the LCD 51, respectively. Therefore, the user can input a direction using an analog stick with either the left or right hand. Further, as shown in Figs. 10 and 11, the analog sticks 53A and 53B are provided at positions operable while the user grips the left and right portions of the terminal device 7 (the portions on the left and right sides of the LCD 51); therefore, even when holding and moving the terminal device 7, the user can easily operate the analog sticks 53A and 53B.

The operation keys 54A to 54L are operation means (operation units) for performing predetermined inputs, and are depressible keys. As described below, the operation keys 54A to 54L are provided at positions operable while the user grips the left and right portions of the terminal device 7 (see Figs. 10 and 11). Therefore, even when holding and moving the terminal device 7, the user can easily operate these operation means.

As shown in Fig. 8(a), of the operation keys 54A to 54M, the cross key (direction input key) 54A, the keys 54B to 54H, and the key 54M are provided on the surface of the outer cover 50. That is, these keys are disposed at positions operable with the user's thumbs (see Figs. 10 and 11).

The cross key 54A is disposed on the left side of the LCD 51, below the left analog stick 53A. That is, the cross key 54A is disposed at a position operable with the user's left hand. The cross key 54A has a cross shape and can indicate at least the up, down, left, and right directions.

The keys 54B to 54D are provided below the LCD 51. These three keys are disposed at positions operable with either hand. Further, the terminal device 7 has a power key 54M for turning the power of the terminal device 7 on and off. The power of the game device 3 can also be turned on and off remotely by operating the power key 54M. Like the keys 54B to 54D, the power key 54M is provided below the LCD 51, on the right side of the keys 54B to 54D; therefore, the power key 54M is disposed at a position operable (easily operable) with the right hand. Four keys 54E to 54H are provided on the right side of the LCD 51, below the right analog stick 53B. That is, the four keys 54E to 54H are disposed at positions operable with the user's right hand. Further, the four keys 54E to 54H are arranged in an up-down-left-right positional relationship (with respect to the center position of the four keys 54E to 54H); therefore, the terminal device 7 can also use the four keys 54E to 54H as keys for the user to indicate the up, down, left, and right directions.

In the present embodiment, the analog sticks 53A and 53B are disposed above the cross key 54A and the keys 54E to 54H, respectively. Here, the analog sticks 53A and 53B protrude beyond the cross key 54A and the keys 54E to 54H in the thickness direction (y-axis direction). If the positions of the analog stick 53A and the cross key 54A were reversed, then when the user operates the cross key 54A with the thumb, the thumb might hit the analog stick 53A and cause an erroneous operation; the same problem would occur if the positions of the analog stick 53B and the keys 54E to 54H were reversed. In contrast, in the present embodiment, since the analog sticks 53A and 53B are disposed above the cross key 54A and the keys 54E to 54H, the possibility of a finger touching the cross key 54A or the keys 54E to 54H when the user operates the analog sticks 53A and 53B is lower than in the above cases. Thus, in the present embodiment, the possibility of erroneous operation can be reduced and the operability of the terminal device 7 improved. However, in other embodiments, the analog stick 53A may be placed opposite the cross key 54A, and the analog stick 53B opposite the keys 54E to 54H, as needed.

Here, in the present embodiment, some operation units (the analog sticks 53A and 53B, the cross key 54A, and the three keys 54E to 54G) are disposed on the left and right sides of the display unit (LCD 51), above the center of the outer cover 50 in the up-down direction (z-axis direction). When operating these operation units, the user mainly grips the terminal device 7 above its up-down center. If the user were to grip the lower side of the outer cover 50 (especially when the terminal device 7 is relatively large, as in the present embodiment), the held terminal device 7 would become unstable and harder to hold. In contrast, in the present embodiment, when operating the operation units, the user mainly grips the terminal device 7 above its up-down center, and can also support the outer cover 50 from the sides with the palms. Therefore, the user can grip the outer cover 50 in a stable state and hold the terminal device 7 easily, which also makes the above operation units easier to operate. In other embodiments, at least one operation unit may be provided on each of the left and right sides of the display unit, above the center of the outer cover 50. For example, only the analog sticks 53A and 53B may be disposed above the center of the outer cover 50. Also, for example, when the cross key 54A is provided above the left analog stick 53A and the four keys 54E to 54H above the right analog stick 53B, the cross key 54A and the four keys 54E to 54H may be disposed above the center of the outer cover 50.

Further, in the present embodiment, a protruding portion (an eaves portion 59) is provided on the back side of the outer cover 50 (the side opposite to the surface on which the LCD 51 is provided) (see Fig. 8(c) and Fig. 9). As shown in Fig. 8(c), the eaves portion 59 is a ridge-shaped member provided so as to protrude from the back surface of the substantially plate-shaped outer cover 50. The protruding portion has a height (thickness) that lets the fingers of a user gripping the back surface of the outer cover 50 hook onto it; the height is preferably 10 to 25 [mm], and is 16.66 [mm] in the present embodiment. Further, the lower surface of the protruding portion preferably has an inclination of 45° or more (more preferably 60° or more) with respect to the back surface of the outer cover 50 so that the protruding portion catches the user's fingers easily. As shown in Fig. 8(c), the lower surface of the protruding portion may be formed with a larger inclination angle than its upper surface. As shown in Figs. 10 and 11, by hooking the fingers on the eaves portion 59 (resting the eaves portion 59 on the fingers), the user can hold the terminal device 7 in a stable state without fatigue even though the terminal device 7 is relatively large. That is, the eaves portion 59 serves as a support member for supporting the outer cover 50 with the fingers, and also as a finger rest. The eaves portion 59 is provided above the center of the outer cover 50 in the up-down direction, at a position roughly opposite the operation units (the analog sticks 53A and 53B) provided on the surface of the outer cover 50. In other words, the protruding portion is provided in a region that includes the positions opposite the operation units provided on the left and right of the display unit. Therefore, when operating these operation units, the user can hold the terminal device 7 while supporting the eaves portion 59 with the middle fingers or ring fingers (see Figs. 10 and 11). This makes the terminal device 7 easier to hold and the operation units easier to operate. Further, in the present embodiment, since the protruding portion has an eaves shape extending left and right, the user can hold the terminal device 7 with the middle fingers or ring fingers along the lower surface of the protruding portion, making the terminal device 7 still easier to hold. The eaves portion 59 only needs to be formed so that (its protruding part) extends left and right, and is not limited to the horizontally extending shape shown in Fig. 9. In other embodiments, the eaves portion 59 may extend in a direction slightly inclined from the horizontal; for example, it may be inclined upward (or downward) from its left and right ends toward its center.

In the present embodiment, the eaves portion 59, which has an eaves shape, is employed as the protruding portion formed on the back surface of the outer cover because locking portions, described later, are to be provided on it; however, the protruding portion may have any shape. For example, in other embodiments, two protruding portions may be provided on the left and right sides of the back of the outer cover 50 (with no protruding portion at the center in the left-right direction) (see Fig. 32). In other embodiments, the cross-sectional shape of the protruding portion (the shape of its cross section perpendicular to the x-axis direction) may be a hook shape (a shape recessed on the lower surface) so that the user's fingers support the terminal device 7 more securely (so that the fingers hook firmly onto the protruding portion).

The width of the protruding portion (the eaves portion 59) in the up-down direction may be any width. For example, the protruding portion may be formed so as to reach the upper side of the outer cover 50; that is, the upper surface of the protruding portion may be formed at the same position as the upper side surface of the outer cover 50. In this case, the outer cover 50 has a two-step structure with a thin lower side and a thick upper side. Thus, the outer cover 50 preferably has, on the left and right of its back surface, a surface facing downward (the lower surface of the protruding portion). This allows the user to hold the terminal device 7 easily by resting the fingers against that surface. The "surface facing downward" may be formed at any position on the back surface of the outer cover 50, but is preferably located above the center of the outer cover 50.

Further, as shown in Figs. 8(a), (b), and (c), a first L key 54I and a first R key 54J are provided on the left and right sides of the upper surface of the outer cover 50, respectively. In the present embodiment, the first L key 54I and the first R key 54J are provided on the obliquely upper portions (the upper left and upper right portions) of the outer cover 50. Specifically, the first L key 54I is provided at the left end of the upper side of the plate-shaped outer cover 50 and is exposed on the upper left side surface (in other words, exposed on both the upper and left side surfaces). The first R key 54J is provided at the right end of the upper side of the outer cover 50 and is exposed on the upper right side surface (in other words, exposed on both the upper and right side surfaces). In this way, the first L key 54I is disposed at a position operable with the user's left index finger, and the first R key 54J at a position operable with the user's right index finger (see Fig. 10). In other embodiments, the operation units provided on the left and right of the upper surface of the outer cover 50 need not be provided at the left and right ends, and may be provided at positions other than the ends; operation units may also be provided on the left and right side surfaces of the outer cover 50.

Further, as shown in Fig. 8(c) and Fig. 9, a second L key 54K and a second R key 54L are disposed on the protruding portion (the eaves portion 59). The second L key 54K is provided near the left end of the eaves portion 59, and the second R key 54L near its right end. That is, the second L key 54K is provided slightly above the left side of the back of the outer cover 50 (the left side as viewed from the front), and the second R key 54L slightly above the right side of the back of the outer cover 50 (the right side as viewed from the front). In other words, the second L key 54K is disposed at a position (roughly) opposite the left analog stick 53A provided on the surface, and the second R key 54L at a position (roughly) opposite the right analog stick 53B provided on the surface. In this way, the second L key 54K is disposed at a position operable with the user's left middle finger or index finger, and the second R key 54L at a position operable with the user's right middle finger or index finger (see Figs. 10 and 11). Further, as shown in Fig. 8(c), the second L key 54K and the second R key 54L are provided on the upper surface of the eaves portion 59, and thus have key surfaces that face upward (obliquely upward). Since the middle finger or index finger is expected to move in the up-down direction when the user holds the terminal device 7, making the key surfaces face upward lets the user press the second L key 54K and the second R key 54L easily.

As described above, in the present embodiment, operation units (the analog sticks 53A and 53B) are provided on the left and right of the display unit (LCD 51) above the center of the outer cover 50, and further operation units (the second L key 54K and the second R key 54L) are provided on the back side of the outer cover 50 at positions opposite the former. The front-side operation units and the back-side operation units thus face each other across the outer cover 50, so when operating them the user can grip the outer cover 50 from both the front and the back. Since a user operating these units grips the terminal device 7 above its up-down center, the device can be held at its upper portion and supported with the palms (see Figs. 10 and 11). As a result, the user can stably hold the terminal device 7 while operating at least four operation units, providing an operating device (terminal device 7) that is easy for the user to grip and excellent in operability. As described above, in the present embodiment, the user can easily hold the terminal device 7 by gripping it with the fingers resting against the lower surface of the protruding portion (the eaves portion 59). Further, since the second L key 54K and the second R key 54L are provided on the upper surface of the protruding portion, the user can easily operate these keys in that state. The user can easily hold the terminal device 7 in the following ways, for example.

That is, as shown in Fig. 10, the user can hold the terminal device 7 with the ring fingers resting against the lower surface of the eaves portion 59 (the one-dot chain line in Fig. 10), so that the ring fingers support the eaves portion 59. In this state, the user can operate the four keys (the first L key 54I, the first R key 54J, the second L key 54K, and the second R key 54L) with the index fingers or middle fingers. For example, when the required game operations use many keys and are relatively complex, holding the device as shown in Fig. 10 makes many keys easy to operate. Further, since the analog sticks 53A and 53B are provided above the cross key 54A and the keys 54E to 54H, the user can conveniently operate the analog sticks 53A and 53B with the thumbs when relatively complex operations are required. In Fig. 10, the user holds the terminal device 7 with the thumbs against the surface of the outer cover 50, the index fingers against the upper surface of the outer cover 50, the middle fingers against the upper surface of the eaves portion 59 on the back of the outer cover 50, the ring fingers against the lower surface of the eaves portion 59, and the little fingers against the back surface of the outer cover 50. In this way, the user can grip the terminal device 7 firmly, surrounding the outer cover 50 from four directions. Further, as shown in Fig. 11, the user can also hold the terminal device 7 with the middle fingers resting against the lower surface of the eaves portion 59 (the one-dot chain line in Fig. 11). In this state, the user can easily operate two keys (the second L key 54K and the second R key 54L) with the index fingers. For example, when the required game operations use few keys and are relatively simple, the device may be held as shown in Fig. 11. In Fig. 11, since the user can grip the lower side of the outer cover 50 with two fingers (the ring and little fingers), the terminal device 7 can be held firmly. In the present embodiment, the lower surface of the eaves portion 59 is located between the analog sticks 53A and 53B on the one hand and the cross key 54A and the four keys 54E to 54H on the other (below the analog sticks 53A and 53B, and above the cross key 54A and the four keys 54E to 54H). Therefore, when holding the terminal device 7 with the ring fingers against the eaves portion 59 (Fig. 10), the analog sticks 53A and 53B are easy to operate with the thumbs, and when holding it with the middle fingers against the eaves portion 59 (Fig. 11), the cross key 54A and the four keys 54E to 54H are easy to operate with the thumbs. That is, in either of the two cases, the user can perform direction input operations while gripping the terminal device 7 firmly.

Further, as described above, the user can also hold the terminal device 7 vertically. That is, as shown in Fig. 12, the user can hold the terminal device 7 vertically by gripping its upper side with the left hand, or, as shown in Fig. 13, by gripping its lower side with the left hand. Figs. 12 and 13 show the terminal device 7 held with the left hand, but it can also be held with the right hand. Since the user can thus hold the terminal device 7 with one hand, it is possible, for example, to hold the terminal device 7 with one hand while making input to the touch panel 52 with the other. Further, when holding the terminal device 7 in the manner shown in Fig. 12, the user can grip it firmly by resting fingers other than the thumb (the middle, ring, and little fingers in Fig. 12) against the lower surface of the eaves portion 59 (the one-dot chain line in Fig. 12). In particular, in the present embodiment, since the eaves portion 59 extends left and right (up and down in Fig. 12), the user can rest fingers other than the thumb against the eaves portion 59 regardless of where along the upper side the terminal device 7 is gripped, and thus hold it firmly. That is, when the terminal device 7 is used held vertically, the eaves portion 59 can serve as a handle. On the other hand, when holding the terminal device 7 in the manner shown in Fig. 13, the user can operate the keys 54B to 54D with the left hand. Therefore, for example, the user can make input to the touch panel 52 with one hand while operating the keys 54B to 54D with the hand holding the device, enabling more operations.

In the terminal device 7 of the present embodiment, since the protruding portion (the eaves portion 59) is provided on the back surface, placing the terminal device 7 down with the screen of the LCD 51 (the surface of the outer cover 50) facing upward puts the screen in a slightly inclined state. This makes the screen easier to view with the terminal device 7 placed down, and also makes input operations on the touch panel 52 easier in that state. Further, in other embodiments, an additional protruding portion with a height comparable to the eaves portion 59 may be formed on the back surface of the outer cover 50. In that case, with the screen of the LCD 51 facing upward, the terminal device 7 can be placed with both protruding portions contacting the floor so that the screen is horizontal. The additional protruding portion may also be made detachable (or foldable); then the terminal device can be placed down either with the screen slightly inclined or with the screen horizontal. In other words, the eaves portion 59 can be used as a leg when the terminal device 7 is placed down for use.

Functions corresponding to the game program are assigned to the operation keys 54A to 54L as appropriate. For example, the cross key 54A and the keys 54E to 54H may be used for direction indication or selection operations, and the keys 54B to 54E for decision or cancel operations. The terminal device 7 may also have a key for turning the screen display of the LCD 51 on and off and a key for performing connection setting (pairing) with the game device 3.

As shown in Fig. 8(a), the terminal device 7 has a marker portion 55 composed of a marker 55A and a marker 55B on the surface of the outer cover 50. The marker portion 55 is provided on the upper side of the LCD 51. The markers 55A and 55B are each composed of one or more infrared LEDs, like the markers 6R and 6L of the marker device 6. The infrared LEDs constituting the markers 55A and 55B are disposed inside a window portion that passes infrared light. Like the marker device 6 described above, the marker portion 55 is used for the game device 3 to calculate the motion and the like of the controller 5. The game device 3 can also control the lighting of each infrared LED of the marker portion 55.

The terminal device 7 includes a camera 56 as an imaging means. The camera 56 includes an imaging element (e.g., a CMOS or CCD image sensor) of a predetermined resolution and a lens. As shown in Fig. 8, in the present embodiment the camera 56 is provided on the surface of the outer cover 50. Therefore, the camera 56 can capture the face of the user holding the terminal device 7; for example, it can capture the user playing the game while viewing the LCD 51. In the present embodiment, the camera 56 is disposed between the two markers 55A and 55B.

The terminal device 7 includes a microphone 79 as a sound input means. A microphone hole 50c is provided in the surface of the outer cover 50. The microphone 79 is disposed inside the outer cover 50 behind the microphone hole 50c, and detects sounds around the terminal device 7, such as the user's voice.

The terminal device 7 includes speakers 77 as a sound output means. As shown in Fig. 8(a), speaker holes 57 are provided in the lower portion of the surface of the outer cover 50, and the output sound of the speakers 77 is emitted from these holes. In the present embodiment, the terminal device 7 has two speakers, and a speaker hole 57 is provided at each of the left and right speaker positions. The terminal device 7 also has a knob 64 for adjusting the volume of the speakers 77, and a sound output terminal 62 for connecting a sound output unit such as earphones. Considering that an additional device may be attached to the lower side surface of the outer cover, the sound output terminal 62 and the knob 64 are provided here on the upper side surface of the outer cover 50, but they may instead be provided on a left or right side surface or on the lower side surface.

Further, a window 63 for emitting an infrared signal from an infrared communication module 82 to the outside of the terminal device 7 is provided in the outer cover 50. Here, the window 63 is provided in the upper side surface of the outer cover 50 so that the infrared signal is emitted toward the user's front when the user grips both sides of the LCD 51. However, in other embodiments, the window 63 may be provided at any position, such as on the back surface of the outer cover 50.

Further, the terminal device 7 includes an expansion connector 58 for connecting another device to the terminal device 7. The expansion connector 58 is a communication terminal for exchanging data (information) with another device connected to the terminal device 7. In the present embodiment, as shown in Fig. 8(d), the expansion connector 58 is provided on the lower side surface of the outer cover 50. The other device connected to the expansion connector 58 may be any device, for example a controller used in a specific game (a gun-type controller or the like) or an input device such as a keyboard. If there is no need to connect an additional device, the expansion connector 58 need not be provided. The expansion connector 58 may include a terminal for supplying power to the additional device and a terminal for charging.

In addition to the expansion connector 58, the terminal device 7 has a charging terminal 66 for obtaining power from an additional device. When the charging terminal 66 is connected to a stand 210, described later, power is supplied from the stand 210 to the terminal device 7. In the present embodiment, the charging terminal 66 is provided on the lower side surface of the outer cover 50. Therefore, when the terminal device 7 and an additional device (for example, the input device 200 shown in Fig. 15 or the input device 220 shown in Fig. 17) are connected, power can be supplied from one to the other in addition to exchanging information via the expansion connector 58. By providing the charging terminal 66 around (on the left and right of) the expansion connector 58 in this way, both information exchange and power supply are possible when the terminal device 7 and the additional device are connected. Further, the terminal device 7 has a charging connector, and the outer cover 50 has a cover portion 61 for protecting the charging connector. The charging connector can be connected to a charger 86, described later; when it is, power is supplied from the charger 86 to the terminal device 7. In the present embodiment, considering that an additional device may be attached to the lower side surface of the outer cover 50, the charging connector (cover portion 61) is provided on the upper side surface of the outer cover 50, but it may instead be provided on a left or right side surface or on the lower side surface.

Further, the terminal device 7 has a battery cover 67 that is detachable from the outer cover 50. A battery (the battery 85 shown in Fig. 14) is disposed under the battery cover 67. In the present embodiment, the battery cover 67 is provided on the back side of the outer cover 50, below the protruding portion (the eaves portion 59).

The outer cover 50 of the terminal device 7 is provided with holes 65a and 65b for tying a strap. As shown in Fig. 8(d), in the present embodiment the holes 65a and 65b are provided in the lower surface of the outer cover 50: the hole 65a to the left of center and the hole 65b to the right of center. The user can tie a strap to either hole 65a or 65b and fasten it to the wrist. Then, even if the user accidentally drops the terminal device 7 or it slips from the hand, the terminal device 7 is prevented from falling or striking other objects. In the present embodiment, since holes are provided on both the left and right, the user can fasten the strap to either hand, which is very convenient.

Regarding the terminal device 7 shown in Figs. 8 to 13, the shapes of the operation keys and the outer cover 50, and the number and installation positions of the components, are merely examples; other shapes, numbers, and installation positions may be used.

Next, the internal configuration of the terminal device 7 will be described with reference to Fig. 14, which is a block diagram showing the internal configuration of the terminal device 7. As shown in Fig. 14, in addition to the configuration shown in Fig. 8, the terminal device 7 includes a touch panel controller 71, a magnetic sensor 72, an acceleration sensor 73, a gyro sensor 74, a user interface controller (UI controller) 75, a codec LSI 76, speakers 77, a sound IC 78, a microphone 79, a wireless module 80, an antenna 81, an infrared communication module 82, a flash memory 83, a power supply IC 84, and a battery 85. These electronic components are mounted on an electronic circuit board and housed in the outer cover 50.

The UI controller 75 is a circuit for controlling the input and output of data to and from the various input/output units. The UI controller 75 is connected to the touch panel controller 71, the analog sticks 53 (the analog sticks 53A and 53B), the operation keys 54 (the operation keys 54A to 54L), the marker portion 55, the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74. The UI controller 75 is also connected to the codec LSI 76 and the expansion connector 58. The power supply IC 84 is connected to the UI controller 75, and power is supplied to each unit via the UI controller 75. The built-in battery 85 is connected to the power supply IC 84 and supplies power. A charger 86 or a cable that obtains power from an external power source can also be connected to the power supply IC 84 via a charging connector, and the terminal device 7 can be charged while being supplied with power from the external power source through the charger 86 or cable. The terminal device 7 can also be charged by being attached to a charger having a charging function (not shown). That is, although not shown, a charger (the stand 210 shown in Fig. 20) that obtains power from an external power source can be connected to the power supply IC 84 via the charging terminal 66, and the terminal device 7 can then be supplied with power from the external power source and charged.

The touch panel controller 71 is a circuit that is connected to the touch panel 52 and controls it. The touch panel controller 71 generates touch position data in a predetermined format based on the signal from the touch panel 52 and outputs it to the UI controller 75. The touch position data represents the coordinates of the position at which input was made on the input surface of the touch panel 52. The touch panel controller 71 reads the signal from the touch panel 52 and generates the touch position data once every predetermined period. Various control instructions for the touch panel 52 are also output from the UI controller 75 to the touch panel controller 71.

The analog sticks 53 output to the UI controller 75 stick data representing the direction and amount by which the stick portion operated by the user's finger slid (or tilted). The operation keys 54 output to the UI controller 75 operation key data representing the input state of each of the operation keys 54A to 54L (whether or not each is pressed).

The magnetic sensor 72 detects orientation by sensing the magnitude and direction of the magnetic field, and outputs orientation data representing the detected orientation to the UI controller 75. Control instructions for the magnetic sensor 72 are output from the UI controller 75 to the magnetic sensor 72. Sensors using an MI (magneto-impedance) element, a fluxgate sensor, a Hall element, a GMR (giant magnetoresistance) element, a TMR (tunnel magnetoresistance) element, an AMR (anisotropic magnetoresistance) element, and the like exist, but any sensor capable of detecting orientation may be used. Strictly speaking, where a magnetic field other than geomagnetism is present, the obtained orientation data does not indicate the true orientation; even so, since the orientation data changes when the terminal device 7 moves, changes in the posture of the terminal device 7 can still be calculated.

The acceleration sensor 73 is provided inside the outer cover 50 and detects the magnitude of linear acceleration along three axes (the xyz axes shown in Fig. 8(a)). Specifically, the acceleration sensor 73 detects the value of linear acceleration along each axis, taking the long-side direction of the outer cover 50 as the x-axis, the direction perpendicular to the surface of the outer cover 50 as the y-axis, and the short-side direction of the outer cover 50 as the z-axis. Acceleration data representing the detected accelerations is output to the UI controller 75, and control instructions for the acceleration sensor 73 are output from the UI controller 75 to the acceleration sensor 73. In the present embodiment, the acceleration sensor 73 is, for example, a capacitive MEMS acceleration sensor, but other types of acceleration sensors may be used in other embodiments. The acceleration sensor 73 may also be one that detects only one or two axial directions.

The gyro sensor 74 is provided inside the outer cover 50 and detects angular velocity about the three axes: the x-axis, the y-axis, and the z-axis. Angular velocity data representing the detected angular velocities is output to the UI controller 75, and control instructions for the gyro sensor 74 are output from the UI controller 75 to the gyro sensor 74. Any number and combination of gyro sensors may be used to detect angular velocity about three axes; like the gyro sensor 48, the gyro sensor 74 may be composed of a 2-axis gyro sensor and a 1-axis gyro sensor. The gyro sensor 74 may also be one that detects only one or two axial directions.

The UI controller 75 outputs operation data including the touch position data, stick data, operation key data, orientation data, acceleration data, and angular velocity data received from the above components to the codec LSI 76. When another device is connected to the terminal device 7 via the expansion connector 58, the operation data may further include data representing operations performed on that device.
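
Gathered into one record, the six kinds of data listed above might look like the following sketch; the field widths and layout are assumptions for illustration, not the actual format.

```c
#include <stdint.h>

/* The six kinds of data the UI controller 75 forwards to the codec
 * LSI 76, gathered into one record. Field widths are illustrative. */
typedef struct {
    uint16_t touch_x, touch_y; /* touch position data               */
    int16_t  stick[2][2];      /* stick data: sticks 53A/53B, x & y */
    uint16_t keys;             /* operation key data (54A to 54L)   */
    int16_t  mag[3];           /* orientation data (magnetic)       */
    int16_t  accel[3];         /* acceleration data (x, y, z)       */
    int16_t  gyro[3];          /* angular velocity data (x, y, z)   */
} TerminalOperationData;
```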

The codec LSI 76 is a circuit that compresses data to be transmitted to the game device 3 and decompresses data transmitted from the game device 3. The codec LSI 76 is connected to the LCD 51, the camera 56, the sound IC 78, the wireless module 80, the flash memory 83, and the infrared communication module 82, and further includes a CPU 87 and an internal memory 88. Although the terminal device 7 is configured not to perform game processing itself, it executes a minimal program for managing the terminal device 7 and for communication. When the power is turned on, the program stored in the flash memory 83 is read into the internal memory 88 and executed by the CPU 87, whereby the terminal device 7 starts up. Part of the internal memory 88 is used as VRAM for the LCD 51. The camera 56 captures an image in accordance with an instruction from the game device 3 and outputs the captured image data to the codec LSI 76; control instructions for the camera 56, such as image-capture instructions, are output from the codec LSI 76 to the camera 56. The camera 56 can also capture video; that is, it can capture images repeatedly and repeatedly output the image data to the codec LSI 76.

The sound IC 78 is a circuit that is connected to the speakers 77 and the microphone 79 and controls the input and output of sound data to and from them. That is, when sound data is received from the codec LSI 76, the sound IC 78 outputs to the speakers 77 a sound signal obtained by D/A-converting the sound data, and sound is output from the speakers 77. The microphone 79 detects sound reaching the terminal device 7 (the user's voice and the like) and outputs a sound signal representing that sound to the sound IC 78. The sound IC 78 A/D-converts the sound signal from the microphone 79 and outputs sound data in a predetermined format to the codec LSI 76.

The codec LSI 76 transmits the image data from the camera 56, the sound data from the microphone 79, and the operation data from the UI controller 75 to the game device 3 via the wireless module 80 as terminal operation data. In the present embodiment, the codec LSI 76 applies the same compression processing as the codec LSI 27 to the image data and the sound data. The terminal operation data and the compressed image and sound data are output to the wireless module 80 as transmission data. The antenna 81 is connected to the wireless module 80, and the wireless module 80 transmits the transmission data to the game device 3 via the antenna 81. The wireless module 80 has the same function as the terminal communication module 28 of the game device 3; that is, it has a function of connecting to a wireless LAN by a method conforming to, for example, the IEEE 802.11n standard. The transmitted data may be encrypted as necessary, or may be left unencrypted.

As described above, the transmission data transmitted from the terminal device 7 to the game device 3 includes operation data (terminal operation data), image data, and sound data. When another device is connected to the terminal device 7 via the expansion connector 58, the transmission data may further include data received from that device. The infrared communication module 82 performs infrared communication with other devices in accordance with, for example, the IrDA standard; the codec LSI 76 may include data received via infrared communication in the transmission data and transmit it to the game device 3 as necessary.
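
A minimal sketch of assembling one unit of transmission data as described above, with the operation data attached uncompressed and the image and sound compressed; all function names, buffer sizes, and the struct layout are assumptions for illustration.

```c
#include <string.h>

/* Hypothetical hooks standing in for the codec LSI 76's compressor
 * and the wireless module 80. */
extern unsigned compress_block(const void *in, unsigned n, void *out);
extern void     wlan_send(const void *p, unsigned n);

/* One unit of transmission data: operation data as-is, image and
 * sound compressed. Buffer sizes are arbitrary for the sketch. */
typedef struct {
    unsigned char ops[64];      /* terminal operation data, uncompressed */
    unsigned      ops_len;
    unsigned char image[65536]; /* compressed camera frame */
    unsigned      image_len;
    unsigned char sound[4096];  /* compressed microphone audio */
    unsigned      sound_len;
} TransmissionData;

void send_to_game_device(const void *ops, unsigned ops_n,
                         const void *img, unsigned img_n,
                         const void *snd, unsigned snd_n)
{
    static TransmissionData t;  /* static: too large for a small stack */
    memcpy(t.ops, ops, ops_n);
    t.ops_len   = ops_n;
    t.image_len = compress_block(img, img_n, t.image);
    t.sound_len = compress_block(snd, snd_n, t.sound);
    wlan_send(&t, sizeof t);
}
```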

Further, as described above, compressed image data and sound data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI 76 via the antenna 81 and the wireless module 80. The codec LSI 76 decompresses the received image data and sound data. The decompressed image data is output to the LCD 51, and the image is displayed on the LCD 51; that is, the codec LSI 76 (CPU 87) causes the received image data to be displayed on the display unit. The decompressed sound data is output to the sound IC 78, and the sound IC 78 outputs the sound from the speakers 77.

When control data is included in the data received from the game device 3, the codec LSI 76 and the UI controller 75 issue control instructions to the respective units in accordance with the control data. As described above, the control data represents control instructions for the components included in the terminal device 7 (in the present embodiment, the camera 56, the touch panel controller 71, the marker portion 55, the sensors 72 to 74, and the infrared communication module 82). In the present embodiment, the possible control instructions are to start or to suspend (stop) the operation of each of the above components. That is, components not used in the game may be suspended to reduce power consumption; in that case, the data transmitted from the terminal device 7 to the game device 3 does not include data from the suspended components. Since the marker portion 55 consists of infrared LEDs, its control is simply turning the power supply on and off.
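
One plausible encoding of such control data is a bit per component, as sketched below; the bit assignments and the helper function are invented for illustration and are not taken from this description.

```c
#include <stdint.h>

/* One bit per controllable component; the assignments are invented. */
enum {
    CTRL_CAMERA = 1u << 0, /* camera 56                        */
    CTRL_TOUCH  = 1u << 1, /* touch panel controller 71        */
    CTRL_MARKER = 1u << 2, /* marker portion 55 (IR LEDs)      */
    CTRL_MAG    = 1u << 3, /* magnetic sensor 72               */
    CTRL_ACCEL  = 1u << 4, /* acceleration sensor 73           */
    CTRL_GYRO   = 1u << 5, /* gyro sensor 74                   */
    CTRL_IRCOMM = 1u << 6, /* infrared communication module 82 */
};

/* Hypothetical hook that starts or suspends one component. */
extern void component_set_enabled(unsigned id_bit, int on);

/* Apply a received control word: components whose bit is clear are
 * suspended, so unused components stop drawing power. */
void apply_control_data(uint32_t ctrl)
{
    for (unsigned bit = 0; bit < 7; bit++)
        component_set_enabled(1u << bit, (int)((ctrl >> bit) & 1u));
}
```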

As described above, the terminal device 7 includes the touch panel 52, the analog sticks 53, and the operation keys 54 as operation means, but in other embodiments it may include other operation means instead of, or in addition to, these.

The terminal device 7 includes the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74 as sensors for calculating the motion of the terminal device 7 (including its position and posture, or changes in its position and posture), but in other embodiments it may include only one or two of these sensors. In other embodiments, other sensors may be provided instead of, or in addition to, these sensors. The terminal device 7 also includes the camera 56 and the microphone 79, but in other embodiments it may include neither, or only one of them. Further, the terminal device 7 includes the marker portion 55 as a configuration for calculating the positional relationship between the terminal device 7 and the controller 5 (the position and/or posture of the terminal device 7 as viewed from the controller 5, and the like), but in other embodiments the marker portion 55 may be omitted. In other embodiments, the terminal device 7 may include other means as a configuration for calculating the positional relationship; for example, the controller 5 may include a marker portion and the terminal device 7 an imaging element. In that case, the marker device 6 may also include an imaging element instead of infrared LEDs.

(Configuration of Additional Devices) Next, examples of additional devices that can be attached (connected) to the terminal device 7 will be described with reference to Figs. 15 to 20. An additional device may have any function; it may be, for example, an additional operating device attached to the terminal device 7 for performing predetermined operations, a charger for supplying power to the terminal device 7, or a stand for holding the terminal device 7 in a predetermined posture.

As shown in Figs. 8(d) and 9, locking holes 59a and 59b, into which claw portions of an additional device can be locked, are provided in the lower surface of the protruding portion (the eaves portion 59). The locking holes 59a and 59b are used when connecting another additional device to the terminal device 7. That is, the additional device has claw portions that can be locked into the locking holes 59a and 59b, and when the additional device is connected to the terminal device 7, locking the claw portions into the locking holes 59a and 59b fixes the terminal device 7 and the additional device together. Screw holes may further be provided inside the locking holes 59a and 59b so that the additional device can be firmly fixed with screws. Here, the protruding portion provided on the back of the terminal device 7 is the eaves portion 59, which has an eaves shape; that is, the eaves portion 59 extends in the left-right direction. As shown in Fig. 9, the locking holes 59a and 59b are provided near the center (in the left-right direction) of the lower surface of the eaves portion 59. Any number of locking holes may be provided in the lower surface of the eaves portion 59; if there is one, it is preferably provided at the center of the eaves portion 59, and if there are several, they are preferably arranged symmetrically left and right. This allows the additional device to be connected stably while keeping the left-right balance even. When the locking holes are provided near the center, the size of the additional device can also be reduced compared with placing them at the left and right ends. That is, the eaves portion 59 can be used as a locking member for the additional device.

Further, in the present embodiment, locking holes 50a and 50b are provided in the lower surface of the outer cover 50, as shown in Fig. 8(d). Therefore, when connecting an additional device to the terminal device 7, the terminal device 7 and the additional device are fixed by locking four claw portions into the four locking holes, which fixes the additional device to the terminal device 7 all the more firmly. Screw holes may also be provided inside the locking holes 50a and 50b so that the additional device can be screwed in place. In other embodiments, the locking holes provided in the outer cover may be arranged in any manner.

Figs. 15 and 16 show an example of an additional device attached to the terminal device 7. Fig. 15 shows the terminal device 7 and an input device 200 as viewed from the front of the terminal device 7, and Fig. 16 shows them as viewed from the back. In Figs. 15 and 16, the input device 200, an additional device, is attached to the terminal device 7.

The input device 200 includes a first grip portion 200a and a second grip portion 200b. Each grip portion 200a and 200b has a rod (columnar) shape and can be held with one hand. The user can use the input device 200 (and the terminal device 7) while holding only one of the grip portions 200a and 200b, or while holding both. The input device 200 may also be configured with only one grip portion. Further, the input device 200 includes a support portion 205. In the present embodiment, the support portion 205 supports the back surface (rear face) of the terminal device 7. Specifically, the support portion 205 has four claw portions (projections), which can be locked into the locking holes 50a, 50b, 59a, and 59b, respectively. As shown in Fig. 15, when connecting the input device 200 to the terminal device 7, the four claw portions are locked into the four locking holes 50a, 50b, 59a, and 59b, fixing the terminal device 7 and the additional device together. The input device 200 can thereby be firmly fixed to the terminal device 7. In other embodiments, in addition to (or instead of) the locking of claw portions and locking holes, the input device 200 may be fixed to the terminal device 7 still more firmly, for example by screws. The screws may be placed anywhere; for example, the support portion 205 of the input device 200, which abuts the back surface of the outer cover 50, may be screwed to the eaves portion 59.

As described above, in the present embodiment, an additional device can be firmly fixed to the terminal device 7 by means of the locking holes 59a and 59b. Since the terminal device 7 has sensors for detecting its motion and tilt (the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74), the terminal device 7 itself can be moved for use. For example, when the input device 200 shown in Figs. 15 and 16 is connected to the terminal device 7, the user may hold the grip portion 200a and/or the grip portion 200b of the input device 200 and operate it by moving the input device 200 like a gun. When the terminal device 7 itself is moved for use, as in the present embodiment, firmly fixing the additional device by means of the locking holes 59a and 59b is particularly effective.

Further, in the present embodiment, the support portion 205 detachably supports the terminal device 7 so that the screen of the LCD 51 is oriented substantially vertically when the first grip portion 200a (or the second grip portion 200b) points in the vertical direction. Each grip portion 200a and 200b is formed substantially parallel to the display unit (the surface of the outer cover 50) of the terminal device 7 connected to the input device 200; in other words, each grip portion extends along the up-down direction of the display unit of the connected terminal device 7. Thus, the input device 200 is connected to the terminal device 7 in a posture in which the display unit of the terminal device 7 faces the user when the user holds the input device 200. By holding at least one of the grip portions 200a and 200b substantially vertically, the user can point the screen of the display unit toward himself or herself, and can therefore operate the input device 200 while viewing the screen of the display unit. In the present embodiment, the second grip portion 200b is oriented substantially parallel to the first grip portion 200a, but in other embodiments at least one grip portion may be oriented substantially parallel to the screen of the LCD 51. In that case too, the user can easily hold the input device 200 (and the terminal device 7) with the LCD 51 facing toward himself or herself by gripping that grip portion.

Further, in the above embodiment, the support portion 205 is provided on the connection member 206 that connects the first grip portion 200a and the second grip portion 200b. That is, since the support portion 205 is provided between the two grip portions 200a and 200b, the terminal device 7 connected to the input device 200 is disposed between the two grip portions 200a and 200b. In this case, the center of gravity of the operating device (operating system) constituted by the terminal device 7 and the input device 200 lies between the two grip portions 200a and 200b, so the user can easily hold the operating device by gripping the two grip portions 200a and 200b. In the above embodiment, one grip portion (the first grip portion 200a) is provided at a position on the front side of the screen of the terminal device 7 mounted on the input device 200, and the other grip portion (the second grip portion 200b) is provided at a position on the rear side of the screen. Therefore, the user can easily hold the operating device as if holding a gun, with one hand in front of the screen and the other hand behind it. The operating device is therefore particularly suitable for, for example, a shooting game in which the operating device is used as a gun for game operations. Further, the input device 200 includes, as operation portions, a first key 201, a second key 202, a third key 203, and a rocker 204. Each of the keys 201 to 203 is a button that can be pressed by the user, and the rocker 204 is a device capable of indicating a direction. The operation portions are preferably provided at positions where they can be operated by a finger of the hand gripping the grip portion. In the present embodiment, the first key 201, the second key 202, and the rocker 204 are provided at positions operable by the thumb of the hand holding the first grip portion 200a, and the third key 203 is provided at a position operable by the index finger of the hand holding the second grip portion 200b.

The input device 200 may be provided with an imaging device (imaging means). For example, the input device 200 may include a unit configured similarly to the imaging information computing unit 35 of the controller 5 described above. In that case, the imaging element of the imaging information computing unit may be oriented to image the front of the input device 200 (the direction behind the screen of the terminal device 7). For example, an infrared filter may be disposed at the position of the third key 203 in place of the third key 203, with the imaging element disposed inside it. In this way, by pointing the front of the input device 200 toward the television 2 (the marker device 6), the game device 3 can calculate the orientation or position of the input device 200. Therefore, the user can perform an operation of pointing the input device 200 in a desired direction, and can use the input device 200 for intuitive and easy operations. Alternatively, the input device 200 may include a camera similar to the camera 56 in place of the imaging information computing unit. In that case, the camera, like the imaging element described above, may be oriented to image the front of the input device 200. Then, by pointing the front of the input device 200 toward the television 2 (the marker device 6), the user can capture an image in the imaging direction opposite to that of the camera 56 of the terminal device 7.

Further, the input device 200 includes a connector (not shown), which is connected to the expansion connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. Thereby, data can be exchanged between the input device 200 and the terminal device 7. For example, data showing the operation performed on the input device 200 and data showing the imaging result of the imaging device may be transmitted to the terminal device 7. In that case, the terminal device 7 may wirelessly transmit to the game device 3 both the data showing the operation performed on the terminal device 7 itself and the data transmitted from the input device. Further, the input device 200 may include a charging terminal that connects to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. With this, power can be supplied from one device to the other while the terminal device 7 is mounted on the input device 200; for example, the input device 200 may be connected to a charger, and the terminal device 7 can then be charged with power obtained from the charger via the input device 200. The input device may also be configured, for example, as follows. Fig. 17 shows another example of the input device, and Fig. 18 and Fig. 19 show the input device 220 of Fig. 17 attached to the terminal device 7. Fig. 18 is a view of the terminal device 7 and the input device 220 as seen from the back side of the terminal device 7, and Fig. 19 is a view of the same as seen from the front side. The input device 220 shown in Fig. 17, for example, may be mounted on the terminal device 7. The input device 220 is described below. In Figs. 17 to 19, constituent elements corresponding to those of the input device 200 shown in Figs. 15 and 16 are denoted by the same reference numerals as in Figs. 15 and 16, and their detailed description is omitted.

As shown in Fig. 17, the input device 220 includes a first grip portion 200a and a second grip portion 200b, similarly to the input device 200. Therefore, the user can use the input device 220 (and the terminal device 7) while holding only one of the grip portions 200a and 200b, or while holding both. The input device 220 also includes a support portion 205 similar to that of the input device 200. The support portion 205 has the same shape as that of the input device 200 and has four claw portions (only three claw portions 205a to 205c are shown in Fig. 17). Of these, the upper two claw portions 205a and 205b can be locked into the locking holes 59a and 59b of the terminal device 7, respectively, and the lower two claw portions can be locked into the locking holes 50a and 50b, respectively. The claw portion not shown is provided at a position symmetrical to the claw portion 205c in the left-right direction (the left-right direction of the terminal device 7 attached to the support portion 205). As shown in Figs. 18 and 19, when the input device 220 is connected to the terminal device 7, the terminal device 7 is fixed to the input device 220 by locking the four claw portions into the locking holes 50a, 50b, 59a, and 59b. Thereby, the input device 220 can be firmly fixed to the terminal device 7. In other embodiments, in addition to (or instead of) the locking of the claw portions into the locking holes, the input device 220 may be screwed to the terminal device 7 or the like, so that the input device 220 is fixed to the terminal device 7 even more firmly. For example, screw holes may be provided inside the locking holes 50a and 50b, and the lower two claw portions may be screwed into the locking holes 50a and 50b. The screws may also be fixed at any other position. As described above, the input device 220, like the input device 200, can be firmly fixed to the terminal device 7.

Further, in the input device 220, as in the input device 200, the support portion 205 detachably supports the terminal device 7 so that the screen of the LCD 51 is oriented substantially vertically when the first grip portion 200a (or the second grip portion 200b) is oriented in the vertical direction. Each of the grip portions 200a and 200b is formed to be substantially parallel to the display portion (the surface of the outer cover 50) of the terminal device 7 connected to the input device 220. Therefore, by holding at least one of the grip portions 200a and 200b substantially vertically, the user can turn the screen of the display portion toward himself or herself, and can operate the input device 220 while viewing the screen of the display portion. In the input device 220, as in the input device 200, the support portion 205 supports the terminal device 7 above the grip portions, so the screen is easy to view for a user holding a grip portion. In other embodiments, at least one grip portion may be formed to be substantially parallel to the screen of the LCD 51. In the input device 220, the shape of the connection portion differs from that of the input device 200. The connection portion 209 shown in Fig. 17 is connected to the first grip portion 200a at two places, its upper side and its lower side, and is connected to the upper side (upper end) of the second grip portion 200b. As in the input device 200, the connection portion 209 is formed to protrude forward from the second grip portion 200b. Also as in the input device 200, the support portion 205 is provided on the connection portion 209 that connects the first grip portion 200a and the second grip portion 200b, so the user can easily hold the operating device by gripping the two grip portions 200a and 200b. Further, the connection portion 209 has a member extending downward from the portion connected to the support portion 205. When the screen of the LCD 51 of the terminal device 7 connected to the support portion 205 is oriented substantially vertically, this member extends in a substantially vertical direction; that is, the member is oriented substantially parallel to the grip portions 200a and 200b. Therefore, the user may also hold this member as a grip portion; by holding the member substantially vertically, the user can operate the input device 220 while viewing the screen of the LCD 51. Further, since the member is disposed below the support portion 205, holding the member places the screen in a position that is easy for the user to view.

In the input device 220, as in the input device 200, one grip portion (the first grip portion 200a) is provided at a position on the front side of the screen of the terminal device 7 mounted on the input device 220, and the other grip portion (the second grip portion 200b) is provided at a position on the rear side of the screen. Therefore, as with the input device 200, the grip portions can easily be held as if holding a gun, and the input device 220 is particularly suitable for a shooting game in which the operating device is used as a gun for game operations. Further, the input device 220 includes, as operation portions, a second key 202 and a rocker 204, which are the same as those of the input device 200, and additionally includes a fourth key 207. The second key 202 and the rocker 204 are provided on the upper side of the first grip portion 200a, as in the input device 200. The fourth key 207 is a key (button) that can be pressed by the user, and is provided on the upper side of the second grip portion 200b, that is, at a position operable by the index finger of the hand holding the second grip portion 200b. The input device 220 is provided with an imaging element (imaging device). Here, the input device 220 includes a unit configured similarly to the imaging information computing unit 35 of the controller 5. The imaging element of this imaging information computing unit is oriented to image the front of the input device 220 (the direction behind the screen of the terminal device 7). Specifically, a window portion (infrared filter) 208 is provided at the front of the input device 220 (the front end portion of the connection portion 209), and the imaging element is provided inside it, oriented so as to image the front through the window portion 208. As described above, by pointing the front of the input device 220 toward the television 2 (the marker device 6), the game device 3 can calculate the orientation or position of the input device 220. Therefore, the user can perform an operation of pointing the input device 220 in a desired direction, and can use the input device 220 for intuitive and easy operations.

Further, the input device 220 may be configured to include a camera similar to the camera 56 in place of the imaging information computing unit. In that case, by pointing the front of the input device 220 toward the television 2, the user can capture an image in the imaging direction opposite to that of the camera 56 of the terminal device 7. Like the input device 200, the input device 220 includes a connector (not shown), which is connected to the expansion connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 220. Thereby, data can be exchanged between the input device 220 and the terminal device 7, and data showing the operation performed on the input device 220 and data showing the imaging result of the imaging device can therefore be transmitted to the game device 3 via the terminal device 7. In other embodiments, the input device 220 may be configured to communicate directly with the game device 3. That is, data showing the operation performed on the input device 220 may be transmitted directly from the input device 220 to the game device 3 using, for example, Bluetooth (registered trademark) technology, similarly to the wireless communication between the controller 5 and the game device 3. In that case, data showing the operation performed on the terminal device 7 is transmitted from the terminal device 7 to the game device 3. Further, like the input device 200, the input device 220 may include a charging terminal that connects to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 220.
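The two transmission paths described above can be summarized in a short sketch. The following is only an illustration, not part of the specification: the helper objects and their method names are hypothetical stand-ins for the expansion connector and the wireless links.

```python
# Hypothetical routing of input device operation data, per the two
# configurations described above. "direct_link" stands in for a
# Bluetooth-like connection; none of these names come from the patent.
def send_operation_data(input_device_data, terminal, game_device, direct_link=None):
    if direct_link is not None:
        # Alternative embodiment: the input device communicates with the
        # game device directly; the terminal sends its own data separately.
        direct_link.send(input_device_data)
    else:
        # Default embodiment: data goes over the expansion connector to the
        # terminal device, which forwards it wirelessly together with its
        # own operation data.
        terminal.wireless_send({
            "terminal_operation": terminal.collect_operation_data(),
            "input_device_operation": input_device_data,
        })
```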

Further, in other embodiments, an operating device in which the terminal device 7 and the input device 200 (or the input device 220) are integrated may be provided. In that case, it is not necessary to provide the locking holes 50a, 50b, 59a, and 59b in the terminal device 7 or the claw portions in the input device 200 as a mechanism for detachably connecting the terminal device 7 and the input device 200. Fig. 20 shows another example in which an attachment device is attached to the terminal device 7. In Fig. 20, the terminal device 7 is connected to (mounted on) a cradle 210, which is an example of an attachment device. The cradle 210 is a supporting device on which the terminal device 7 can be placed (supported) at a predetermined angle. The cradle 210 includes a support member 211, a charging terminal 212, and guide members 213a and 213b. In the present embodiment, the cradle 210 also functions as a charger and thus has the charging terminal 212. The charging terminal 212 is a terminal connectable to the charging terminal 66 of the terminal device 7. In the present embodiment, each of the charging terminals 66 and 212 is a metal terminal, but one of them may be a connector connectable to the other. When the terminal device 7 is connected to the cradle 210, the charging terminal 212 of the cradle 210 contacts the charging terminal 66 of the terminal device 7, and power can be supplied from the cradle 210 to the terminal device 7 for charging. The support member 211 supports the back surface of the terminal device 7 at a predetermined angle. Specifically, the support member 211 supports a predetermined surface (here, the back surface) of the outer cover 50 when the terminal (charging terminal 66) of the terminal device 7 is connected to the terminal (charging terminal 212) of the cradle 210. As shown in Fig. 20, the support member 211 includes a wall portion 211a and a groove portion 211b. The support member 211 supports the outer cover 50 with the wall portion 211a so that the back surface of the outer cover 50 lies along a predetermined support surface (here, the surface formed by the wall portion 211a). The groove portion 211b is a portion into which a part (the lower side portion) of the outer cover 50 is inserted when the terminal device 7 is connected to the cradle 210; it is therefore formed to roughly fit the shape of that part of the outer cover 50, and extends in a direction parallel to the support surface.

Further, the guide members 213a and 213b are members that can be inserted into the second locking holes 50a and 50b of the terminal device 7, and that guide the terminal device 7 to the correct position for connection with the cradle 210. The guide members 213a and 213b are provided at positions corresponding to the locking holes 50a and 50b of the terminal device 7, that is, at positions where they are inserted into the locking holes 50a and 50b when the terminal device 7 is correctly connected to the cradle 210. When the terminal device 7 and the cradle 210 are correctly connected, the charging terminal 212 of the cradle 210 connects to the charging terminal 66 of the terminal device 7. Further, parts of the guide members 213a and 213b protrude from the bottom surface of the groove portion 211b; that is, they protrude upward from the surface of the support member 211. When the terminal device 7 is connected to the cradle 210, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b, respectively. In the present embodiment, each of the guide members 213a and 213b is a rotatable wheel member (roller portion) that can rotate in a predetermined direction. Here, the predetermined direction is a direction parallel to the support surface (and horizontal); in other words, it is the left-right direction of the terminal device 7 when the terminal device 7 is connected to the cradle 210. The guide member may be any rotating member rotatable in the predetermined direction; for example, in other embodiments, the guide member may be a sphere rotatably supported in a spherical recess. Further, in the present embodiment the number of guide members is two, but guide members may be provided in a number corresponding to the number of locking holes provided in the lower surface of the terminal device 7, and the cradle 210 may have one guide member, or three or more.

When the terminal device 7 is connected to the cradle 210, the terminal device 7 can be placed on the cradle 210 at a predetermined angle by bringing the back surface of the terminal device 7 into contact with the support member 211. That is, a part of the lower side of the outer cover 50 is inserted into the groove portion 211b and the wall portion 211a supports the back surface of the outer cover 50, whereby the terminal device 7 can be placed on the cradle 210 at a predetermined angle. Therefore, in the present embodiment, the support member 211 positions the terminal device 7 correctly in the direction perpendicular to the predetermined direction. Here, when the terminal device 7 is connected to the cradle 210 but the two are not in the correct positional relationship, the position of the terminal device 7 is corrected by the guide members 213a and 213b. That is, when the locking holes 50a and 50b deviate from the guide members 213a and 213b in the predetermined direction, the guide members 213a and 213b come into contact with the outer cover 50 around the locking holes 50a and 50b, and the terminal device 7 then slides in the predetermined direction through the rotation of the guide members 213a and 213b. In the present embodiment, since the two guide members 213a and 213b are arranged side by side in the predetermined direction, the lower surface of the terminal device 7 can be brought into contact with only the guide members 213a and 213b, allowing the terminal device 7 to move smoothly. Further, if an inclination (a recessed taper) is provided around the locking holes 50a and 50b, the terminal device 7 can move still more smoothly. As a result of this sliding movement of the terminal device 7, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b. Thereby, the charging terminal 212 of the cradle 210 contacts the charging terminal 66 of the terminal device 7, and charging is performed reliably.

As described above, the user can easily connect the terminal device 7 to the cradle 210 even without placing the terminal device 7 at the correct position from the start. According to the present embodiment, the terminal device 7 can be positioned with respect to the cradle 210 by the simple configuration of the locking holes of the terminal device 7 and the guide members of the cradle 210, so the cradle 210 can be made small and simple. In the present embodiment, the terminal device 7 is a comparatively large portable device, but even for such a large portable device, the cradle 210 itself can be made small, as shown in Fig. 20. In addition, since terminal devices of various shapes and sizes can be connected to the cradle 210, a highly versatile supporting device can be provided. Further, in the present embodiment, the locking holes 50a and 50b serve both as holes into which the claw portions of an attachment device are locked and as holes into which the guide members are inserted; the number of holes provided in the outer cover 50 of the terminal device 7 can therefore be reduced, and the shape of the outer cover 50 can be simplified. In the above embodiment, the holes into which the guide members of the cradle 210 are inserted are holes in the lower side surface of the outer cover 50 (the locking holes 50a and 50b), but the holes may be at any position: for example, they may be provided on another side surface of the outer cover 50, or on the front or back surface of the outer cover 50. Since the guide members must be disposed at positions corresponding to the holes, when the holes are provided on the front or back surface of the outer cover 50, the guide members of the cradle 210 may be disposed, for example, at the position of the wall portion 211a. Further, holes may be provided in a plurality of surfaces of the outer cover 50; in that case, the terminal device 7 can be placed on the cradle 210 in various orientations.

[5. Game Processing] Next, the game processing executed in this game system will be described in detail. First, the various data used in the game processing will be described. Fig. 21 shows the main data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game device 3. As shown in Fig. 21, the main memory of the game device 3 stores a game program 90, received data 91, and processing data 106. In addition to the data shown in Fig. 21, data necessary for the game, such as image data of the various objects appearing in the game and sound data used in the game, are also stored in the main memory. All or part of the game program 90 is read from the optical disc 4 at an appropriate timing after the power of the game device 3 is turned on, and stored in the main memory. The game program 90 may also be obtained from the flash memory 17 or from a device external to the game device 3 (for example, via the Internet) instead of from the optical disc 4. Further, a part of the game program 90 (for example, a program for calculating the posture of the controller 5 and/or the terminal device 7) may be stored in the game device 3 in advance. The received data 91 are the various data received from the controller 5 and the terminal device 7. The received data 91 includes controller operation data 92, terminal operation data 97, camera image data 104, and microphone sound data 105. When a plurality of controllers 5 are connected, there are a plurality of pieces of controller operation data 92; when a plurality of terminal devices 7 are connected, there are a plurality of pieces each of terminal operation data 97, camera image data 104, and microphone sound data 105.

The controller operation data 92 is data showing the operations performed by the user (player) on the controller 5. The controller operation data 92 is transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 includes first operation key data 93, first acceleration data 94, first angular velocity data 95, and marker coordinate data 96. A predetermined number of pieces of controller operation data may be stored in the main memory in order from the latest (most recently acquired). The first operation key data 93 is data showing the input states of the operation keys 32a to 32i provided on the controller 5; specifically, it shows whether each of the operation keys 32a to 32i is being pressed. The first acceleration data 94 is data showing the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 shows three-dimensional acceleration whose components are the accelerations along the three axes X, Y, and Z shown in Fig. 3, but in other embodiments it may show acceleration along any one or more directions. The first angular velocity data 95 is data showing the angular velocity detected by the gyro sensor 48 of the controller 5. Here, the first angular velocity data 95 shows the angular velocities about the three axes X, Y, and Z shown in Fig. 3, but in other embodiments it may show the angular velocity about any one or more axes. The marker coordinate data 96 is data showing the coordinates calculated by the image processing circuit 41 of the imaging information computing unit 35, that is, the marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system representing positions on a plane corresponding to the captured image, and the marker coordinate data 96 shows coordinate values in that coordinate system.
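As an illustration only (the specification does not prescribe any data layout), the controller operation data 92 and its fields 93 to 96 could be modeled as a record like the following; the field types are assumptions based on the description above.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ControllerOperationData:                 # controller operation data 92
    operation_keys: int                        # data 93: one bit per key 32a-32i
    acceleration: Vec3                         # data 94: XYZ components
    angular_velocity: Vec3                     # data 95: about the XYZ axes
    # data 96: 2-D positions of the marker images in the captured image,
    # or None when no marker is imaged by the imaging element.
    marker_coords: Optional[Tuple[Tuple[float, float], ...]]
```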

The controller operation data 92 need only show the operations of the user operating the controller 5, and may include only some of the above data 93 to 96. When the controller 5 has other input means (for example, a touch panel or an analog stick), the controller operation data 92 may also include data showing operations performed on those other input means. When the movement of the controller 5 itself is used as a game operation, as in the present embodiment, the controller operation data 92 preferably includes data whose values change in accordance with the movement of the controller 5 itself, such as the first acceleration data 94, the first angular velocity data 95, or the marker coordinate data 96. The terminal operation data 97 is data showing the operations performed by the user on the terminal device 7. The terminal operation data 97 is transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 includes second operation key data 98, joystick data 99, touch position data 100, second acceleration data 101, second angular velocity data 102, and orientation data 103. A predetermined number of pieces of terminal operation data may be stored in the main memory in order from the latest (most recently acquired). The second operation key data 98 is data showing the input states of the operation keys 54A to 54L provided on the terminal device 7; specifically, it shows whether each of the operation keys 54A to 54L is being pressed. The joystick data 99 is data showing the direction and amount by which the rocker portion of each analog rocker 53 (the analog rocker 53A and the analog rocker 53B) is slid (or tilted). The direction and amount may be expressed, for example, as two-dimensional coordinates or a two-dimensional vector.

The touch position data 100 is data showing the position (touch position) at which an input was made on the input surface of the touch panel 52. In the present embodiment, the touch position data 100 shows coordinate values in a two-dimensional coordinate system representing positions on the input surface. When the touch panel 52 is of a multi-touch type, the touch position data 100 may show a plurality of touch positions.

The second acceleration data 101 is data showing the acceleration (acceleration vector) detected by the acceleration sensor 73. In the present embodiment, the second acceleration data 101 shows three-dimensional acceleration whose components are the accelerations along the three axes x, y, and z shown in Fig. 8, but in other embodiments it may show acceleration along any one or more directions.

The second angular velocity data 102 is data showing the angular velocity detected by the gyro sensor 74. In the present embodiment, the second angular velocity data 102 shows the angular velocities about the three axes x, y, and z shown in Fig. 8, but in other embodiments it may show the angular velocity about any one or more axes.

The orientation data 103 is data showing the orientation detected by the magnetic sensor 72. In the present embodiment, the orientation data 103 shows the direction of a predetermined orientation (for example, north) with respect to the terminal device 7. In a place where a magnetic field other than geomagnetism is present, the orientation data 103 strictly speaking no longer indicates the absolute orientation (north, etc.); even so, it shows the direction of the terminal device 7 relative to the direction of the magnetic field at that place, so changes in the posture of the terminal device 7 can still be calculated.

The terminal operation data 97 need only show the operations of the user operating the terminal device 7, and may include only some of the above data 98 to 103. When the terminal device 7 has other input means (for example, a touch pad, or imaging means like that of the controller 5), the terminal operation data 97 may also include data showing operations performed on those other input means. When the movement of the terminal device 7 itself is used as a game operation, as in the present embodiment, the terminal operation data 97 preferably includes data whose values change in accordance with the movement of the terminal device 7 itself, such as the second acceleration data 101, the second angular velocity data 102, or the orientation data 103. The camera image data 104 is data showing the image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 is image data obtained when the codec LSI 27 decompresses the compressed image data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a. A predetermined number of pieces of camera image data may be stored in the main memory in order from the latest (most recently acquired). The microphone sound data 105 is data showing the sound (microphone sound) detected by the microphone 79 of the terminal device 7. The microphone sound data 105 is sound data obtained when the codec LSI 27 decompresses the compressed sound data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a. The processing data 106 is data used in the game processing described later (Fig. 22). The processing data 106 includes control data 107, controller posture data 108, terminal posture data 109, image recognition data 110, and sound recognition data 111. In addition to the data shown in Fig. 21, the processing data 106 also includes various data used in the game processing, such as data showing the various parameters set for the various objects appearing in the game.
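Similarly, purely as an illustration with assumed types, the terminal operation data 97 (fields 98 to 103) could be modeled as follows.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec2 = Tuple[float, float]
Vec3 = Tuple[float, float, float]

@dataclass
class TerminalOperationData:          # terminal operation data 97
    operation_keys: int               # data 98: one bit per key 54A-54L
    rockers: Tuple[Vec2, Vec2]        # data 99: analog rockers 53A and 53B
    touch_positions: List[Vec2]       # data 100: may hold several (multi-touch)
    acceleration: Vec3                # data 101: xyz components
    angular_velocity: Vec3            # data 102: about the xyz axes
    orientation: Vec3                 # data 103: detected magnetic direction
```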

The control data 107 is data showing control instructions for the components provided in the terminal device 7. The control data 107 shows, for example, an instruction to control the lighting of the indicator portion 55 and an instruction to control imaging by the camera 56. The control data 107 is transmitted to the terminal device 7 at an appropriate timing. The controller posture data 108 is data showing the posture of the controller 5. In the present embodiment, the controller posture data 108 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92; the calculation method is described later in step S23. The terminal posture data 109 is data showing the posture of the terminal device 7. In the present embodiment, the terminal posture data 109 is calculated from the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 included in the terminal operation data 97; the calculation method is described later in step S24. The image recognition data 110 is data showing the result of predetermined image recognition processing applied to the camera image. The image recognition processing may be any processing that detects some feature from the camera image and outputs the result; for example, it may be processing that extracts a predetermined object (for example, the user's face, a marker, or the like) from the camera image and calculates information about the extracted object. The sound recognition data 111 is data showing the result of predetermined sound recognition processing applied to the microphone sound. The sound recognition processing may be any processing that detects some feature from the microphone sound and outputs the result; for example, it may be processing that detects words spoken by the user, or processing that simply outputs the sound volume.

Next, the details of the game processing executed in the game device 3 will be described with reference to Fig. 22. Fig. 22 is a main flowchart showing the flow of the game processing executed in the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 executes a boot program stored in a boot ROM (not shown), whereby units such as the main memory are initialized. The game program stored on the optical disc 4 is then read into the main memory, and the CPU 10 starts executing the game program. In the game device 3, the game program stored on the optical disc 4 may be executed immediately after the power is turned on, or a built-in program that displays a predetermined menu screen may be executed first after the power is turned on, with the game program stored on the optical disc 4 executed afterward when the user instructs the start of the game. The flowchart shown in Fig. 22 shows the processing performed after the above processing is completed. The processing of each step in the flowchart of Fig. 22 is merely an example; the order of the steps may be changed as long as the same result is obtained. The values of variables and the thresholds used in determination steps are also merely examples, and other values may be used as necessary. Further, in the present embodiment, the processing of each step of the flowchart is described as being executed by the CPU 10, but the processing of some of the steps may be executed by a processor or a dedicated circuit other than the CPU 10. First, in step S1, the CPU 10 executes initial processing. The initial processing, for example, constructs the virtual game space, places the objects appearing in the game at their initial positions, and sets the initial values of the various parameters used in the game processing.

Further, in the present embodiment, in the initial processing, the CPU 10 controls the lighting of the marker device 6 and the indicator portion 55 in accordance with the type of the game program. Here, the game system 1 has two imaging targets for the imaging means (the imaging information computing unit 35) of the controller 5: the marker device 6 and the indicator portion 55 of the terminal device 7. Either or both of the marker device 6 and the indicator portion 55 are used depending on the content of the game (the type of the game program). The game program 90 includes data indicating whether each of the marker device 6 and the indicator portion 55 should be lit. The CPU 10 reads this data and determines whether lighting is to be performed; when the marker device 6 and/or the indicator portion 55 is to be lit, the following processing is executed.

That is, when the marker device 6 is to be lit, the CPU 10 transmits to the marker device 6 a control signal for lighting the infrared LEDs provided in the marker device 6. The transmission of this control signal may simply be the supply of power. In response, the infrared LEDs of the marker device 6 are lit. On the other hand, when the indicator portion 55 is to be lit, the CPU 10 generates control data showing an instruction to light the indicator portion 55 and stores it in the main memory. The generated control data is transmitted to the terminal device 7 in step S10, described later. The control data received by the wireless module 80 of the terminal device 7 is passed to the UI controller 75 via the codec LSI 76, and the UI controller 75 instructs the indicator portion 55 to light. Thereby, the infrared LEDs of the indicator portion 55 are lit. Although the above describes the case of lighting the marker device 6 and the indicator portion 55, they can be turned off by processing similar to that for lighting.
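The following is a minimal sketch of this lighting control; the flag names and helper objects are hypothetical, not from the specification.

```python
# Hypothetical sketch of the lighting control in step S1: the game program
# carries flags saying which imaging targets it uses; the marker device 6 is
# lit by a direct control signal (possibly just power supply), while the
# indicator portion 55 is lit via control data 107 stored for transmission
# to the terminal device 7 later, in step S10.
def control_marker_lighting(game_program, marker_device, main_memory, light=True):
    if game_program.uses_marker_device:
        marker_device.send_control_signal(power_on=light)   # infrared LEDs on/off
    if game_program.uses_indicator_portion:
        main_memory.control_data.append(
            {"target": "indicator_portion_55", "light": light}
        )
```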

After the above step S1, the processing of step S2 is executed. Thereafter, a processing loop consisting of the series of processes of steps S2 to S11 is executed repeatedly, once every predetermined time (one frame time).
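As a minimal sketch of this per-frame loop, assuming placeholder callables for the steps described in this section (the step numbering follows the flowchart; the function signature and the 60 fps pacing are assumptions):

```python
import time

def processing_loop(acquire_controller_data,   # step S2
                    acquire_terminal_data,     # step S3
                    game_control_processing,   # step S4 (Fig. 23)
                    generate_tv_game_image,    # step S5
                    frame_time=1.0 / 60.0):    # "one frame time" (60 fps assumed)
    """Run the steps once per frame; the later steps S6-S11 are omitted here."""
    while True:
        start = time.monotonic()
        controller_data = acquire_controller_data()
        terminal_data = acquire_terminal_data()
        game_control_processing(controller_data, terminal_data)
        generate_tv_game_image()
        # Sleep away whatever remains of the frame.
        time.sleep(max(0.0, frame_time - (time.monotonic() - start)))
```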

In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5. Since the controller 5 repeatedly transmits the controller operation data to the game device 3, in the game device 3 the controller communication module 19 successively receives this data, and the input/output processor 11a successively stores the received controller operation data in the main memory. The transmission and reception interval is preferably shorter than the game processing time, for example 1/200 of a second. In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory. The process of step S3 is executed after step S2. In step S3, the CPU 10 acquires the various data transmitted from the terminal device 7. Since the terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, the game device 3 receives these data one after another: the terminal communication module 28 successively receives them, the codec LSI 27 successively applies decompression processing to the camera image data and the microphone sound data, and the input/output processor 11a successively stores the terminal operation data, the camera image data, and the microphone sound data in the main memory. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory. The process of step S4 is executed after step S3. In step S4, the CPU 10 executes the game control processing. The game control processing advances the game by, for example, moving objects in the game space in accordance with the user's game operations. In the present embodiment, the user can play various games using the controller 5 and/or the terminal device 7. The game control processing is described below with reference to Fig. 23.

Fig. 23 is a flowchart showing the detailed flow of the game control processing. The series of processes shown in Fig. 23 are processes that can be executed when the controller 5 and the terminal device 7 are used as operating devices; however, not all of these processes need be executed, and only some of them may be executed depending on the type and content of the game. In the game control processing, first, in step S21, the CPU 10 determines whether to change the marker to be used. As described above, in the present embodiment, processing to control the lighting of the marker device 6 and the indicator portion 55 is performed at the start of the game processing (step S1). Depending on the game, however, the one to be used (lit) among the marker device 6 and the indicator portion 55 may change partway through the game. Also, while some games may use both the marker device 6 and the indicator portion 55, lighting both at once raises the risk that one marker is erroneously detected as the other; it can therefore be preferable to switch the lighting during the game so that only one of them is lit at a time. In view of such cases, the process of step S21 determines whether the object to be lit needs to be changed partway through the game. The determination of step S21 can be made, for example, by the following methods. The CPU 10 may make the determination according to whether the game situation (the stage of the game, the operation target, etc.) has changed; this is because, when the game situation changes, it is conceivable to switch between an operation method in which the controller 5 is pointed toward the marker device 6 and an operation method in which it is pointed toward the indicator portion 55. The CPU 10 may also make the determination according to the posture of the controller 5, that is, by determining whether the controller 5 is facing the marker device 6 or the indicator portion 55; the posture of the controller 5 can be calculated, for example, from the detection results of the acceleration sensor 37 and the gyro sensor 48 (see step S23 described later). Further, the CPU 10 may make the determination according to whether there is a change instruction from the user.
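A compact sketch of this determination, with all inputs assumed (the specification names the three criteria but not how they are combined):

```python
# Hypothetical combination of the three criteria named above for step S21.
def should_change_lit_marker(game_situation_changed: bool,
                             controller_faces_indicator: bool,
                             indicator_currently_lit: bool,
                             user_requested_change: bool) -> bool:
    if game_situation_changed or user_requested_change:
        return True
    # Posture criterion: switch when the controller faces the marker
    # that is not the one currently lit.
    return controller_faces_indicator != indicator_currently_lit
```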

When the result of the determination in step S21 is affirmative, the process of step S22 is executed; when it is negative, the process of step S22 is skipped and the process of step S23 is executed. In step S22, the CPU 10 controls the lighting of the marker device 6 and the indicator portion 55; that is, the lighting state of the marker device 6 and/or the indicator portion 55 is changed. The specific processing for lighting or extinguishing the marker device 6 and/or the indicator portion 55 can be performed in the same manner as in step S1 described above. The process of step S23 is executed after step S22. As described above, according to the present embodiment, the light emission (lighting) of the marker device 6 and the indicator portion 55 can be controlled in accordance with the type of the game program by the process of step S1, and in accordance with the game situation by the processes of steps S21 and S22. In step S23, the CPU 10 calculates the posture of the controller 5. In the present embodiment, the posture of the controller 5 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. The method of calculating the posture of the controller 5 is described below.

First, the CPU 10 calculates the posture of the controller 5 from the first angular velocity data 95 stored in the main memory. The posture may be calculated from the angular velocity by any method that uses the previous posture (the posture calculated last time) and the current angular velocity (the angular velocity acquired in step S2 of the current processing loop). Specifically, the CPU 10 calculates the posture by rotating the previous posture at the current angular velocity for a unit time. The previous posture is shown by the controller posture data 108 stored in the main memory, and the current angular velocity is shown by the first angular velocity data 95 stored in the main memory; the CPU 10 therefore reads the controller posture data 108 and the first angular velocity data 95 from the main memory and calculates the posture of the controller 5. Data showing the "angular-velocity-based posture" calculated in this way is stored in the main memory. When the posture is calculated from the angular velocity, an initial posture should be determined in advance. That is, when calculating the posture of the controller 5 from the angular velocity, the CPU 10 first calculates the initial posture of the controller 5. The initial posture of the controller 5 may be calculated from the acceleration data, or the player may be made to perform a predetermined operation with the controller 5 held in a specific posture, the specific posture at the time the predetermined operation is performed being used as the initial posture. The initial posture should be calculated when the posture of the controller 5 is to be calculated as an absolute posture with respect to a predetermined direction in space; the initial posture need not be calculated when the posture of the controller 5 is to be calculated as a relative posture with, for example, the posture of the controller 5 at the start of the game as a reference. Next, the CPU 10 corrects the posture of the controller 5 calculated from the angular velocity, using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the posture of the controller 5 from the first acceleration data 94. Here, in a state where the controller 5 is almost stationary, the acceleration applied to the controller 5 is the gravitational acceleration. In this state, therefore, the direction of the gravitational acceleration (the gravity direction) can be calculated using the first acceleration data 94 output by the acceleration sensor 37, and the orientation (posture) of the controller 5 with respect to the gravity direction can thus be calculated from the first acceleration data 94. Data showing the "acceleration-based posture" calculated in this way is stored in the main memory.
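The two calculations just described can be sketched numerically as follows. This is only an illustration using rotation matrices (the specification does not fix a representation); the Rodrigues update and the normalization are assumptions.

```python
import numpy as np

def posture_from_angular_velocity(prev_posture: np.ndarray,
                                  angular_velocity: np.ndarray,
                                  dt: float) -> np.ndarray:
    """Rotate the previous 3x3 posture matrix by angular_velocity (rad/s)
    applied over one unit time dt (the gyro-integration step above)."""
    speed = np.linalg.norm(angular_velocity)
    if speed < 1e-9:
        return prev_posture
    axis = angular_velocity / speed
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    theta = speed * dt
    # Rodrigues' rotation formula for the incremental rotation.
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    return R @ prev_posture

def gravity_direction(acceleration: np.ndarray) -> np.ndarray:
    """When the controller is almost stationary, the measured acceleration
    is the gravitational acceleration; its direction gives the
    acceleration-based posture (up to rotation about gravity)."""
    return acceleration / np.linalg.norm(acceleration)
```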

After calculating the acceleration-based posture, the CPU 10 corrects the angular-velocity-based posture using the acceleration-based posture. Specifically, the CPU 10 reads from the main memory the data showing the angular-velocity-based posture and the data showing the acceleration-based posture, and performs a correction that brings the angular-velocity-based posture closer to the acceleration-based posture at a predetermined ratio. The predetermined ratio may be a fixed value determined in advance, or may be set in accordance with, for example, the acceleration shown by the first acceleration data 94. Further, since the acceleration-based posture cannot be calculated for the direction of rotation about the gravity direction as an axis, the CPU 10 does not correct that rotation direction. In the present embodiment, data showing the corrected posture obtained in this way is stored in the main memory. After correcting the angular-velocity-based posture as described above, the CPU 10 further corrects the corrected posture using the marker coordinate data 96. First, the CPU 10 calculates the posture of the controller 5 from the marker coordinate data 96 (the marker-coordinate-based posture). Since the marker coordinate data 96 shows the positions of the marker images 6R and 6L in the captured image, the posture of the controller 5 in the roll direction (the direction of rotation about the Z axis) can be calculated from these positions: that is, the posture of the controller 5 in the roll direction can be calculated from the slope of the straight line connecting the position of the marker 6R and the position of the marker 6L in the captured image. Further, when the position of the controller 5 with respect to the marker device 6 can be assumed (for example, when it can be assumed that the controller 5 is located in front of the marker device 6), the posture of the controller 5 in the pitch direction and the yaw direction can be calculated from the position of the marker device 6 in the captured image. For example, when the positions of the markers 6R and 6L in the captured image move to the left, it can be determined that the controller 5 has changed its orientation (posture) to the right. In this way, the posture of the controller 5 in the pitch and yaw directions can be calculated from the positions of the markers 6R and 6L. As described above, the posture of the controller 5 can be calculated from the marker coordinate data 96. After calculating the marker-coordinate-based posture, the CPU 10 corrects the corrected posture (the posture already corrected using the acceleration-based posture) using the marker-coordinate-based posture; that is, the CPU 10 performs a correction that brings the corrected posture closer to the marker-coordinate-based posture at a predetermined ratio. The predetermined ratio may be a fixed value determined in advance. The correction by the marker-coordinate-based posture may also be performed in only one or two of the roll, pitch, and yaw directions. For example, since the marker coordinate data 96 allows the posture to be calculated accurately in the roll direction, the CPU 10 may correct only the roll direction using the marker-coordinate-based posture. Further, when the marker device 6 or the indicator portion 55 is not imaged by the imaging element 40 of the controller 5, the marker-coordinate-based posture cannot be calculated; in that case, the correction processing using the marker coordinate data 96 may be omitted.
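A sketch of these corrections follows, under assumptions: postures are 3x3 rotation matrices, the blend-and-reproject step stands in for "bringing one posture closer to another at a predetermined ratio", and the roll extraction follows the slope construction described above.

```python
import numpy as np

def bring_closer(posture: np.ndarray, target: np.ndarray, ratio: float) -> np.ndarray:
    """Move `posture` toward `target` by `ratio` (0..1), then project the
    blend back onto the nearest rotation matrix via SVD (valid for the
    small corrections assumed here)."""
    blended = (1.0 - ratio) * posture + ratio * target
    u, _, vt = np.linalg.svd(blended)
    return u @ vt

def roll_from_marker_coords(marker_6l, marker_6r) -> float:
    """Roll angle (rotation about the pointing axis) from the slope of the
    straight line connecting the images of markers 6L and 6R."""
    dx = marker_6r[0] - marker_6l[0]
    dy = marker_6r[1] - marker_6l[1]
    return float(np.arctan2(dy, dx))
```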

As described above, the CPU 10 corrects the posture of the controller 5 calculated from the first angular velocity data 95, using the first acceleration data 94 and the marker coordinate data 96. Among these methods of calculating the posture of the controller 5, the method using the angular velocity can calculate the posture no matter how the controller 5 is moving. On the other hand, since that method calculates the posture by cumulatively adding the successively detected angular velocities, its accuracy may deteriorate through accumulation of errors and the like, or through the so-called temperature drift of the gyro sensor. The method using the acceleration does not accumulate errors, but it cannot calculate the posture accurately while the controller 5 is being moved vigorously (because the gravity direction cannot then be detected accurately). The method using the marker coordinates can calculate the posture accurately (particularly in the roll direction), but cannot calculate it when the markers cannot be imaged. In contrast, according to the present embodiment, the three methods with these different characteristics are used together, so the posture of the controller 5 can be calculated more accurately. In other embodiments, the posture may be calculated using any one or two of the three methods. When the lighting control of the markers is performed in the processing of step S1 or step S22 above, the CPU 10 preferably calculates the posture of the controller 5 using at least the marker coordinates.

The process of step S24 is executed after step S23. In step S24, the CPU 10 calculates the posture of the terminal device 7. That is, since the terminal operation data 97 acquired from the terminal device 7 includes the second acceleration data 101, the second angular velocity data 102, and the orientation data 103, the CPU 10 can calculate the posture of the terminal device 7 from these data. Here, the CPU 10 can know the amount of rotation per unit time (the amount of change in posture) of the terminal device 7 from the second angular velocity data 102. Further, in a state where the terminal device 7 is almost stationary, the acceleration applied to the terminal device 7 is the gravitational acceleration, so the gravity direction applied to the terminal device 7 (that is, the posture of the terminal device 7 with the gravity direction as a reference) can be known from the second acceleration data 101. Further, the predetermined orientation with respect to the terminal device 7 (that is, the posture of the terminal device 7 with the predetermined orientation as a reference) can be known from the orientation data 103. Even when a magnetic field other than geomagnetism is present, the amount of rotation of the terminal device 7 can still be known. Accordingly, the CPU 10 can calculate the posture of the terminal device 7 from the second acceleration data 101, the second angular velocity data 102, and the orientation data 103. In the present embodiment, the posture of the terminal device 7 is calculated from these three kinds of data, but in other embodiments the posture may be calculated from any one or two of them.

The specific method of calculating the posture of the terminal device 7 may be any method; for example, the posture calculated from the angular velocity shown by the second angular velocity data 102 may be corrected using the second acceleration data 101 and the orientation data 103. Specifically, the CPU 10 first calculates the posture of the terminal device 7 from the second angular velocity data 102; the method of calculating the posture from the angular velocity can be the same as in step S23 above. Next, at an appropriate timing (for example, when the terminal device 7 is close to a stationary state), the CPU 10 corrects the angular-velocity-based posture using the posture calculated from the second acceleration data 101 and/or the posture calculated from the orientation data 103. The method of correcting the angular-velocity-based posture using the acceleration-based posture can be the same as in the calculation of the posture of the controller 5 described above. When correcting the angular-velocity-based posture using the orientation-data-based posture, the CPU 10 may bring the angular-velocity-based posture closer to the orientation-data-based posture at a predetermined ratio. By the above, the CPU 10 can calculate the posture of the terminal device 7 accurately. Since the controller 5 includes the imaging information computing unit 35 as infrared detection means, the game device 3 can acquire the marker coordinate data 96; with respect to the controller 5, the game device 3 can therefore know the absolute posture in real space (the posture of the controller 5 in a coordinate system set in real space) from the marker coordinate data 96. The terminal device 7, on the other hand, does not include infrared detection means like the imaging information computing unit 35, and from the second acceleration data 101 and the second angular velocity data 102 alone the game device 3 cannot know the absolute posture in real space for the direction of rotation about the gravity direction as an axis. For this reason, in the present embodiment, the terminal device 7 includes the magnetic sensor 72, and the game device 3 acquires the orientation data 103. With this, the game device 3 can calculate from the orientation data 103 the absolute posture in real space even for the direction of rotation about the gravity direction, and can calculate the posture of the terminal device 7 more accurately.
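As a sketch of the magnetic correction that distinguishes the terminal device 7 from the controller 5, assuming a tilt-compensated magnetic reading and an arbitrary placeholder ratio:

```python
import numpy as np

def heading_about_gravity(horizontal_magnetic: np.ndarray) -> float:
    """Heading angle about the vertical (gravity) axis from the horizontal
    components of the magnetic reading; this is the rotation component that
    the accelerometer and gyro alone cannot pin down absolutely."""
    return float(np.arctan2(horizontal_magnetic[1], horizontal_magnetic[0]))

def correct_heading(gyro_heading: float,
                    magnetic_heading: float,
                    ratio: float = 0.02) -> float:
    """Bring the gyro-integrated heading closer to the magnetic heading at a
    predetermined ratio; the error is wrapped to [-pi, pi] before blending."""
    error = np.arctan2(np.sin(magnetic_heading - gyro_heading),
                       np.cos(magnetic_heading - gyro_heading))
    return gyro_heading + ratio * float(error)
```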

In the specific processing of the above step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 from the main memory, and calculates the posture of the terminal device 7 based on these data. The data showing the calculated posture of the terminal device 7 is stored in the main memory as the terminal posture data 109. The process of step S25 is performed after step S24. In step S25, the CPU 10 performs recognition processing of the camera image. That is, the CPU 10 performs predetermined recognition processing on the camera image data 104. The recognition processing can be of any kind as long as some feature is detected from the camera image and its result is output. For example, it may be processing that recognizes the face of the player included in the camera image; specifically, processing that detects a part of the face (eyes, nose, mouth, etc.) or processing that detects the expression of the face. The data showing the result of the recognition processing is stored as the image recognition data 110 in the main memory. The process of step S26 is performed after step S25. In step S26, the CPU 10 performs recognition processing of the microphone sound. That is, the CPU 10 performs predetermined recognition processing on the microphone sound data 105. The recognition processing can be of any kind as long as some feature is detected from the microphone sound and its result is output. For example, it may be processing that detects an instruction of the player from the microphone sound, or processing that simply detects the volume of the microphone sound. The data showing the result of the recognition processing is stored as the sound recognition data 111 in the main memory. The process of step S27 is performed after step S26.
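As one concrete example of the simplest microphone feature named above, volume detection can be implemented as a root-mean-square level over one frame of samples. This sketch assumes 16-bit PCM input; the patent does not specify the sample format.

```cpp
#include <cstdint>
#include <cstddef>
#include <cmath>

// Root-mean-square level of one frame of 16-bit PCM samples, in [0, 1].
// A value above some threshold could be treated as "the player spoke".
float microphoneVolume(const int16_t* samples, std::size_t count) {
    if (count == 0) return 0.0f;
    double sumSq = 0.0;
    for (std::size_t i = 0; i < count; ++i) {
        double s = samples[i] / 32768.0;   // normalize to [-1, 1)
        sumSq += s * s;
    }
    return static_cast<float>(std::sqrt(sumSq / static_cast<double>(count)));
}
```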

In step S27, the CPU 10 executes game processing in response to the game input. Here, the game input may be any data transmitted from the controller 5 or the terminal device 7, or data obtained from such data. Specifically, the game input may be, in addition to the data included in the controller operation data 92 and the terminal operation data 97, the data obtained from those data (the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the sound recognition data 111). Further, the content of the game processing in step S27 may be any content, for example, processing for moving an object (character) appearing in the game, processing for controlling the virtual camera, or processing for moving a cursor displayed on the screen. Processing that uses the camera image (or a part thereof) as a game image, or processing that uses the microphone sound as a game sound, is also possible. Examples of the above game processing will be described later. In step S27, data showing the result of the game control processing, such as data of various parameters set for the object (character) appearing in the game, data of parameters related to the virtual camera arranged in the game space, and data of scores, is stored in the main memory. After step S27, the CPU 10 ends the game control processing of step S4. Returning to the description of Fig. 22, in step S5, the television game image for display on the television 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read the data showing the result of the game control processing of step S4 from the main memory, read the data necessary for generating the game image from the VRAM 11d, and generate the game image. The game image may be generated by any method as long as it shows the result of the game control processing of step S4. For example, the method of generating the game image may be a method of generating a 3-dimensional CG image by arranging a virtual camera in the virtual game space and calculating the game space viewed from the virtual camera, or a method of generating a 2-dimensional image (without using a virtual camera). The generated television game image is stored in the VRAM 11d. The processing of step S6 is performed after the above step S5.

In step S6, the terminal game image for display on the terminal device 7 is generated by the CPU 10 and the GPU 11b. Like the television game image described above, the terminal game image may be generated by any method as long as it shows the result of the game control processing of step S4. Further, the terminal game image can be generated by the same method as the television game image, or by a different method. The generated terminal game image is stored in the VRAM 11d. Depending on the content of the game, the television game image and the terminal game image may be the same; in this case, the process of generating a game image need not be executed in step S6. The processing of step S7 is performed after the above step S6. In step S7, the television game sound for output to the speaker 2a of the television 2 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound in response to the result of the game control processing of step S4. The generated game sound is, for example, an effect sound of the game, the voice of a character appearing in the game, or BGM. The processing of step S8 is performed after the above step S7. In step S8, the terminal game sound for output to the speaker 77 of the terminal device 7 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound in response to the result of the game control processing of step S4. The terminal game sound can be the same as or different from the television game sound. The two sounds may also differ only in part; for example, the effect sounds may differ while the BGM is the same. When the television game sound is the same as the terminal game sound, the process of generating a game sound need not be executed in step S8. The processing of step S9 is performed after the above step S8.

In step S9, the CPU 10 outputs the game image and the game sound to the television 2. Specifically, the CPU 10 transmits the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. In response to this, the AV-IC 15 outputs the image and sound data to the television 2 via the AV connector 16. Thereby, the television game image is displayed on the television 2, and the television game sound is output from the speaker 2a. The process of step S10 is performed after step S9. In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the CPU 10 transmits the image data of the terminal game image stored in the VRAM 11d and the sound data generated by the DSP 11c in step S8 to the codec LSI 27, and the codec LSI 27 performs predetermined compression processing on them. The image and sound data subjected to the compression processing are transmitted to the terminal device 7 by the terminal communication module 28 via the antenna 29. The terminal device 7 receives the image and sound data transmitted from the game device 3 with the wireless module 80, and the codec LSI 76 performs predetermined decompression processing on them. The decompressed image data is output to the LCD 51, and the decompressed sound data is output to the sound IC 78. Thereby, the terminal game image is displayed on the LCD 51, and the terminal game sound is output from the speaker 77. The process of step S11 is performed after step S10.

In step S11, the CPU 10 determines whether or not to end the game. The determination of step S11 is made, for example, based on whether the game is over or the user has given an instruction to suspend the game. When the determination of step S11 is negative, the processing of step S2 is executed again. On the other hand, when the determination of step S11 is affirmative, the CPU 10 ends the game processing shown in Fig. 22. In this way, the sequence of steps S2 to S11 is repeatedly executed until it is determined in step S11 that the game is to be ended. As described above, in the present embodiment, the terminal device 7 includes the touch panel 52 and inertial sensors (the acceleration sensor 73 and the gyro sensor 74), and the outputs of the touch panel 52 and the inertial sensors are transmitted to the game device 3 as operation data and used as inputs to the game (steps S3, S4). Further, the terminal device 7 is provided with a display device (LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6, S10). Therefore, the user can directly perform a touch operation on the game image using the touch panel 52, and, since the movement of the terminal device 7 is detected by the inertial sensors, can also perform an operation of moving the LCD 51 itself on which the game image is displayed. Through these operations, the user can play with a feel of directly operating the game image, so that, for example, a game with a novel operation feel such as the first and second game examples described later can be provided. Further, in the present embodiment, the terminal device 7 includes the analog rocker 53 and the operation keys 54 that can be operated while the terminal device 7 is held, and the game device 3 can use operations on the analog rocker 53 and the operation keys 54 as game inputs (steps S3, S4). Therefore, even when the game image is directly operated as described above, the user can perform detailed game operations by key operations or rocker operations.
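The per-frame order of steps S2 to S11 described above can be summarized in code. In the following sketch every function is an empty placeholder standing in for the operation the text names; none of these identifiers come from the patent, and the end condition is a stand-in.

```cpp
// Placeholder stubs for the operations named in the text.
void acquireControllerData() {}        // S2
void acquireTerminalData() {}          // S3
void runGameControl() {}               // S4 (includes steps S21 to S27)
void generateTelevisionImage() {}      // S5
void generateTerminalImage() {}        // S6 (skippable when same as S5)
void generateTelevisionSound() {}      // S7
void generateTerminalSound() {}        // S8 (skippable when same as S7)
void outputToTelevision() {}           // S9
void compressAndSendToTerminal() {}    // S10

bool gameShouldEnd() {                 // S11: game over or user instruction
    static int frame = 0;
    return ++frame > 600;              // stand-in condition for this sketch
}

int main() {
    while (!gameShouldEnd()) {         // steps S2 to S11 repeat until the end
        acquireControllerData();
        acquireTerminalData();
        runGameControl();
        generateTelevisionImage();
        generateTerminalImage();
        generateTelevisionSound();
        generateTerminalSound();
        outputToTelevision();
        compressAndSendToTerminal();
    }
}
```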

Furthermore, in the present embodiment, the terminal device 7 includes the camera 56 and the microphone 79, and the data of the camera image captured by the camera 56 and the data of the microphone sound detected by the microphone 79 are transmitted to the game device 3 (step S3). Therefore, the game device 3 can use the camera image and/or the microphone sound as game inputs, so the user can also play the game by operations of capturing an image with the camera 56 or inputting sound to the microphone 79. These operations can be performed while the terminal device 7 is held, so by performing such operations in addition to directly operating the game image as described above, the user can perform a wider variety of game operations. Further, in the present embodiment, since the game image is displayed on the LCD 51 of the portable terminal device 7 (steps S6 and S10), the user can place the terminal device 7 freely. Therefore, when operating the controller 5 toward the marker, the user can point the controller 5 in any direction by placing the terminal device 7 at any position, which improves the degree of freedom of operations performed on the controller 5. Further, since the terminal device 7 can be placed at an arbitrary position, a game with a greater sense of realism can be realized by placing the terminal device 7 at a position suited to the game content, as in the fifth game example described later.

Further, according to the present embodiment, the game device 3 can acquire operation data and the like from the controller 5 and the terminal device 7 (steps S2 and S3), so the user can use the two devices of the controller 5 and the terminal device 7 as operation means. Therefore, in the game system 1, a plurality of users can play a game with each user using one of the devices, and one user can also play using both devices. Further, according to the present embodiment, the game device 3 generates two kinds of game images (steps S5 and S6) and can display the game images on the television 2 and the terminal device 7 (steps S9 and S10). By displaying the two kinds of game images on different devices in this way, it is possible to provide game images that are easier for the user to view, and the operability of the game improves. For example, when two people play a game, as in the third or fourth game example described later, a game image with a viewpoint that is easy to view for one of the users is displayed on the television 2, and a game image with a viewpoint that is easy to view for the other user is displayed on the terminal device 7, whereby each player can play at a viewpoint that is easy to view. Further, even when one person plays a game, for example, as in the first, second, or fifth game example described later, two kinds of game images with different viewpoints can be displayed, whereby the player can more easily grasp the appearance of the game space, and the operability of the game improves.

[6. Game Example] Next, specific examples of games played in the game system 1 will be described. In the game examples described below, there are cases where some of the components of the devices of the game system 1 are not used and some of the series of processes shown in Figs. 22 and 23 are not executed. In other words, the game system 1 need not have all of the above components, and the game device 3 need not execute all of the series of processes shown in Figs. 22 and 23.

(First Game Example) The first game example is a game in which an object (a dart) is fired within the game space by operating the terminal device 7. The player can instruct the direction in which to fire the dart by the operation of changing the posture of the terminal device 7 and the operation of drawing a line on the touch panel 52. Fig. 24 is a view showing the screens of the television 2 and the terminal device 7 in the first game example. In Fig. 24, a game image showing the game space is displayed on the television 2 and the LCD 51 of the terminal device 7. A dart 121, a control surface 122, and a target 123 are displayed on the television 2. The control surface 122 (and the dart 121) is displayed on the LCD 51. In the first game example, the player plays by operating the terminal device 7 so as to fire the dart 121 and hit the target 123. When firing the dart 121, the player first changes the posture of the control surface 122 arranged in the virtual game space to a desired posture by operating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and changes the posture of the control surface 122 based on the calculated posture (step S27). In the first game example, the posture of the control surface 122 is controlled so as to correspond to the posture of the terminal device 7 in the actual space. That is, the player can change the posture of the control surface 122 in the game space (the posture of the control surface 122 displayed on the terminal device 7) by changing the posture of the terminal device 7. In the first game example, the position of the control surface 122 is fixed at a predetermined position in the game space.

Next, the player performs an operation of drawing a line on the touch panel 52 using the stylus 124 or the like (see the arrow shown in Fig. 24). Here, in the first game example, the control surface 122 is displayed on the LCD 51 of the terminal device 7 such that the input surface of the touch panel 52 corresponds to the control surface 122. Therefore, the direction on the control surface 122 (the direction the line represents) can be calculated from the line drawn on the touch panel 52. The dart 121 is fired in the direction thus determined. As described above, the CPU 10 calculates the direction on the control surface 122 from the touch position data 100 of the touch panel 52, and performs processing of moving the dart 121 in the calculated direction (step S27). The CPU 10 can also control the speed of the dart 121 depending on, for example, the length of the line or the speed at which the line is drawn.
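Because the screen and the control surface are made to coincide, the line drawn on the touch panel can be mapped to a direction in the game space by projecting its endpoints onto the surface. The following is a sketch of that mapping under assumed conventions (normalized screen coordinates, a surface described by a center and two axes); it is an illustration, not the patented code.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 add(const Vec3& a, const Vec3& b) { return { a.x+b.x, a.y+b.y, a.z+b.z }; }
Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
Vec3 scale(const Vec3& v, float s)     { return { v.x*s, v.y*s, v.z*s }; }
Vec3 normalized(const Vec3& v) {
    float n = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/n, v.y/n, v.z/n };
}

// The control surface: a fixed center and two unit axes spanning it.
// The axes would be rotated every frame by the terminal device's posture.
struct ControlSurface { Vec3 center, right, up; float halfW, halfH; };

// Map a touch position in normalized screen coordinates (u, v in [0,1],
// v growing downward) to the corresponding point on the control surface.
Vec3 surfacePoint(const ControlSurface& s, float u, float v) {
    float sx = (u - 0.5f) * 2.0f * s.halfW;
    float sy = (0.5f - v) * 2.0f * s.halfH;
    return add(s.center, add(scale(s.right, sx), scale(s.up, sy)));
}

// Direction in which the dart is fired: from the first sampled touch
// position of the drawn line to the last one, mapped onto the surface.
Vec3 dartDirection(const ControlSurface& s,
                   float u0, float v0, float u1, float v1) {
    return normalized(sub(surfacePoint(s, u1, v1), surfacePoint(s, u0, v0)));
}
```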

As described above, according to the first game example, the game device 3 can move the control surface 122 in response to the movement (posture) of the terminal device 7 by using the output of the inertial sensors as a game input, and can specify the direction on the control surface 122 by using the output of the touch panel 52 as a game input. According to this, the player can move the game image (the image of the control surface 122) displayed on the terminal device 7 or perform a touch operation on that game image, and can thus play with the novel operation feel of directly operating the game image. Further, in the first game example, by using the sensor outputs of the inertial sensors and the touch panel 52 as game inputs, a direction in three-dimensional space can be easily indicated. That is, the player actually adjusts the posture of the terminal device 7 with one hand and inputs a direction by drawing a line on the touch panel 52 with the other hand, whereby the direction can be easily indicated by the intuitive operation of actually inputting a direction in space. Further, since the player can simultaneously perform the operation of the posture of the terminal device 7 and the input operation on the touch panel 52, the operation of indicating a direction in three-dimensional space can be performed quickly.

Further, according to the first game example, in order to facilitate the touch input operation on the control surface 122, the control surface 122 is displayed over the entire screen of the terminal device 7. On the other hand, an image of the game space including the entire control surface 122 and the target 123 is displayed on the television 2 so that the posture of the control surface 122 can be easily grasped and the target 123 can be easily aimed at (see Fig. 24). That is, in the above step S27, the first virtual camera for generating the television game image is set such that the entire control surface 122 and the target 123 are included in the visual field range, and the second virtual camera for generating the terminal game image is set such that the screen of the LCD 51 (the input surface of the touch panel 52) and the control surface 122 coincide on the screen. Therefore, in the first game example, the game operation is made easier by displaying images of the game space viewed from different viewpoints on the television 2 and the terminal device 7.
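One simple way to make the LCD screen and the control surface coincide, as described above, is to place the second virtual camera on the surface's normal at the distance where its field of view exactly covers the surface. This is a geometric sketch under that assumption, not the method claimed in the patent.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Place a camera with vertical field of view fovY (radians) on the
// control surface's normal so that the surface exactly fills the view;
// the camera would then be aimed back at the surface's center.
Vec3 terminalCameraPosition(const Vec3& surfaceCenter,
                            const Vec3& surfaceNormal,   // unit length
                            float surfaceHalfHeight, float fovY) {
    float distance = surfaceHalfHeight / std::tan(fovY * 0.5f);
    return { surfaceCenter.x + surfaceNormal.x * distance,
             surfaceCenter.y + surfaceNormal.y * distance,
             surfaceCenter.z + surfaceNormal.z * distance };
}
```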

(Second Game Example) Games in which the sensor outputs of the inertial sensors and the touch panel 52 are used as game inputs are not limited to the first game example described above, and various game examples can be considered. The second game example, like the first game example, is a game in which an object (a projectile) is fired within the game space by operating the terminal device 7. The player can instruct the direction in which to fire the projectile by the operation of changing the posture of the terminal device 7 and the operation of designating a position on the touch panel 52. Fig. 25 is a view showing the screens of the television 2 and the terminal device 7 in the second game example. In Fig. 25, a cannon 131, a projectile 132, and a target 133 are displayed on the television 2. The projectile 132 and the target 133 are displayed on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image of the game space viewed from the position of the cannon 131.

In the second game example, the player can change the display range displayed on the terminal device 7 as the terminal game image by operating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and controls the position and posture of the second virtual camera for generating the terminal game image based on the calculated posture (step S27). Specifically, the second virtual camera is set at the position of the cannon 131, and its orientation (posture) is controlled in response to the posture of the terminal device 7. In this way, the player can change the range of the game space displayed on the terminal device 7 by changing the posture of the terminal device 7.

Further, in the second game example, the player specifies the direction in which the projectile 132 is fired by an operation of inputting a point on the touch panel 52 (a touch operation). Specifically, in the processing of the above step S27, the CPU 10 calculates the position (control position) in the game space corresponding to the touch position, and calculates, as the firing direction, the direction from a predetermined position in the game space (for example, the position of the cannon 131) to the control position. Then, processing of moving the projectile 132 in the firing direction is performed. As described above, while the player performs an operation of drawing a line on the touch panel 52 in the first game example, an operation of designating a point on the touch panel 52 is performed in the second game example. The control position can be calculated by setting a control surface similar to that of the first game example (although the control surface is not displayed in the second game example). That is, the control surface is arranged in accordance with the posture of the second virtual camera so as to correspond to the display range on the terminal device 7 (specifically, the control surface rotates about the position of the cannon 131 in response to changes in the posture of the terminal device 7), and the position on the control surface corresponding to the touch position can be calculated as the control position.
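The firing direction described above reduces to normalizing the vector from the cannon's position to the control position. A minimal sketch follows; the control position itself would be obtained by mapping the touch position onto the invisible control surface, as in the first game example.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Direction in which the projectile 132 flies: from the cannon's
// position toward the control position computed from the touch input.
Vec3 emissionDirection(const Vec3& cannonPos, const Vec3& controlPos) {
    Vec3 d = { controlPos.x - cannonPos.x,
               controlPos.y - cannonPos.y,
               controlPos.z - cannonPos.z };
    float len = std::sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
    return { d.x/len, d.y/len, d.z/len };
}
```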

According to the second game example described above, the game device 3 can change the display range of the terminal game image in response to the movement (posture) of the terminal device 7 by using the output of the inertial sensors as a game input, and can specify a direction within the game space (the direction in which the projectile 132 is fired) by a touch input designating a position within that display range. Therefore, in the second game example, as in the first game example, the player can move the game image displayed on the terminal device 7 or perform a touch operation on that game image, and can thus play with the novel operation feel of directly operating the game image.

Further, in the second game example, as in the first game example, the player actually adjusts the posture of the terminal device 7 with one hand and performs a touch input on the touch panel 52 with the other hand, whereby the direction can be easily indicated by the intuitive operation of actually inputting a direction in space. Further, since the player can simultaneously perform the operation of the posture of the terminal device 7 and the input operation on the touch panel 52, the operation of indicating a direction in three-dimensional space can be performed quickly.

Further, although the image displayed on the television 2 in the second game example may be an image viewed from the same viewpoint as the terminal device 7, in Fig. 25 the game device 3 displays an image viewed from a different viewpoint. That is, while the second virtual camera for generating the terminal game image is set at the position of the cannon 131, the first virtual camera for generating the television game image is set at a position behind the cannon 131. Here, for example, by displaying on the television 2 a range that cannot be seen on the screen of the terminal device 7, a style of play can be realized in which the player looks at the screen of the television 2 to aim at a target 133 that cannot be seen on the screen of the terminal device 7. By setting the display ranges of the television 2 and the terminal device 7 differently in this way, not only is it easier to grasp the appearance of the game space, but the fun of the game can be further enhanced.

As described above, according to the present embodiment, since the terminal device 7 including the touch panel 52 and the inertial sensors can be used as an operation device, a game with the operation feel of directly operating the game image, as in the first and second game examples described above, can be realized.

(Third Game Example)

The third game example will be described below with reference to Figs. 26 and 27. The third game example is a baseball game played by two players. That is, the first player uses the controller 5 to operate the hitter, and the second player uses the terminal device 7 to operate the pitcher. On the television 2 and the terminal device 7, game images that make the game operation easy for the respective players are displayed.

Fig. 26 is a view showing an example of the television game image displayed on the television 2 in the third game example. The television game image shown in Fig. 26 is an image mainly for the first player. That is, the television game image shows the game space as viewed from the side of the hitter (hitter object) 141, which is the operation target of the first player, toward the pitcher (pitcher object) 142, which is the operation target of the second player. The first virtual camera for generating the television game image is placed at a position behind the hitter 141, oriented from the hitter 141 toward the pitcher 142.

On the other hand, Fig. 27 is a view showing an example of the terminal game image displayed on the terminal device 7 in the third game example. The terminal game image shown in Fig. 27 is an image mainly for the second player. That is, the terminal game image shows the game space as viewed from the side of the pitcher 142, which is the operation target of the second player, toward the hitter 141, which is the operation target of the first player. Specifically, in the above step S27, the CPU 10 controls the second virtual camera for generating the terminal game image based on the posture of the terminal device 7. The posture of the second virtual camera is calculated in accordance with the posture of the terminal device 7, as in the second game example described above. Further, the position of the second virtual camera is fixed at a predetermined position. The terminal game image includes a cursor 143 for indicating the direction in which the pitcher 142 will throw the ball.

The method of operating the hitter 141 by the first player and the method of operating the pitcher 142 by the second player may be any methods. For example, the CPU 10 may detect a swing operation performed on the controller 5 based on the output data of the inertial sensor of the controller 5, and cause the hitter 141 to swing the bat in response to the swing operation. Further, for example, the CPU 10 may move the cursor 143 in accordance with an operation performed on the analog rocker 53, and, when a predetermined key among the operation keys 54 is pressed, cause the pitcher 142 to perform a pitching action toward the position indicated by the cursor 143. Further, the cursor 143 may be moved in response to the posture of the terminal device 7 instead of operations performed on the analog rocker 53.
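As a concrete illustration of the two operation methods just named, a swing can be detected as a spike in acceleration magnitude well beyond the 1 g measured at rest, and the cursor can simply follow the analog rocker's deflection. The threshold and speed values below are illustrative assumptions, not values from the patent.

```cpp
#include <cmath>

// Swing detection from one accelerometer sample (in units of g).
bool isSwing(float ax, float ay, float az) {
    float magnitude = std::sqrt(ax*ax + ay*ay + az*az);
    return magnitude > 2.5f;   // well above the 1 g measured at rest
}

// Cursor movement from the analog rocker's deflection (each axis in
// [-1, 1]); dt is the frame time in seconds.
void moveCursor(float& cursorX, float& cursorY,
                float stickX, float stickY, float dt) {
    const float speed = 0.8f;  // screen widths per second at full tilt
    cursorX += stickX * speed * dt;
    cursorY += stickY * speed * dt;
}
```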

As described above, in the third game example, game images viewed from mutually different viewpoints are generated for the television 2 and the terminal device 7, whereby a game image that is easy to view and easy to operate with can be provided to each player.

Further, in the third game example, two virtual cameras are set in a single game space, and two kinds of game images showing the game space as viewed from the respective virtual cameras are displayed (Figs. 26 and 27). Therefore, regarding the two kinds of game images generated in the third game example, the game processing performed on the game space (control of the objects in the game space, etc.) is mostly common, and each game image can be generated merely by performing the drawing process twice on the common game space; this has the advantage of higher processing efficiency than performing the game processing separately for each image.

Further, in the third game example, since the cursor 143 indicating the pitching direction is displayed only on the terminal device 7 side, the first player cannot see the position indicated by the cursor 143. Therefore, the disadvantage to the second player that would arise if the first player knew the pitching direction does not occur. In this way, in the present embodiment, when letting one player see a game image would put the other player at a disadvantage, that game image should be displayed on the terminal device 7. This prevents a reduction in the strategic quality of the game. In other embodiments, depending on the content of the game (for example, when no such disadvantage arises even if the terminal game image is seen by the first player), the game device 3 may display the terminal game image on the television 2 together with the television game image.

(Fourth Game Example)

The fourth game example will be described below with reference to Figs. 28 and 29. The fourth game example is a shooting game in which two players cooperate. That is, the first player uses the controller 5 to perform the operation of moving an aircraft, and the second player uses the terminal device 7 to perform the operation of controlling the firing direction of the aircraft's cannon. As in the third game example, in the fourth game example, game images that make the game operation easy for the respective players are displayed on the television 2 and the terminal device 7.

Fig. 28 is a view showing an example of a television game image displayed on the television 2 in the fourth game example. In addition, FIG. 29 is a view showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example. As shown in Fig. 28, in the fourth game example, the aircraft (aircraft object) 151 and the target (balloon object) 153 appear in the virtual game space. In addition, the aircraft 151 has a cannon (cannon object) 152.

As shown in Fig. 28, an image of the game space including the aircraft 151 is displayed as the television game image. The first virtual camera for generating the television game image is set so as to generate an image of the game space in which the aircraft 151 is viewed from behind. That is, the first virtual camera is placed at a position behind the aircraft 151, in a posture such that the aircraft 151 is included in the imaging range (field of view). Further, the first virtual camera is controlled so as to move together with the movement of the aircraft 151. That is, in the processing of the above step S27, the CPU 10 controls the movement of the aircraft 151 based on the controller operation data, and also controls the position and posture of the first virtual camera. In this way, the position and posture of the first virtual camera are controlled in response to the operation of the first player.

On the other hand, as shown in Fig. 29, an image of the game space viewed from the aircraft 151 (more specifically, from the cannon 152) is displayed as the terminal game image. Therefore, the second virtual camera for generating the terminal game image is placed at the position of the aircraft 151 (more specifically, the position of the cannon 152). In the processing of the above step S27, the CPU 10 controls the movement of the aircraft 151 based on the controller operation data, and also controls the position of the second virtual camera. The second virtual camera may also be placed at a position around the aircraft 151 or the cannon 152 (for example, a position slightly behind the cannon 152). As described above, the position of the second virtual camera is controlled by the operation of the first player (who moves the aircraft 151). Therefore, in the fourth game example, the first virtual camera and the second virtual camera move in conjunction with each other.

Further, an image of the game space viewed in the firing direction of the cannon 152 is displayed as the terminal game image. Here, the firing direction of the cannon 152 is controlled so as to correspond to the posture of the terminal device 7. That is, in the present embodiment, the posture of the second virtual camera is controlled so that the line-of-sight direction of the second virtual camera coincides with the firing direction of the cannon 152. In the processing of the above step S27, the CPU 10 controls the orientation of the cannon 152 and the posture of the second virtual camera in response to the posture of the terminal device 7 calculated in the above step S24. In this way, the posture of the second virtual camera is controlled by the operation of the second player. The second player can thus change the firing direction of the cannon 152 by changing the posture of the terminal device 7.
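The camera control described in this game example can be summarized as follows: both cameras follow the aircraft, but only the terminal-side camera's orientation tracks the terminal device's posture. This sketch assumes unit quaternions, a rear offset expressed in the aircraft's local frame, and placeholder names; it is an illustration, not the patented code.

```cpp
struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };       // assumed unit length
struct Camera { Vec3 position; Quat orientation; };

Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}

// Rotate a vector by a unit quaternion: v' = v + 2*u x (u x v + w*v).
Vec3 rotate(const Quat& q, const Vec3& v) {
    Vec3 u = { q.x, q.y, q.z };
    Vec3 t = cross(u, v);
    t = { t.x + q.w*v.x, t.y + q.w*v.y, t.z + q.w*v.z };
    Vec3 c = cross(u, t);
    return { v.x + 2.0f*c.x, v.y + 2.0f*c.y, v.z + 2.0f*c.z };
}

void updateCameras(Camera& tvCamera, Camera& terminalCamera,
                   const Vec3& aircraftPos, const Quat& aircraftOri,
                   const Quat& terminalPosture) {
    // First virtual camera (television): a fixed offset behind and above
    // the aircraft in its local frame, so the aircraft stays in view.
    Vec3 rear = rotate(aircraftOri, { 0.0f, 2.0f, -10.0f });
    tvCamera.position = { aircraftPos.x + rear.x,
                          aircraftPos.y + rear.y,
                          aircraftPos.z + rear.z };
    tvCamera.orientation = aircraftOri;

    // Second virtual camera (terminal): at the cannon's position, with its
    // line of sight taken from the terminal device's posture; the cannon's
    // firing direction is made to coincide with this line of sight.
    terminalCamera.position = aircraftPos;   // the cannon rides on the aircraft
    terminalCamera.orientation = terminalPosture;
}
```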

When firing a projectile from the cannon 152, the second player presses a predetermined key of the terminal device 7. When the predetermined key is pressed, a projectile is fired from the cannon 152 in the direction it is oriented. In the terminal game image, a sight 154 is displayed at the center of the screen of the LCD 51, and the projectile is fired in the direction indicated by the sight 154.

As described above, in the fourth game example, the first player operates the aircraft 151 (for example, moving it in the direction of a desired target 153) while mainly viewing the television game image (Fig. 28), which shows the game space viewed in the traveling direction of the aircraft 151. On the other hand, the second player operates the cannon 152 while mainly viewing the terminal game image (Fig. 29), which shows the game space viewed in the firing direction of the cannon 152. In this way, in the fourth game example, in a game in which two players cooperate, game images that are easy to view and easy to operate with for the respective players can be displayed on the television 2 and the terminal device 7.

Further, in the fourth game example, the positions of the first virtual camera and the second virtual camera are controlled by the operation of the first player, and the posture of the second virtual camera is controlled by the operation of the second player. That is, in the present embodiment, the position or posture of the virtual camera is changed in response to each game operation of each player, and as a result, the display range of the game space displayed on each display device can be changed. Since the display range of the game space displayed on the display device changes depending on the operation of each player, each player can feel that his/her own game operation is sufficiently reflected in the progress of the game, and the game can be fully enjoyed.

In the fourth game example, the game image viewed from behind the aircraft 151 is displayed on the television 2, and the game image viewed from the position of the cannon of the aircraft 151 is displayed on the terminal device 7. Here, in other game examples, the game device 3 may display the game image viewed from behind the aircraft 151 on the terminal device 7, and display the game image viewed from the position of the cannon 152 of the aircraft 151 on the television 2. In this case, the roles of the players are swapped relative to the fourth game example described above: the first player uses the controller 5 to perform the operation of the cannon 152, and the second player uses the terminal device 7 to perform the operation of the aircraft 151.

(Fifth Game Example)

The fifth game example will be described below with reference to Fig. 30. In the fifth game example, the player operates the game using the controller 5, and the terminal device 7 is used not as an operation device but as a display device. Specifically, the fifth game example is a golf game, and in response to the player's operation of swinging the controller 5 like a golf club (a swing operation), the game device 3 causes a player character in the virtual game space to perform a golf swing.

Fig. 30 is a view schematically showing how the game system 1 is used in the fifth game example. In Fig. 30, an image of the game space including a player character (object) 161 and a golf club (object) 162 is displayed on the screen of the television 2. A ball (object) 163 placed in the game space is also displayed on the television 2, although in Fig. 30 it is hidden behind the golf club 162 and not visible. On the other hand, as shown in Fig. 30, the terminal device 7 is placed on the floor in front of the television 2 such that the screen of the LCD 51 faces vertically upward. The terminal device 7 displays an image of the ball 163, an image showing a part of the golf club 162 (specifically, the head 162a of the golf club), and an image showing the ground of the game space. The terminal game image is an image of the surroundings of the ball viewed from above.

When playing the game, the player 160 stands in the vicinity of the terminal device 7 and performs the swing operation of swinging the controller 5 like a golf club. At this time, in the processing of the above step S27, the CPU 10 controls the position and posture of the golf club 162 in the game space in accordance with the posture of the controller 5 calculated in the processing of the above step S23. Specifically, the golf club 162 in the game space is controlled so that it hits the ball 163 when the front end direction of the controller 5 (the positive Z-axis direction shown in Fig. 3) points at the image of the ball 163 displayed on the LCD 51.
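The hit test just described can be sketched as a ray-plane intersection: with the terminal lying screen-up on the floor, the controller's front end direction defines a ray, and the club connects with the ball when that ray lands near the ball image. All coordinates and the hit radius below are illustrative assumptions.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// True if a ray from `origin` along unit direction `dir` (the controller's
// positive Z-axis in room coordinates) meets the floor plane y = 0 within
// `radius` of `ballPos`, the position of the ball image on the terminal.
bool pointingAtBall(const Vec3& origin, const Vec3& dir,
                    const Vec3& ballPos, float radius) {
    if (dir.y >= 0.0f) return false;     // not pointing downward at the floor
    float t = -origin.y / dir.y;         // ray parameter at the floor plane
    float hx = origin.x + t * dir.x;
    float hz = origin.z + t * dir.z;
    float dx = hx - ballPos.x, dz = hz - ballPos.z;
    return std::sqrt(dx*dx + dz*dz) <= radius;
}
```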

Further, when the front end direction of the controller 5 faces the LCD 51, an image (head image) 164 showing a part of the golf club 162 is displayed on the LCD 51 (see Fig. 30). To increase the sense of presence, the terminal game image may display the image of the ball 163 at actual size, or may rotate the displayed head image 164 in response to rotation of the controller 5 about the Z-axis. Further, the terminal game image may be generated using a virtual camera set in the game space, or generated using image data prepared in advance. When generated using image data prepared in advance, a detailed and realistic image can be generated with a small processing load, without constructing a detailed terrain model of the golf course.

When the player 160 performs the above swing operation, the golf club 162 is swung, and when the golf club 162 hits the ball 163 as a result, the ball 163 moves (flies out). That is, the CPU 10 determines in the above step S27 whether the golf club 162 has come into contact with the ball 163, and moves the ball 163 when contact occurs. Here, the television game image is generated so as to include the moving ball 163. That is, the CPU 10 controls the position and posture of the first virtual camera for generating the television game image so that the moving ball is included in its imaging range. On the other hand, on the terminal device 7, when the golf club 162 hits the ball 163, the image of the ball 163 moves and soon goes off the screen. Therefore, in the fifth game example, the appearance of the ball's movement is mainly shown on the television 2, and the player 160 can confirm, from the television game image, the direction in which the ball has flown as a result of the swing operation.

As described above, in the fifth game example, the player 160 can swing the golf club 162 (have the player character 161 swing the golf club 162) by swinging the controller 5. Here, in the fifth game example, the golf club 162 in the game space is controlled to hit the ball 163 when the front end direction of the controller 5 points at the image of the ball 163 displayed on the LCD 51. Therefore, the player can get the sensation of actually swinging a golf club through the swing operation, which makes the swing operation feel more realistic.

Further, in the fifth game example, the head image 164 is displayed on the LCD 51 when the front end direction of the controller 5 faces the terminal device 7. Therefore, by pointing the front end direction of the controller 5 toward the terminal device 7, the player can get the sensation that the posture of the golf club 162 in the virtual space corresponds to the posture of the controller 5 in the real space, which makes the swing operation feel more realistic.

As described above, in the fifth game example, when the terminal device 7 is used as a display device, operations using the controller 5 can be made more realistic by placing the terminal device 7 at an appropriate position.

Further, in the fifth game example described above, the terminal device 7 is placed on the floor, and the terminal device 7 displays an image showing only the game space around the ball 163. Therefore, the position and posture of the entire golf club 162 in the game space cannot be displayed on the terminal device 7, and the manner in which the ball 163 moves after the swing operation cannot be displayed on the terminal device 7 either. Therefore, in the fifth game example, the entire golf club 162 is displayed on the television 2 before the ball 163 moves, and the manner in which the ball 163 moves is displayed on the television 2 after it starts to move. In this way, according to the fifth game example, a realistic operation can be provided to the player, and game images that are easy to view can be presented to the player using the two screens of the television 2 and the terminal device 7.

Further, in the fifth game example described above, the indicator portion 55 of the terminal device 7 is used to calculate the posture of the controller 5. That is, the CPU 10 lights the indicator portion 55 (and does not light the pointing device 6) in the initial processing of the above step S1, and calculates the posture of the controller 5 based on the marker coordinate data 96 in the above step S23. According to this, it can be accurately determined whether or not the front end direction of the controller 5 is in a posture facing the indicator portion 55. In the fifth game example described above, the above steps S21 and S22 need not be executed, but in other game examples, the marker to be lit may be changed in the middle of the game by executing the processes of the above steps S21 and S22. For example, in step S21, the CPU 10 may determine, based on the first acceleration data 94, whether or not the front end direction of the controller 5 faces the direction of gravity, and in step S22 control the markers so that the indicator portion 55 is lit when it faces the direction of gravity and the pointing device 6 is lit when it does not face the direction of gravity. According to this, when the front end direction of the controller 5 faces the direction of gravity, the posture of the controller 5 can be accurately calculated by acquiring the marker coordinate data of the indicator portion 55, and when the front end direction of the controller 5 faces the television 2, the posture of the controller 5 can be accurately calculated by acquiring the marker coordinate data of the pointing device 6.

As in the fifth game example described above, the game system 1 can place the terminal device 7 at any position and use it as a display device. According to this, when the marker coordinate data is used as a game input, the controller 5 can be used facing any direction by setting the terminal device 7 at a desired position, instead of only being used toward the television 2. That is, according to the present embodiment, the orientation in which the controller 5 can be used is not restricted, so the degree of freedom of operations on the controller 5 can be improved.

[7. Other Examples of the Game System]

As described above, the game system 1 can perform operations for playing a variety of games. The terminal device 7 can be used as a portable display or a second display, and can also be used as a controller for touch input or input by motion. According to the game system 1, a wide variety of games can be implemented. The game system 1 can also perform operations for uses other than games, such as the following.

(Example of an operation in which the player uses only the terminal device 7 to play a game)

In the present embodiment, the terminal device 7 functions as a display device and also functions as an operation device. Therefore, by using the terminal device 7 as display means and operation means without using the television 2 and the controller 5, the terminal device 7 can also be used like a portable game device.

Describing concretely in accordance with the game processing shown in Fig. 22, the CPU 10 acquires the terminal operation data 97 from the terminal device 7 in step S3, and in step S4 executes the game processing using only the terminal operation data 97 as the game input (without using the controller operation data). A game image is then generated in step S6, and the game image is transmitted to the terminal device 7 in step S10. At this time, steps S2, S5, and S9 need not be executed. According to the above, game processing is performed in response to operations on the terminal device 7, and a game image showing the result of the game processing is displayed on the terminal device 7. In this way, although the game processing is actually executed in the game device, the terminal device 7 can be used as if it were a portable game device. Therefore, according to the present embodiment, even when a game image cannot be displayed on the television 2 because the television 2 is in use (for example, another person is watching a television broadcast), the user can play the game using the terminal device 7.
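The terminal-only mode just described is the frame loop of Fig. 22 with the television-side steps removed. As in the earlier loop sketch, every function below is an empty placeholder for the operation the text names, and the end condition is a stand-in.

```cpp
// Placeholder stubs for the operations named in the text.
void acquireTerminalData() {}          // S3
void runGameControl() {}               // S4, terminal operation data only
void generateTerminalImage() {}        // S6
void generateTerminalSound() {}        // S8
void compressAndSendToTerminal() {}    // S10

// One frame of terminal-only play: the television-side steps
// (S2, S5, S9) are simply not executed.
void terminalOnlyFrame() {
    acquireTerminalData();
    runGameControl();
    generateTerminalImage();
    generateTerminalSound();
    compressAndSendToTerminal();
}

int main() {
    for (int frame = 0; frame < 600; ++frame)  // stand-in end condition
        terminalOnlyFrame();
}
```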

The CPU 10 may transmit not only the game image but also other images to the terminal device 7 for display, such as the menu screen displayed after the power is turned on. According to this, the player can play the game without using the television 2 at all from the beginning, which is extremely convenient.

Furthermore, in the above, the display device that displays the game image may be changed from the terminal device 7 to the television 2 in the middle of the game. Specifically, the CPU 10 can additionally perform the above step S9 and output the game image to the television 2. The image output to the television 2 in step S9 is the same as the game image transmitted to the terminal device 7 in step S10. According to this, by switching the input of the television 2 so that it displays the input from the game device 3, the same game image as on the terminal device 7 can be displayed on the television 2, and the display device displaying the game image can thus be changed to the television 2. After the game image is displayed on the television 2, the screen display of the terminal device 7 may be turned off.

In the game system 1, an infrared remote control signal for the television 2 can be output from the infrared output means (the pointing device 6, the indicator portion 55, or the infrared communication module 82). According to this, the game device 3 can operate the television 2 by outputting the infrared remote control signal from the infrared output means in response to an operation on the terminal device 7. In this case, the user can operate the television 2 using the terminal device 7 without operating the remote controller of the television 2, which is extremely convenient when, for example, switching the input of the television 2 as described above.

(Example of the operation of communicating with other devices via the network)

As described above, since the game device 3 has a function of connecting to a network, the game system 1 can also be used when communicating with an external device via the network. Fig. 31 is a view showing the connection relationships of the devices included in the game system 1 when it is connected to an external device via a network. As shown in Fig. 31, the game device 3 can communicate with the external device 191 via the network 190.

As described above, when the external device 191 can communicate with the game device 3, the game system 1 can communicate with the external device 191 using the terminal device 7 as an interface. For example, the game system 1 can be used as a videophone by exchanging images and sounds between the external device 191 and the terminal device 7. Specifically, the game device 3 receives images and sounds (the images and voice of the other party on the call) from the external device 191 via the network 190, and transmits the received images and sounds to the terminal device 7. Thereby, the terminal device 7 displays the images from the external device 191 on the LCD 51 and outputs the sounds from the external device 191 from the speaker 77. Further, the game device 3 receives the camera images captured by the camera 56 and the microphone sounds detected by the microphone 79 from the terminal device 7, and transmits the camera images and microphone sounds to the external device 191 via the network 190. By repeating the above reception and transmission of images and sounds with the external device 191, the game system 1 can be used as a videophone.

In the present embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at any position and point the camera 56 in any direction. Further, in the present embodiment, since the terminal device 7 includes the touch panel 52, the game device 3 can also transmit the input information input on the touch panel 52 (the touch position data 100) to the external device 191. For example, by outputting images and sounds from the external device 191 on the terminal device 7 and transmitting characters and the like written by the user on the touch panel 52 to the external device 191, the game system 1 can also be used as a so-called e-learning system (electronic learning system).

(Example of operations linked to television broadcasts)

Further, the game system 1 can also operate in conjunction with television broadcasts when a television broadcast is being watched on the television 2. That is, when a television program is being watched on the television 2, the game system 1 can output information related to that television program and the like to the terminal device 7. An example of the operation of the game system 1 working in conjunction with television broadcasts will be described below.

In the above operation example, the game device 3 can communicate with a server via the network (in other words, the external device 191 shown in Fig. 31 is a server). The server records, for each channel of the television broadcast, various information related to the broadcast (television information). The television information may be information related to the program, such as subtitles or performer information, information of an EPG (electronic program guide), or information broadcast as a data broadcast. The television information may also be images, sounds, text, or a combination of these. Further, there need not be only one server; a server may be provided for each channel or each program of the television broadcast, and the game device 3 may communicate with each server.

When the video and sound of a television broadcast are being output on the television 2, the game device 3 has the user input, using the terminal device 7, the channel of the television broadcast being watched. Then, the game device 3 requests the server via the network to transmit the television information corresponding to the input channel. In response to this, the server transmits the data of the television information for that channel. Upon receiving the data transmitted from the server, the game device 3 outputs the received data to the terminal device 7. The terminal device 7 displays the image and text data among the received data on the LCD 51, and outputs the sound data from the speaker. In this way, the user can enjoy, using the terminal device 7, information related to the television program currently being watched and the like.

As described above, by the game system 1 communicating with an external device (server) via the network, information linked to television broadcasts can be provided to the user through the terminal device 7. In particular, in the present embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at any position, which is highly convenient.

As described above, in the present embodiment, the user can use the terminal device 7 in a variety of uses and forms, in addition to games.

[8. Modifications]

The above embodiment is one example for carrying out the present invention; in other embodiments, the present invention may also be implemented with, for example, the configurations described below.

(Modification with a plurality of terminal devices)

In the above embodiment, the game system 1 is configured to have only one terminal device, but the game system 1 may be configured to have a plurality of terminal devices. That is, the game device 3 may be capable of wirelessly communicating with each of a plurality of terminal devices, transmitting the data of the game image, the data of the game sound, and the control data to each terminal device, and receiving the operation data, the camera image data, and the microphone sound data from each terminal device. When wirelessly communicating with a plurality of terminal devices, the game device 3 may perform the wireless communication with each terminal device in a time-division manner, or by dividing the frequency band.
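The time-division alternative named above can be pictured as a round-robin: each frame, the game device exchanges data with one terminal in turn. A toy sketch follows; the device names and frame count are invented for the example.

```cpp
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> terminals = { "terminal A", "terminal B" };
    for (int frame = 0; frame < 6; ++frame) {
        // One terminal per time slot; dividing the frequency band is the
        // other option named in the text.
        const std::string& target = terminals[frame % terminals.size()];
        std::cout << "frame " << frame << ": exchange data with "
                  << target << '\n';
    }
}
```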

When there are a plurality of terminal devices as described above, a wider variety of games can be played with the game system. For example, when the game system 1 has two terminal devices, the game system 1 has three display devices, so game images for each of three players can be generated and displayed on the respective display devices. Further, when the game system 1 has two terminal devices, two players can simultaneously play a game in which a controller and a terminal device are used as one set (for example, the fifth game example described above). Furthermore, when the game processing of the above step S27 is performed based on the marker coordinate data output from two controllers, two players can each perform the game operation of pointing the controller toward a marker (the pointing device 6 or the indicator portion 55). That is, one player performs the game operation with the controller facing the pointing device 6, and the other player performs the game operation with the controller facing the indicator portion 55.

(Modification of the function of the terminal device)

In the above embodiment, the terminal device 7 functions as a so-called thin client terminal that does not execute game processing. Here, in other embodiments, some of the series of game processes executed by the game device 3 in the above embodiment may be executed by another device such as the terminal device 7. For example, a part of the processing (for example, the process of generating the terminal game image) may be executed by the terminal device 7. In other words, the terminal device may function as a portable game device that performs game processing based on operations on the operation unit, generates a game image in accordance with the game processing, and displays it on the display unit. Further, for example, in a game system having a plurality of information processing devices (game devices) that can communicate with each other, the plurality of information processing devices may share the execution of the game processing.

(Modification of the configuration of the terminal device)

The terminal device in the above embodiment is merely an example; the shapes of the operation keys and the cover 50 of the terminal device 7, the number of components, and their installation positions are merely examples, and other shapes, numbers, and installation positions may be used. For example, the terminal device may be configured as shown below. Hereinafter, a modification of the terminal device will be described with reference to Figs. 32 to 35.

Fig. 32 is a view showing the external configuration of a terminal device according to a modification of the above embodiment. In Fig. 32, (a) is a front view of the terminal device, (b) is a plan view, (c) is a right side view, and (d) is a bottom view. Fig. 33 is a view showing a state in which the user holds the terminal device. In Figs. 32 and 33, components corresponding to those of the terminal device 7 of the above embodiment are denoted by the same reference numerals as in Fig. 8, but they need not necessarily be configured identically.

As shown in Fig. 32, the terminal device 8 includes a cover 50 having a substantially horizontally long, substantially rectangular plate shape. The cover 50 is of a size that can be gripped by the user. Therefore, the user can hold the terminal device 8 and move it, or change its arrangement position.

The terminal device 8 has an LCD 51 on the surface of the cover 50. The LCD 51 is provided near the center of the surface of the cover 50. Therefore, as shown in Fig. 33, by gripping the cover 50 on both sides of the LCD 51, the user can move while holding the terminal device and viewing the screen of the LCD 51. Although Fig. 33 shows an example in which the user holds the terminal device 8 laterally (in a horizontally long orientation) by gripping the cover 50 on the left and right sides of the LCD 51, the terminal device 8 can also be held longitudinally (in a vertically long orientation).

As shown in (a) of Fig. 32, the terminal device 8 has a touch panel 52 on the screen of the LCD 51 as an operation means (operation portion). In the present modification, the touch panel 52 is a resistive-film type touch panel. However, the touch panel is not limited to the resistive-film type; a touch panel of any type, such as an electrostatic capacitance type, may be used. The touch panel 52 may also be of a single-touch type or a multi-touch type. In the present modification, a touch panel having the same resolution (detection accuracy) as the resolution of the LCD 51 is used as the touch panel 52. However, the resolution of the touch panel 52 does not necessarily have to match the resolution of the LCD 51. Input to the touch panel 52 is usually performed with a stylus, but input is not limited to the stylus, and the user may also operate the touch panel 52 with a finger. The cover 50 may be provided with an accommodation hole for housing the stylus used to operate the touch panel 52. Since the terminal device 8 has the touch panel 52 in this way, the user can operate the touch panel 52 while moving the terminal device 8. That is, the user can input directly to the screen (through the touch panel 52) while moving the screen of the LCD 51.
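Because the detection resolution of the panel and the resolution of the LCD need not match, the input coordinates have to be scaled before use. The sketch below shows a straightforward conversion; the concrete resolutions are placeholder assumptions, not values taken from this modification.

    struct Point { int x, y; };

    // Maps a raw resistive-panel reading to LCD pixel coordinates when the
    // panel's detection resolution differs from the LCD resolution.
    Point panelToLcd(Point raw) {
        const int kPanelW = 4096, kPanelH = 4096;  // assumed A/D range of the panel
        const int kLcdW   = 854,  kLcdH   = 480;   // assumed LCD resolution
        return Point{ raw.x * kLcdW / kPanelW, raw.y * kLcdH / kPanelH };
    }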

As shown in Fig. 32, the terminal device 8 includes two analog rockers 53A and 53B and a plurality of operation keys 54A to 54L as operation means (operation portions). The analog rockers 53A and 53B are means for indicating directions. The analog rockers 53A and 53B are configured such that the rocker portion operated by the user's finger can slide or tilt in any direction (at any angle up, down, left, right, or oblique) with respect to the surface of the cover 50. The left analog rocker 53A and the right analog rocker 53B are provided on the left and right sides of the screen of the LCD 51, respectively. Therefore, the user can input a direction with an analog rocker using either the left or right hand. Further, as shown in Fig. 33, the analog rockers 53A and 53B are provided at positions where the user can operate them while gripping the left and right portions of the terminal device 8; therefore, even when holding the terminal device 8 and moving it, the user can easily operate the analog rockers 53A and 53B.
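Since the rocker portion can be displaced at any angle, its reading is naturally treated as a two-dimensional vector. The following sketch shows a common way to turn a raw reading into a direction, ignoring small displacements near the center; the dead-zone threshold is an illustrative assumption, not a value from this modification.

    #include <cmath>

    struct StickInput { float x, y; };

    // Converts a raw analog-rocker reading (centered on zero) into a unit
    // direction vector; displacements inside the dead zone report no input.
    StickInput readDirection(float rawX, float rawY) {
        const float kDeadZone = 0.15f;  // assumed dead-zone radius
        float mag = std::sqrt(rawX * rawX + rawY * rawY);
        if (mag < kDeadZone) return StickInput{0.0f, 0.0f};
        return StickInput{rawX / mag, rawY / mag};  // any angle is allowed
    }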

Each of the operation keys 54A to 54L is an operation means for performing predetermined input. As described below, each of the operation keys 54A to 54L is provided at a position where the user can operate it while gripping the left and right portions of the terminal device 8 (see Fig. 33). Therefore, even when the user holds the terminal device 8 and moves it, these operation means can be easily operated.

As shown in (a) of Fig. 32, a cross key (direction input key) 54A and keys 54B to 54H among the operation keys 54A to 54L are provided on the surface of the cover 50. These keys 54A to 54H are disposed at positions operable by the user's thumbs (see Fig. 33).

The cross key 54A is provided on the left side of the LCD 51 and below the left analog rocker 53A. That is, the cross key 54A is disposed at a position operable by the user's left hand. The cross key 54A has a cross shape and can indicate the up, down, left, and right directions. Keys 54B to 54D are provided below the LCD 51. These three keys 54B to 54D are disposed at positions operable by either hand. Further, four keys 54E to 54H are provided on the right side of the LCD 51 and below the right analog rocker 53B. That is, the four keys 54E to 54H are disposed at positions operable by the user's right hand. Further, the four keys 54E to 54H are arranged in an up, down, left, and right positional relationship (relative to the center position of the four keys 54E to 54H). Therefore, the terminal device 8 can also use the four keys 54E to 54H as keys for allowing the user to indicate the up, down, left, and right directions.

Further, as shown in (a), (b), and (c) of Fig. 32, the first L key 54I and the first R key 54J are provided at the obliquely upper portions of the cover 50 (the upper-left portion and the upper-right portion). Specifically, the first L key 54I is provided at the left end of the upper side surface of the plate-shaped cover 50 and is exposed on the upper and left side surfaces. The first R key 54J is provided at the right end of the upper side surface of the cover 50 and is exposed on the upper and right side surfaces. In this way, the first L key 54I is disposed at a position operable by the user's left index finger, and the first R key 54J is disposed at a position operable by the user's right index finger (see Fig. 33).

Further, as shown in (b) and (c) of Fig. 32, the second L key 54K and the second R key 54L are disposed on foot portions 59A and 59B provided so as to protrude from the back surface of the plate-shaped cover 50 (that is, the side opposite to the surface on which the LCD 51 is provided). Like the eaves portion 59 of the above embodiment, each of the foot portions 59A and 59B is provided in a region including a position opposite to the operation portions (the analog rockers 53A and 53B) provided on the left and right of the display portion. The second L key 54K is provided slightly above the left side of the back surface of the cover 50 (the left side when viewed from the front side), and the second R key 54L is provided slightly above the right side of the back surface of the cover 50 (the right side when viewed from the front side). In other words, the second L key 54K is provided at a position roughly opposite the left analog rocker 53A provided on the surface, and the second R key 54L is provided at a position roughly opposite the right analog rocker 53B provided on the surface. In this way, the second L key 54K is disposed at a position operable by the user's left middle finger, and the second R key 54L is disposed at a position operable by the user's right middle finger (see Fig. 33). Further, as shown in (c) of Fig. 32, the second L key 54K and the second R key 54L are provided on the upper surfaces of the foot portions 59A and 59B and have key surfaces that face obliquely upward. When the user holds the terminal device 8, the middle fingers are expected to move in the up-and-down direction, so directing the key surfaces upward makes it easy for the user to press the second L key 54K and the second R key 54L. Further, providing the foot portions on the back surface of the cover 50 makes the cover 50 easy for the user to hold, and providing keys on the foot portions makes the cover 50 easy to operate while it is held.

In the terminal device 8 shown in Fig. 32, since the second L key 54K and the second R key 54L are provided on the back surface, when the terminal device 8 is placed down with the screen of the LCD 51 (the surface of the cover 50) facing upward, the screen may not be completely horizontal. Therefore, in other embodiments, three or more foot portions may be provided on the back surface of the cover 50. In that case, with the screen of the LCD 51 facing upward, the foot portions contact the floor surface, so the terminal device 8 can be placed with the screen horizontal. The terminal device 8 may also be placed horizontally by adding a detachable foot portion.

Functions corresponding to the game program are appropriately assigned to the operation keys 54A to 54L. For example, the cross key 54A and the keys 54E to 54H may be used for direction-indicating operations or selection operations, and the keys 54B to 54E may be used for decision operations, cancel operations, and the like.
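One plausible realization of this per-game assignment is a lookup table that the game program installs at start-up, so the same physical keys serve different roles in different games. The key identifiers and actions below are illustrative assumptions only.

    #include <map>

    // Illustrative stand-ins for the operation keys and for game actions.
    enum class Key { CrossUp, B, E, F, G, H };
    enum class Action { MoveUp, Jump, Decide, Cancel, SelectNext, SelectPrev };

    // A game program builds its own mapping, overriding these defaults.
    std::map<Key, Action> makeDefaultMapping() {
        return {
            { Key::CrossUp, Action::MoveUp },
            { Key::B,       Action::Jump },
            { Key::E,       Action::Decide },      // right-hand keys 54E to 54H:
            { Key::F,       Action::Cancel },      // decision / selection duties
            { Key::G,       Action::SelectNext },
            { Key::H,       Action::SelectPrev },
        };
    }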

Although not shown, the terminal device 8 has a power key for turning the power of the terminal device 8 on and off. The terminal device 8 may also have a key for turning the screen display of the LCD 51 on and off, a key for setting up (pairing) the connection with the game device 3, and a key for adjusting the volume of the speakers (the speakers 77 shown in Fig. 10).

As shown in (a) of Fig. 32, the terminal device 8 is provided with a marker portion (the marker portion 55 shown in Fig. 10) comprising a marker 55A and a marker 55B on the surface of the cover 50. The marker portion 55 is provided on the upper side of the LCD 51. Like the markers 6R and 6L of the marker device 6, each of the marker 55A and the marker 55B is composed of one or more infrared LEDs. Like the marker device 6 described above, the marker portion 55 is used for the game device 3 to calculate the movement of the controller 5 and the like. Further, the game device 3 can control the lighting of each infrared LED provided in the marker portion 55.
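Since the game device 3 can switch each infrared LED of the marker portion 55 individually, the control data sent to the terminal could carry something as simple as a bitmask. The encoding below is an assumption for illustration only, not the actual format of the control data.

    #include <cstdint>

    // One plausible encoding: each bit of ledMask lights one infrared LED.
    struct MarkerControl {
        std::uint8_t ledMask;  // bit i == 1 -> LED i lit
    };

    bool isLedLit(MarkerControl c, int ledIndex) {
        return (c.ledMask >> ledIndex) & 1u;
    }

    // Example: light only two LEDs (the indices chosen here are arbitrary).
    MarkerControl twoLedsOnly() {
        return MarkerControl{ static_cast<std::uint8_t>((1u << 0) | (1u << 3)) };
    }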

The terminal device 8 includes a camera 56 as an imaging means. The camera 56 includes an imaging element (for example, a CMOS sensor or a CCD sensor) having a predetermined resolution, and a lens. As shown in Fig. 32, in the present modification, the camera 56 is provided on the surface of the cover 50. Therefore, the camera 56 can capture the face of the user holding the terminal device 8; for example, it can capture the user playing a game while viewing the LCD 51.

The terminal device 8 includes a microphone (the microphone 79 shown in Fig. 10) as a sound input means. A microphone hole 50c is provided in the surface of the cover 50. The microphone 79 is provided inside the cover 50 behind the microphone hole 50c. The microphone 79 detects sounds around the terminal device 8, such as the user's voice.

The terminal device 8 includes speakers (the speakers 77 shown in Fig. 10) as a sound output means. As shown in (d) of Fig. 32, speaker holes 57 are provided in the lower side surface of the cover 50. The output sound of the speakers 77 is output from the speaker holes 57. In the present modification, the terminal device 8 has two speakers, and a speaker hole 57 is provided at each of the positions of the left speaker and the right speaker.

Further, the terminal device 8 includes an expansion connector 58 for connecting another device to the terminal device 8. In the present modification, as shown in (d) of Fig. 32, the expansion connector 58 is provided on the lower side surface of the cover 50. The other device connected to the expansion connector 58 may be any device, such as a controller used for a specific game (a gun-type controller or the like) or an input device such as a keyboard. If there is no need to connect another device, the expansion connector 58 need not be provided.

Regarding the terminal device 8 shown in Fig. 32, the shapes of the operation keys and the cover 50, the number of components, the installation positions, and the like are merely examples, and other shapes, numbers, and installation positions may be used.

As described above, in the above modification, the two foot portions 59A and 59B provided on the left and right sides of the back surface of the cover 50 serve as the protruding portion. In this case, as in the above embodiment, the user can easily hold the terminal device 8 with the ring fingers or middle fingers hooked on the lower surfaces of the protruding portions (see Fig. 33). Further, as in the above embodiment, since the second L key 54K and the second R key 54L are provided on the upper surfaces of the protruding portions, the user can easily operate these keys in that state.

In the above embodiment and modification, the protruding portion is preferably provided on the back side of the cover, above the center of the cover, and protruding at least at left and right positions. In this arrangement, when the user grips the left and right sides of the cover, the fingers can hook on the protruding portion, and the terminal device can be held easily. Further, since the protruding portion is provided on the upper side, the user can also support the cover with the palms (see Fig. 10 and the like), and the operation device can be gripped firmly.

The protruding portion need not be provided above the center of the cover. For example, when operation portions are provided on the left and right of the display portion, the protruding portion may be provided at a position where it can be hooked by any finger other than the thumbs while the user grips the cover in such a way that both thumbs can operate the respective operation portions. In this way, too, the user can easily hold the terminal device by hooking the fingers on the protruding portion.

Figs. 34 and 35 are views showing the external configuration of a terminal device according to another modification of the above embodiment. Fig. 34 is a right side view of the terminal device, and Fig. 35 is a plan view. The terminal device 9 shown in Figs. 34 and 35 is the same as the terminal device 7 of the above embodiment except that the convex portions 230a and 230b are provided. Hereinafter, the configuration of the terminal device 9 in the present modification will be described, focusing on the differences from the above embodiment.

The convex portions 230a and 230b have a convex cross section and are provided on the left and right sides of the back surface of the cover 50. Here, the convex portion 230a is provided on the left side of the cover 50 (the left side when viewed from the front side), and the convex portion 230b is provided on the right side of the cover 50 (the right side when viewed from the front side). As shown in Fig. 35, the convex portions 230a and 230b are provided at the left and right edges (both ends) of the cover 50. Further, each of the convex portions 230a and 230b is provided below the protruding portion (the eaves portion 59). Each of the convex portions 230a and 230b is provided at an interval from the protruding portion. That is, in the cover 50, the portion between each of the convex portions 230a and 230b and the protruding portion is thinner than either of them. Each of the convex portions 230a and 230b extends in the up-and-down direction and has a convex cross section in the plane perpendicular to the up-and-down direction.

In the present modification, by gripping the convex portions 230a and 230b with the little fingers (and ring fingers), the user can hold the terminal device 9 firmly. That is, the convex portions 230a and 230b function as grip portions. The convex portions (grip portions) may have any shape, but forming them to extend in the up-and-down direction makes the terminal device 9 easy to hold, which is preferable. The height of each of the convex portions 230a and 230b may also be any height, and they may be formed lower than the protruding portion. In that case, when the terminal device 9 is placed with the screen of the LCD 51 facing upward, the lower side of the screen is lower than the upper side, so the terminal device 9 can be placed in an easily viewable state. Further, since the convex portions 230a and 230b are provided at an interval from the protruding portion, the user can hold the terminal device 9 with the fingers resting against the lower surface of the protruding portion, and the convex portions do not get in the way of the fingers. As described above, according to this modification, providing the convex portions below the protruding portion allows the user to hold the terminal device firmly. In other embodiments, the protruding portion need not be provided on the back surface of the cover 50; in that case, the user can hold the cover 50 firmly by means of the convex portions (grip portions). The surfaces of the convex portions (grip portions) may also be made of a material that does not slip easily, to further enhance the gripping function. Even when there are no convex portions, a material that does not slip easily may be used on the back surface of the cover.

(Modification of the device using this configuration)

In the above embodiment, a terminal device used together with a stationary game device was described as an example. However, the configuration of the operation device described in this specification can be applied to any device that the user can hold. For example, the operation device may be realized as an information terminal such as a portable game machine, a mobile phone, a smartphone, or an e-book reader.

The present invention has been described in detail above, but the foregoing description is in all respects illustrative and is not intended to limit the scope of the invention. Various modifications and variations are of course possible without departing from the scope of the invention.

(Industrial applicability)

As described above, the present invention aims to provide an operation device (terminal device) and the like that can be easily held by a user, and can be applied, for example, to a game system.

1. . . Game system

2. . . TV

2a. . . Speaker

3. . . Game device

4. . . Disc

5. . . Controller

6. . . Marker device

6L. . . Marker

6R. . . Marker

7. . . Terminal device

8. . . Terminal device

9. . . Terminal device

10. . . CPU

11. . . System LSI

11a. . . Input/output processor

11b. . . GPU

11c. . . DSP

11d. . . VRAM

11e. . . Internal main memory

12. . . External main memory

13. . . ROM/RTC

14. . . Disc drive

15. . . AV-IC

16. . . AV connector

17. . . Flash memory

18. . . Network communication module

19. . . Controller communication module

20. . . Expansion connector

21. . . Memory card connector

22. . . antenna

23. . . antenna

24. . . Power button

25. . . Reset button

26. . . Eject button

27. . . Codec LSI

28. . . Terminal communication module

29. . . antenna

30. . . Substrate

31. . . Cover

31a. . . Sound hole

32. . . Operation portion

32a. . . Cross key

32b. . . 1 key

32c. . . 2 key

32d. . . A key

32e. . . Minus key

32f. . . Home button

32g. . . Plus key

32h. . . Power button

32i. . . B key

33. . . Connector

33a. . . Locking hole

34a. . . LED

34b. . . LED

34c. . . LED

34d. . . LED

35. . . Imaging information calculation unit

35a. . . Light incident surface

36. . . Communication unit

37. . . Acceleration sensor

38. . . Infrared filter

39. . . lens

40. . . Camera element

41. . . Image processing circuit

42. . . Microcomputer

43. . . Memory

44. . . Wireless module

45. . . antenna

46. . . Vibrator

47. . . Speaker

48. . . Rotary sensor

50. . . Cover

50a. . . Locking hole

50b. . . Locking hole

50c. . . Microphone hole

51. . . LCD

52. . . Touch panel

53. . . Analog rocker

53A. . . Analog rocker

53B. . . Analog rocker

54. . . Operation key

54A to 54M. . . Operation key (button)

55‧‧‧Marker portion

55A‧‧‧ marker

55B‧‧‧ marker

56‧‧‧ camera

57‧‧‧ Speaker hole

58‧‧‧Expansion connector

59‧‧‧Eaves portion

59a‧‧‧Locking hole

59b‧‧‧Locking hole

60‧‧‧ stylus

60a‧‧‧Accommodation hole

61‧‧‧Cover portion

62‧‧‧Sound output terminal

63‧‧‧Acceleration sensor

63‧‧‧ window

64‧‧‧ knob

65a‧‧‧ hole

65b‧‧‧ hole

66‧‧‧Charging terminal

67‧‧‧Battery cover

69‧‧‧Microphone

71‧‧‧Touch Panel Controller

72‧‧‧Magnetic sensor

73‧‧‧Acceleration sensor

74‧‧‧Rotary sensor

75‧‧‧User interface controller

76‧‧‧ Codec LSI

77‧‧‧ Speaker

78‧‧‧Sound IC

79‧‧‧ microphone

80‧‧‧Wireless Module

81‧‧‧Antenna

82‧‧‧Infrared communication module

83‧‧‧Flash memory

84‧‧‧Power IC

85‧‧‧Battery

86‧‧‧Charger

87‧‧‧CPU

88‧‧‧Internal memory

90‧‧‧ game program

91‧‧‧Received data

92‧‧‧Controller operation data

93‧‧‧1st operation key data

94‧‧‧1st acceleration data

95‧‧‧1st angular velocity data

96‧‧‧Marker coordinate data

97‧‧‧ terminal operation data

98‧‧‧2nd operation key data

99. . . Rocker data

100. . . Touch location data

101. . . 2nd acceleration data

102. . . 2nd angular velocity data

103. . . Azimuth data

104. . . Camera image data

105. . . Microphone sound data

106. . . Processing data

107. . . Control data

108. . . Controller posture data

109. . . Terminal posture data

110. . . Image identification data

111. . . Sound identification data

121. . . Darts

122. . . Control surface

123. . . Target

124. . . Stylus

131. . . cannon

132. . . Shell

133. . . Target

141. . . Batter (batter object)

142. . . Pitcher (pitcher object)

143. . . cursor

151. . . Aircraft (aircraft object)

152. . . cannon

153. . . Target

154. . . Sight

160. . . Player

161. . . Character

162. . . Golf club

162a. . . Club head

163. . . ball

164. . . Image (head image)

190. . . network

191. . . External device

200. . . Input device

200a. . . First grip

200b. . . Second grip

201. . . 1st key

202. . . 2nd key

203. . . 3rd key

204. . . Rocker

205. . . Support

205a. . . Claw

205b. . . Claw

205c. . . Claw

206. . . Connecting member

207. . . 4th key

208. . . Window

209. . . Connection

210. . . support

211. . . Support member

211a. . . Wall

211b. . . Groove

212. . . Charging terminal

213a. . . Guide member

213b. . . Guide member

220. . . Input device

230. . . Convex

230a. . . Convex

230b. . . Convex

FIG. 1 is an external view of the game system 1.

Fig. 2 is a block diagram showing the internal structure of the game device 3.

Fig. 3 is a perspective view showing the appearance of the controller 5.

Fig. 4 is a perspective view showing the appearance of the controller 5.

Fig. 5 is a view showing the internal configuration of the controller 5.

Fig. 6 is a view showing the internal configuration of the controller 5.

Fig. 7 is a block diagram showing the configuration of the controller 5.

Fig. 8 is a view showing the appearance of the terminal device 7.

Fig. 9 is a view showing the appearance of the terminal device 7.

Fig. 10 is a view showing a state in which the user holds the terminal device 7 laterally.

Fig. 11 is a view showing a state in which the user holds the terminal device 7 laterally.

Fig. 12 is a view showing a state in which the user holds the terminal device 7 in the longitudinal direction.

Fig. 13 is a view showing a state in which the user holds the terminal device 7 in the longitudinal direction.

Fig. 14 is a block diagram showing the internal configuration of the terminal device 7.

Fig. 15 is a view showing an example in which an attachment device (input device 200) is attached to the terminal device 7.

Fig. 16 is a view showing an example in which an attachment device (input device 200) is attached to the terminal device 7.

Fig. 17 is a view showing another example of the input device.

Fig. 18 is a view showing the appearance of the input device 220 shown in Fig. 17 attached to the terminal device 7.

Fig. 19 is a view showing the appearance of the input device 220 shown in Fig. 17 attached to the terminal device 7.

Fig. 20 is a view showing another example of attaching the attachment device (bracket 210) to the terminal device 7.

Fig. 21 is a diagram showing various data used in the game processing.

Fig. 22 is a main flowchart showing the flow of the game processing executed in the game device 3.

Fig. 23 is a flowchart showing the detailed flow of the game control processing.

Fig. 24 is a view showing the screen of the television 2 and the terminal device 7 in the first game example.

Fig. 25 is a view showing the screen of the television 2 and the terminal device 7 in the second game example.

Fig. 26 is a view showing an example of a television game image displayed on the television 2 in the third game example.

Fig. 27 is a view showing an example of a terminal game image displayed on the terminal device 7 in the third game example.

Fig. 28 is a view showing an example of a television game image displayed on the television 2 in the fourth game example.

Fig. 29 is a view showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example.

Fig. 30 is a view showing the appearance of the game system 1 in the fifth game example.

Fig. 31 is a view showing the connection relationship of the devices included in the game system 1 when connected to an external device via a network.

Fig. 32 is a view showing an external configuration of a terminal device according to a modification of the embodiment.

Fig. 33 is a view showing a state in which the user holds the terminal device shown in Fig. 32.

Fig. 34 is a view showing the appearance of a terminal device according to another modification of the embodiment.

Fig. 35 is a view showing an external configuration of a terminal device according to another modification of the embodiment.

50. . . Cover

50a. . . Locking hole

50b. . . Locking hole

50c. . . Microphone hole

51. . . LCD

52. . . Touch panel

53A. . . Analog rocker

53B. . . Analog rocker

54A to 54M. . . Operation key (button)

55A. . . Marker

55B. . . Marker

56. . . camera

57. . . Speaker hole

58. . . Expansion connector

59. . . Eaves portion

59a. . . Locking hole

59b. . . Locking hole

60. . . Stylus

60a. . . Storage hole

61. . . Cover

62. . . Magnetic sensor

63. . . Acceleration sensor

64. . . Rotary sensor

65a. . . hole

65b. . . hole

66. . . Charging terminal

Claims (23)

1. An operation device to be operated by a user, comprising: a cover having a substantially plate shape; a display portion provided on the surface side of the cover; and a protruding portion provided on the back side of the cover, located above the center of the cover and protruding at least at left and right positions; wherein a first locking hole to which an additional device different from the operation device can be locked is provided on the lower surface of the protruding portion.
2. The operation device according to claim 1, further comprising a first operation portion and a second operation portion provided on the left and right sides of the display portion, above the center of the cover.
  3. The operation device according to claim 2, wherein the protruding portion is provided in a region including a position opposite to the first operation portion and the second operation portion.
4. The operation device according to claim 2, further comprising: a fifth operation portion provided on the front surface of the cover, below the first operation portion; and a sixth operation portion provided on the front surface of the cover, below the second operation portion.
5. An operation device comprising: a cover having a substantially plate shape; a display portion provided on the surface side of the cover; a first operation portion and a second operation portion provided on the left and right sides of the display portion, respectively; and a protruding portion provided on the back side of the cover at a position where it can be hooked by any finger other than the thumbs when the user grips the cover so that the first operation portion and the second operation portion can be operated by the thumbs of both hands; wherein a first locking hole to which an additional device different from the operation device can be locked is provided on the lower surface of the protruding portion.
  6. The operation device according to claim 5, wherein the protrusion is provided in a region including a position opposite to the first operation portion and the second operation portion.
7. The operation device according to claim 5, further comprising: a fifth operation portion provided on the front surface of the cover, below the first operation portion; and a sixth operation portion provided on the front surface of the cover, below the second operation portion.
8. An operation device comprising: a cover having a substantially plate shape; a display portion provided on the surface side of the cover; a protruding portion provided on the back side of the cover and protruding at least at left and right positions; and an operation portion provided on the upper surface of the protruding portion; wherein a first locking hole to which an additional device different from the operation device can be locked is provided on the lower surface of the protruding portion.
9. An operation device to be operated by a user, comprising: a cover having a substantially plate shape; a display portion provided on the surface side of the cover; grip portions provided on the left and right sides of the back surface of the cover so as to extend in the vertical direction, each having a convex cross section; and a protruding portion provided on the back side of the cover and protruding at least at left and right positions; wherein a first locking hole to which an additional device different from the operation device can be locked is provided on the lower surface of the protruding portion.
10. The operation device according to claim 9, wherein the protruding portion is provided on the back side of the cover, above the grip portions, and protrudes at least at left and right positions.
11. An information processing device of a tablet type, comprising: a cover having a substantially plate shape; a display portion provided on the surface side of the cover; and a protruding portion provided on the back side of the cover, located above the center of the cover and protruding at least at left and right positions; wherein a first locking hole to which an additional device different from the information processing device can be locked is provided on the lower surface of the protruding portion.
12. The operation device according to claim 1 or 5, further comprising a third operation portion and a fourth operation portion provided on the left and right sides of the upper surface of the protruding portion.
13. The operation device according to claim 1 or 5, wherein the protruding portion has an eaves-like shape extending in the left-right direction.
14. The operation device according to claim 1 or 5, wherein a second locking hole to which the additional device can be locked is provided on the lower side surface of the cover.
15. The operation device according to claim 1 or 5, wherein convex portions each having a convex cross section are provided on the left and right sides of the back surface of the cover.
16. The operation device according to claim 15, wherein the convex portions and the protruding portion are provided at an interval from each other.
17. The operation device according to claim 1 or 5, further comprising grip portions provided on the left and right sides of the back surface of the cover.
18. The operation device according to claim 1 or 5, wherein a seventh operation portion and an eighth operation portion are provided on the left and right sides of the upper side surface of the cover.
19. The operation device according to claim 1, 5, 8 or 9, further comprising a touch panel provided on the screen of the display portion.
20. The operation device according to claim 1, further comprising an inertial sensor provided inside the cover.
21. The operation device according to claim 1, further comprising: a communication unit that wirelessly transmits operation data representing operations performed on the operation device to a game device and wirelessly receives image data transmitted from the game device; and a display control unit that displays the received image data on the display portion.
22. The operation device according to claim 1, further comprising: a game processing unit that executes game processing based on operations performed on the operation device; and a display control unit that generates a game image according to the game processing and displays the game image on the display portion.
23. The operation device according to any one of claims 1, 5, 8 and 9, wherein the display portion has a screen of 5 inches or more.
TW100126152A 2010-11-01 2011-07-25 Controller device and information processing device TWI442963B (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
JP2010245298 2010-11-01
JP2010245299A JP4798809B1 (en) 2010-11-01 2010-11-01 Display device, game system, and game processing method
JP2011092506 2011-04-18
JP2011092612A JP6103677B2 (en) 2010-11-01 2011-04-19 Game system, operation device, and game processing method
JP2011102834A JP5837325B2 (en) 2010-11-01 2011-05-02 Operating device and operating system
JP2011103706A JP6005908B2 (en) 2010-11-01 2011-05-06 Equipment support system and support device
JP2011103705 2011-05-06
JP2011103704A JP6005907B2 (en) 2010-11-01 2011-05-06 Operating device and operating system
JP2011118488A JP5936315B2 (en) 2010-11-01 2011-05-26 Information processing system and information processing apparatus

Publications (2)

Publication Number Publication Date
TW201220109A TW201220109A (en) 2012-05-16
TWI442963B true TWI442963B (en) 2014-07-01

Family

ID=46518614

Family Applications (2)

Application Number Title Priority Date Filing Date
TW100126151A TWI440496B (en) 2010-11-01 2011-07-25 Controller device and controller system
TW100126152A TWI442963B (en) 2010-11-01 2011-07-25 Controller device and information processing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
TW100126151A TWI440496B (en) 2010-11-01 2011-07-25 Controller device and controller system

Country Status (3)

Country Link
CN (7) CN202355829U (en)
AU (2) AU2011213764B2 (en)
TW (2) TWI440496B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8814686B2 (en) 2010-02-03 2014-08-26 Nintendo Co., Ltd. Display device, game system, and game method
US8684842B2 (en) 2010-02-03 2014-04-01 Nintendo Co., Ltd. Display device, game system, and game process method
US8913009B2 (en) 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US8339364B2 (en) 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
JP6243586B2 (en) 2010-08-06 2017-12-06 任天堂株式会社 Game system, game device, game program, and game processing method
US10150033B2 (en) 2010-08-20 2018-12-11 Nintendo Co., Ltd. Position calculation system, position calculation device, storage medium storing position calculation program, and position calculation method
JP5840386B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 Game system, game device, game program, and game processing method
JP5840385B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 Game system, game device, game program, and game processing method
KR101364826B1 (en) 2010-11-01 2014-02-20 닌텐도가부시키가이샤 Operating apparatus and operating system
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system
JP5689014B2 (en) 2011-04-07 2015-03-25 任天堂株式会社 Input system, information processing apparatus, information processing program, and three-dimensional position calculation method
US20130326104A1 (en) * 2012-06-01 2013-12-05 Nvidia Corporation Methodology for using smartphone and mobile computer in a mobile compute environment
TWI487554B (en) * 2013-02-06 2015-06-11 Univ Southern Taiwan Sci & Tec Game machine control method
CN105302232A (en) 2014-06-04 2016-02-03 振桦电子股份有限公司 Tablet computer having detachable handle
JP6341568B2 (en) * 2014-08-05 2018-06-13 アルプス電気株式会社 Coordinate input device
US20180101247A1 (en) * 2016-10-06 2018-04-12 Htc Corporation System and method for detecting hand gesture
US20180188816A1 (en) * 2017-01-04 2018-07-05 Htc Corporation Controller for finger gesture recognition and method for recognizing finger gesture
CN108031111A (en) * 2017-12-29 2018-05-15 安徽科创智慧知识产权服务有限公司 Have wireless and wired connection handle system concurrently

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2231856T3 (en) * 1996-03-05 2005-05-16 Sega Enterprises, Ltd. Controller and expansion unit for the controller.
US6966837B1 (en) * 2001-05-10 2005-11-22 Best Robert M Linked portable and video game systems
WO2003007117A2 (en) * 2001-07-12 2003-01-23 Friedman Gary L Portable, hand-held electronic input device and combination with a personal digital device
US6773349B2 (en) * 2002-07-31 2004-08-10 Intec, Inc. Video game controller with integrated video display
US20060252537A1 (en) * 2005-04-21 2006-11-09 Wen-An Wu Portable wireless control apparatus
JP4778267B2 (en) * 2005-05-16 2011-09-21 任天堂株式会社 Game machine operating device and portable game machine
TWM278452U (en) * 2005-06-03 2005-10-21 Weistech Technology Co Ltd Game controlling handle having a display device
AU2009221762B2 (en) * 2008-03-07 2013-06-27 Milwaukee Electric Tool Corporation Visual inspection device
US8384680B2 (en) * 2008-12-23 2013-02-26 Research In Motion Limited Portable electronic device and method of control
CN201572520U (en) * 2009-12-23 2010-09-08 周建正 Three-in-one support for game consoles
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system

Also Published As

Publication number Publication date
TW201219093A (en) 2012-05-16
TWI440496B (en) 2014-06-11
CN102600611B (en) 2015-03-11
CN202355829U (en) 2012-08-01
TW201220109A (en) 2012-05-16
CN202398092U (en) 2012-08-29
AU2011213764A1 (en) 2012-05-17
CN102600612A (en) 2012-07-25
CN102600614B (en) 2015-11-25
AU2011213765A1 (en) 2012-05-17
CN102600614A (en) 2012-07-25
CN102600612B (en) 2015-12-02
CN202398095U (en) 2012-08-29
AU2011213764B2 (en) 2013-10-24
CN102600611A (en) 2012-07-25
AU2011213765B2 (en) 2013-07-11
CN202355827U (en) 2012-08-01

Similar Documents

Publication Publication Date Title
US8550915B2 (en) Game controller with adapter duplicating control functions
US9733702B2 (en) Video game using dual motion sensing controllers
ES2711609T3 (en) Video game controller and video game system
US8882596B2 (en) Game program and game apparatus
US8409003B2 (en) Game controller and game system
EP1900406B1 (en) Game device and storage medium storing game program
JP5188682B2 (en) Game device, game program, game system, and game control method
US8308563B2 (en) Game system and storage medium having game program stored thereon
US7831064B2 (en) Position calculation apparatus, storage medium storing position calculation program, game apparatus, and storage medium storing game program
JP5430246B2 (en) Game device and game program
JP5330640B2 (en) Game program, game device, game system, and game processing method
US9345962B2 (en) Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
JP5361349B2 (en) Information processing apparatus, computer program, information processing system, and information processing method
CN102198330B (en) Game system
US8797264B2 (en) Image processing apparatus and storage medium storing image processing program
JP5131809B2 (en) Game device and game program
US9199168B2 (en) Game system, game apparatus, storage medium having game program stored therein, and game process method
EP2422854B1 (en) Game system, game device, storage medium storing game program, and game process method
JP2009064449A (en) Controller and computer system
JP2009172010A (en) Information processing program and information processor
AU2011204816B8 (en) Display device, game system, and game process method
JP5506129B2 (en) Game program, game device, game system, and game processing method
CN102462960B (en) Controller device and controller system
JP5520457B2 (en) Game device and game program
US10150033B2 (en) Position calculation system, position calculation device, storage medium storing position calculation program, and position calculation method