KR20130020715A - Operating apparatus and operating system - Google Patents


Info

Publication number
KR20130020715A
Authority
KR
South Korea
Prior art keywords
game
operation
terminal device
device
data
Prior art date
Application number
KR1020130014536A
Other languages
Korean (ko)
Inventor
Ken-ichiro Ashida
Yoshitomo Goto
Takanori Okamura
Junji Takamoto
Masato Ibuki
Shinji Yamamoto
Hitoshi Tsuchiya
Fumiyoshi Suetake
Akiko Suga
Naoya Yamamoto
Daisuke Kumazaki
Original Assignee
Nintendo Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2010-245299 (patent JP4798809B1)
Priority to JP2010-245298
Priority to JP2011-092506
Priority to JP2011-092612 (patent JP6103677B2)
Priority to JP2011-102834 (patent JP5837325B2)
Priority to JP2011-103704 (patent JP6005907B2)
Priority to JP2011-103705
Priority to JP2011-103706 (patent JP6005908B2)
Priority to JP2011-118488 (patent JP5936315B2)
Application filed by Nintendo Co., Ltd.
Publication of KR20130020715A

Classifications

    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/92: Video game devices specially adapted to be hand-held while playing
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • A63F2009/2402: Input by manual operation
    • A63F2300/1043: Input arrangements for converting player-generated signals into game device control signals, characterized by constructional details
    • A63F2300/105: Input arrangements for converting player-generated signals into game device control signals, using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F2300/1075: Input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface, using a touch screen

Abstract

The present invention provides an operation apparatus that can be easily gripped by a user. The terminal device 7 includes a substantially plate-shaped housing 50, an LCD 51 provided on the front surface side of the housing, analog sticks 53A and 53B, a second L button 54K, and a second R button 54L. The analog sticks 53A and 53B are provided on the left and right sides of the LCD 51, above the center of the housing 50. The second L button 54K and the second R button 54L are provided on the rear surface side of the housing 50, at positions opposite the analog sticks 53A and 53B, respectively.

Description

Operating device and operating system {OPERATING APPARATUS AND OPERATING SYSTEM}

The present invention relates to an operation apparatus that a player can grip and operate.

Conventionally, there are operation apparatuses that a player holds in hand (for example, see Patent Document 1). For example, the portable game device described in Patent Document 1 is a folding type, and operation buttons are provided on the lower housing. According to this game device, the user can perform game operations using the operation buttons provided on both sides of the screen while looking at the screen, and can easily perform game operations while holding the game device.

Patent Document 1: Japanese Patent No. 3703473

In recent years, portable terminal devices (operation devices) have come to have larger screens and the like, and the devices themselves have also become larger. If a device that is to be held by the user becomes large, it may become difficult to hold.

Therefore, an object of the present invention is to provide an operation apparatus that a user can easily hold.

In order to solve the above problem, the present invention adopts the configurations (1) to (21) described below.

(1) An example of the present invention is an operation device including a housing, a display portion, a first operation portion and a second operation portion, and a third operation portion and a fourth operation portion. The display portion is provided on the front surface side of the housing. The first operation portion and the second operation portion are provided on the left and right sides of the display portion, respectively, above the center of the housing. The third operation portion and the fourth operation portion are provided on the rear surface side of the housing at positions opposite the first operation portion and the second operation portion, respectively.

The above-mentioned "operation part" may be any as long as it is an operation device which can be operated by a user, For example, it is a stick (analog stick), a key (button), a touch panel, a touch pad, etc. in embodiment mentioned later.

The above-mentioned "position on the opposite side" does not mean to be strictly limited to the state in which the positions of the two operation portions coincide, and the case where the area where the operation portion is provided on the front surface side of the housing is projected on the back surface side of the housing It is the meaning also including the state in which the area | region in which an operation part is provided in the back surface side, and the projected area overlap in part.

According to the configuration of (1), the first and second operation portions and the third and fourth operation portions are disposed at positions facing each other on the front side and the rear side of the housing, so that the user can grip the housing so as to pinch it from the front side and the rear side when operating these operation portions. Moreover, when operating these operation portions, the user grips the portion of the housing above its center in the up-down direction, so that the operation device can be held from its upper side and supported with the palms. Therefore, the user can stably hold the operation device in a state in which at least the four operation portions can be operated. That is, according to the configuration of (1), it is possible to provide an operation device that the user can easily grip and that has good operability.

(2) The operation device may further include a protruding portion. The protruding portion is formed so as to protrude at least at left and right positions on the rear surface side of the housing. In this case, the third operation portion and the fourth operation portion are disposed on the upper surface of the protruding portion.

According to the configuration of (2), since the protruding portion is formed on the rear surface side of the housing, the user can grip the operation device while supporting the protruding portion with the middle finger, the ring finger, or the like (with the protruding portion caught on the fingers) when operating each of the above operation portions. As a result, the user can hold the operation device in a stable state without getting tired.

(3) The protruding portion may have an eaves-like shape extending in the left-right direction.

According to the configuration of (3), the user can hold the operation device with the fingers supporting the protruding portion along its lower surface, so that the operation device becomes easier to grip. Further, since the protruding portion is formed to extend left and right, when the user grips the operation device so that the protruding portion is oriented vertically, fingers other than the thumb can be placed against the protruding portion regardless of where along one side of the operation device the device is held. Therefore, even when the operation device is gripped so that the protruding portion is oriented vertically, the user can reliably grip it.

(4) A first locking hole to which an additional device separate from the operation device can be locked may be provided in the lower surface of the protruding portion.

According to the configuration of (4), the operation device and the additional device can be firmly connected using the first locking hole. In addition, when the configuration of (3) and the configuration of (4) are combined, the first locking hole can be formed near the center of the operation device in the left-right direction, so that the left-right balance is maintained and the additional device can be connected stably.

(5) A second locking hole to which the additional device can be locked may be provided in the lower surface of the housing.

According to the configuration of (5), since the operating device and the additional device are connected by using the first locking hole and the second locking hole formed at different positions, the connection can be made more secure.

(6) The operation device may further include a convex portion having a convex cross section at both the left and right sides of the rear side of the housing under the protruding portion.

According to the configuration of (6), the user can grip the housing with a finger (for example, the ring finger or the little finger) resting on the convex portion, so that the operation device can be held more reliably.

(7) The projections and the convex portions may be formed at intervals.

According to the configuration of (7), the user can support the protruding portion with the middle finger, the ring finger, or the like without interfering with the convex portion, and can hold the operation device with another finger resting on the convex portion. This makes the operation device easier to grip.

(8) The operation device may further include grip portions provided on both the left and right sides on the rear surface of the housing.

According to the configuration of (8), the user can grip the housing with a finger (for example, the ring finger or the little finger) resting on the grip portion, so that the operation device can be held more reliably.

(9) The first operation portion and the second operation portion may each be a direction input portion having a movable member that can be slid or tilted.

According to the configuration of (9), the user can easily perform the direction input by operating the direction input unit with the thumb while holding the left and right sides of the operation apparatus. Thereby, the operation apparatus with good operability can be provided.

(10) The 3rd operation part and the 4th operation part may be keys which can be pressed, respectively.

According to the configuration of (10), the user can easily press the key by the index finger or the middle finger while holding the left and right sides of the operating device. Thereby, the operation apparatus with good operability can be provided.

(11) The operation device may further include a fifth operation portion and a sixth operation portion. The fifth operation portion is disposed below the first operation portion on the front surface of the housing. The sixth operation portion is disposed below the second operation portion on the front surface of the housing.

According to the configuration of (11), more various operations are possible by using the operation device. Further, even when the fifth operation unit and the sixth operation unit are operated, the user can reliably grip the operation device, so that the operation device with good operability can be provided.

(12) The fifth operation part may be a key capable of inputting at least four directions in up, down, left, and right directions, and the sixth operation part may include a plurality of pushable keys.

According to the configuration of (12), the user can easily press the key by the thumb in a state in which both the left and right sides of the operating device are gripped. Thereby, the operation apparatus with good operability can be provided.

(13) The operation device may further include a seventh operation portion and an eighth operation portion that are respectively provided on the left and right sides on the upper surface of the housing.

According to the above configuration (13), more various operations can be performed using the operation device. In addition, since the operation portion is disposed on the upper surface of the housing, the user can reliably grip the operating device by wrapping the housing from the front side, the upper side, and the rear side of the housing.

(14) The seventh and eighth operation portions may be keys that can be pressed respectively.

According to the configuration of (14), the user can easily press the keys with the index fingers while holding the operation device. Thereby, an operation device with good operability can be provided.

(15) The operation device may further include a touch panel provided on the screen of the display unit.

According to the configuration of (15), the user can intuitively and easily operate on the image displayed on the display unit using the touch panel. In addition, when the configuration of (2) and the configuration of (15) are combined, the operation device, when placed with the display unit facing upward, rests in a slightly inclined state due to the protruding portion. Therefore, it becomes easy to operate the touch panel with the operation device placed on a surface.

(16) The operation device may further include an inertial sensor inside the housing.

According to the configuration of (16), operations of swinging or moving the operation device itself become possible, and the user can perform intuitive and easy operations using the operation device. In addition, since the operation device is assumed to be moved while in use, it becomes important to firmly connect the operation device and an additional device when they are connected. Therefore, in the configuration of (16), it is particularly effective to adopt the configuration of (4) or (5) so as to firmly connect the operation device and the additional device.

(17) The operation device may further include a communication unit for wirelessly transmitting operation data indicating an operation performed on the device to the game device.

According to the configuration of (17), the user can easily grip and perform game operation using an operation device having good operability.

(18) The communication unit may receive image data transmitted from the game device. At this time, the operation device further includes a display control unit which displays the received image data on the display unit.

According to the configuration of (18), since the image transmitted from the game apparatus is displayed on the display unit, the user can perform game operation while watching the image displayed on the display unit of the operating apparatus.

(19) The operation device may further include a game processing unit and a display control unit. The game processing unit executes game processing based on operations performed on the operation device itself. The display control unit generates a game image based on the result of the game processing and displays it on the display unit.

According to the configuration of (19), the portable game device can be easily gripped and can be made to have good operability.

(20) The display portion may have a screen of 5 inches or more.

According to the configuration of (20), it is possible to display an image that is easy to see and impressive using a large screen. In addition, when a display unit with a large screen is used as in the configuration of (20), the operation device itself inevitably becomes large, so the configurations of (1) to (19) described above, which allow the user to grip the device easily, are particularly effective.

(21) In addition, another example of the present invention may be an operation system including the operation device described in (5) above and an additional device. The additional device includes hook portions that can be locked to the first and second locking holes, respectively, and is connected to the operation device by the hook portions being locked to the first and second locking holes.

According to the configuration of the above (21), it is possible to provide an operating system including a rigidly connected operating device and additional devices.

(22) In addition, another example of the present invention may be an operation system including the operation device described in the above (5) and a support device. The support device has a guide member and a support member. The guide member can be inserted into the second locking hole. In addition, the support member supports the rear surface of the housing at a predetermined angle when the guide member is inserted into the second locking hole.

According to the configuration of (22), it is possible to provide an operation system capable of placing the operation device at a predetermined angle. In addition, since the second locking hole is used for positioning when connecting the operation device and the support device, the number of holes formed in the housing of the operation device can be reduced, so that the shape of the housing can be kept simple.

Moreover, another example of the present invention is an operation device to be operated by a user, and may be an operation device including a substantially plate-shaped housing, a display unit provided on the front surface side of the housing, and a protruding portion. The protruding portion is formed on the rear surface side of the housing, above the center of the housing, so as to protrude at least at left and right positions.

Another example of the present invention may be an operating device including a substantially plate-shaped housing, a display portion provided on the surface side of the housing, a first operating portion, a second operating portion, and a projection. The first operation unit and the second operation unit are provided on the left and right sides of the display unit, respectively. The projection is formed at a position that can be caught by one finger other than the thumb when the user grips the housing so that the user can operate the first operation unit and the second operation unit with the thumbs of both hands.

Moreover, another example of the present invention is an operation device to be operated by a user, and may be an operation device including a substantially plate-shaped housing, a display unit provided on the front surface side of the housing, and convex portions. The convex portions are formed on the left and right sides of the rear surface side of the housing, extend in the vertical direction, and have a convex cross section.

Another example of the present invention may be an operating device including a substantially plate-like housing, a display portion provided on the surface side of the housing, a projection portion, and an operation portion. The protruding portion is formed to protrude in at least left and right positions on the rear surface side of the housing. The operation portion is provided on the upper side of the protrusion.

According to the present invention, the first and second operation portions are provided on the left and right sides above the center on the front surface side of the housing, and the third and fourth operation portions are provided on the rear surface side of the housing at positions opposite the first and second operation portions. As a result, the user can easily hold the operation device.

FIG. 1 is an external view of the game system 1.
FIG. 2 is a block diagram showing an internal configuration of the game device 3.
FIG. 3 is a perspective view showing an external configuration of the controller 5.
FIG. 4 is a perspective view showing an external configuration of the controller 5.
FIG. 5 is a diagram showing the internal structure of the controller 5.
FIG. 6 is a diagram showing the internal structure of the controller 5.
FIG. 7 is a block diagram showing the configuration of the controller 5.
FIG. 8 is a diagram showing an external configuration of the terminal device 7.
FIG. 9 is a diagram showing an external configuration of the terminal device 7.
FIG. 10 is a diagram showing a state in which a user grips the terminal device 7 in the horizontal direction.
FIG. 11 is a diagram showing a state in which a user grips the terminal device 7 in the horizontal direction.
FIG. 12 is a diagram showing a state in which a user grips the terminal device 7 in the vertical direction.
FIG. 13 is a diagram showing a state in which a user grips the terminal device 7 in the vertical direction.
FIG. 14 is a block diagram showing an internal configuration of the terminal device 7.
FIG. 15 is a diagram showing an example in which an additional device (input device 200) is attached to the terminal device 7.
FIG. 16 is a diagram showing an example in which an additional device (input device 200) is attached to the terminal device 7.
FIG. 17 is a diagram showing another example of the input device.
FIG. 18 is a diagram showing a state in which the input device 220 shown in FIG. 17 is mounted on the terminal device 7.
FIG. 19 is a diagram showing a state in which the input device 220 shown in FIG. 17 is mounted on the terminal device 7.
FIG. 20 is a diagram showing another example in which an additional device (stand 210) is connected to the terminal device 7.
FIG. 21 is a diagram showing various data used in game processing.
FIG. 22 is a main flowchart showing a flow of game processing executed in the game device 3.
FIG. 23 is a flowchart showing a detailed flow of game control processing.
FIG. 24 is a diagram showing the screens of the television 2 and the terminal device 7 in the first game example.
FIG. 25 is a diagram showing the screens of the television 2 and the terminal device 7 in the second game example.
FIG. 26 is a diagram showing an example of a television game image displayed on the television 2 in the third game example.
FIG. 27 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the third game example.
FIG. 28 is a diagram showing an example of a television game image displayed on the television 2 in the fourth game example.
FIG. 29 is a diagram showing an example of a terminal game image displayed on the terminal device 7 in the fourth game example.
FIG. 30 is a diagram showing a use state of the game system 1 in the fifth game example.
FIG. 31 is a diagram showing the connection relationship between devices included in the game system 1 when connected to an external device via a network.
FIG. 32 is a diagram showing an external configuration of a terminal device according to a modification of the present embodiment.
FIG. 33 is a diagram showing a state in which a user grips the terminal device shown in FIG. 32.
FIG. 34 is a diagram showing an external configuration of a terminal device according to another modification of the present embodiment.
FIG. 35 is a diagram showing an external configuration of a terminal device according to another modification of the present embodiment.

[1. Overall configuration of the game system]

Hereinafter, a game system 1 according to one embodiment of the present invention will be described with reference to the drawings. FIG. 1 is an external view of the game system 1. In FIG. 1, the game system 1 includes a stationary display device (hereinafter referred to as "television") 2 typified by a television receiver or the like, a stationary game device 3, an optical disk 4, a controller 5, a marker device 6, and a terminal device 7. The game system 1 executes game processing in the game device 3 on the basis of game operations performed using the controller 5, and displays the game image obtained by the game processing on the television 2 and/or the terminal device 7.

In the game device 3, an optical disk 4, which is an example of an information storage medium used exchangeably with respect to the game device 3, is detachably inserted. An information processing program (typically, a game program) to be executed by the game device 3 is stored on the optical disk 4. An insertion opening for the optical disk 4 is formed in the front surface of the game device 3. The game device 3 executes game processing by reading and executing the information processing program stored on the optical disk 4 inserted into the insertion opening.

The television 2 is connected to the game device 3 via a connection cord. The television 2 displays the game image obtained by the game process performed in the game device 3. The television 2 has a speaker 2a (FIG. 2), and the speaker 2a outputs a game sound obtained as a result of the game process. In another embodiment, the game device 3 and the stationary display device may be integrated. In addition, the communication between the game device 3 and the television 2 may be wireless communication.

A marker device 6 is provided in the periphery of the screen of the television 2 (above the screen in FIG. 1). Although details will be described later, the user (player) can perform game operations by moving the controller 5, and the marker device 6 is used by the game device 3 to calculate the movement, position, or attitude of the controller 5. The marker device 6 has two markers 6R and 6L at both ends thereof. The marker 6R (and likewise the marker 6L) is specifically composed of one or more infrared LEDs (Light Emitting Diodes), and outputs infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, and the game device 3 can control the lighting of each infrared LED included in the marker device 6. The marker device 6 is also portable, and the user can install it in any position. Although FIG. 1 shows the marker device 6 installed on top of the television 2, the position and orientation in which the marker device 6 is installed are arbitrary.

The controller 5 provides the game device 3 with operation data indicating the content of the operations performed on it. The controller 5 and the game device 3 can communicate by wireless communication. In the present embodiment, Bluetooth (registered trademark) technology, for example, is used for the wireless communication between the controller 5 and the game device 3. In another embodiment, the controller 5 and the game device 3 may be connected by wire. In the present embodiment, the game system 1 includes one controller 5, but the game device 3 can communicate with a plurality of controllers, so that a plurality of players can play by using a predetermined number of controllers simultaneously. The detailed configuration of the controller 5 will be described later.

The terminal device 7 is large enough to be grasped by the user, and the user can move the terminal device 7 while holding it in hand, or place the terminal device 7 in any position. Although its detailed configuration will be described later, the terminal device 7 includes a liquid crystal display (LCD) 51, which is a display means, and input means (a touch panel 52, a gyro sensor 74, and the like, described later). The terminal device 7 and the game device 3 can communicate wirelessly (or by wire). The terminal device 7 receives data of an image (for example, a game image) generated by the game device 3 from the game device 3, and displays the image on the LCD 51. Although an LCD is used as the display device in the present embodiment, the terminal device 7 may have any other display device, such as a display device using EL (Electro Luminescence), for example. In addition, the terminal device 7 transmits to the game device 3 operation data indicating the content of the operations performed on it.

[2. Internal structure of the game device 3]

Next, the internal configuration of the game device 3 will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the internal configuration of the game device 3. The game device 3 includes a central processing unit (CPU) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disk drive 14, an AV-IC 15, and the like.

The CPU 10 executes game processing by executing the game program stored on the optical disk 4, and functions as a game processor. The CPU 10 is connected to the system LSI 11. In addition to the CPU 10, the external main memory 12, the ROM/RTC 13, the disk drive 14, and the AV-IC 15 are connected to the system LSI 11. The system LSI 11 performs processing such as controlling data transfer between the components connected to it, generating images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 will be described later. The volatile external main memory 12 stores programs such as the game program read from the optical disk 4 or a game program read from the flash memory 17, stores various data, and is used as a work area and buffer area of the CPU 10. The ROM/RTC 13 has a ROM (a so-called boot ROM) in which a startup program for the game device 3 is embedded, and a clock circuit (RTC: Real Time Clock) for counting time. The disk drive 14 reads program data, texture data, and the like from the optical disk 4, and writes the read data to the internal main memory 11e or the external main memory 12 described later.

The system LSI 11 includes an input/output processor (I/O processor) 11a, a graphics processing unit (GPU) 11b, a digital signal processor (DSP) 11c, a video RAM (VRAM) 11d, and an internal main memory 11e. Although not shown, these components 11a to 11e are connected to each other by an internal bus.

The GPU 11b forms a part of the drawing means and generates an image in accordance with a graphics command (drawing command) from the CPU 10. The VRAM 11d stores data (such as polygon data and texture data) necessary for the GPU 11b to execute graphics commands. When an image is generated, the GPU 11b creates the image data using the data stored in the VRAM 11d. In the present embodiment, the game device 3 generates both the game image displayed on the television 2 and the game image displayed on the terminal device 7. In the following, the game image displayed on the television 2 may be called the "television game image", and the game image displayed on the terminal device 7 may be called the "terminal game image".

The DSP 11c functions as an audio processor, and generates audio data using sound data and sound waveform (tone) data stored in the internal main memory 11e or the external main memory 12. In the present embodiment, as with the game images, both the game sound output from the speaker of the television 2 and the game sound output from the speaker of the terminal device 7 are generated. In the following, the game sound output from the television 2 may be referred to as "television game sound", and the game sound output from the terminal device 7 may be referred to as "terminal game sound".

As described above, among the images and audio generated by the game device 3, the data of the images and audio output by the television 2 are read by the AV-IC 15. The AV-IC 15 outputs the read image data to the television 2 via the AV connector 16 and outputs the read audio data to the speaker 2a incorporated in the television 2. Thereby, an image is displayed on the television 2, and sound is output from the speaker 2a.

In addition, among the images and sounds generated by the game device 3, the data of the images and sounds output by the terminal device 7 are transmitted to the terminal device 7 by the input / output processor 11a or the like. Data transmission to the terminal device 7 by the input / output processor 11a or the like will be described later.

The input / output processor 11a performs data transmission / reception with the components connected thereto or downloads data from an external device. The input / output processor 11a is connected to a flash memory 17, a network communication module 18, a controller communication module 19, an expansion connector 20, a memory card connector 21, and a codec LSI 27. In addition, an antenna 22 is connected to the network communication module 18. An antenna 23 is connected to the controller communication module 19. The codec LSI 27 is connected to the terminal communication module 28, and an antenna 29 is connected to the terminal communication module 28.

The game device 3 can connect to a network such as the Internet and communicate with external information processing devices (for example, other game devices or various servers). That is, the input/output processor 11a can connect to a network such as the Internet through the network communication module 18 and the antenna 22, and communicate with external information processing devices connected to the network. The input/output processor 11a periodically accesses the flash memory 17 and detects whether there is data that needs to be transmitted to the network; if there is such data, it transmits the data to the network via the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from an external information processing device or data downloaded from a download server through the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. By executing the game program, the CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. In addition to data transmitted and received between the game device 3 and external information processing devices, the flash memory 17 may store save data (result data or progress data) of a game played using the game device 3. A game program may also be stored in the flash memory 17.
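The periodic flash-memory check described above can be pictured with a minimal sketch. This is illustrative only and not part of the patent; the function names (nextPendingUpload, sendToNetwork) and the polling structure are assumptions.

```cpp
// Minimal illustrative sketch (not from the patent): the input/output processor
// periodically checks the flash memory for data that needs to be sent and
// forwards it to the network. All names here are hypothetical.
#include <cstdint>
#include <optional>
#include <vector>

// Hypothetical stand-in for reading one pending item from the flash memory 17.
std::optional<std::vector<uint8_t>> nextPendingUpload() {
    return std::nullopt;  // placeholder: nothing queued in this sketch
}

// Hypothetical stand-in for the network communication module 18 and antenna 22.
void sendToNetwork(const std::vector<uint8_t>& data) {
    (void)data;           // actual transmission is outside this sketch
}

// Called periodically by the input/output processor 11a.
void pollFlashAndTransmit() {
    while (auto data = nextPendingUpload()) {  // detect presence of data to send
        sendToNetwork(*data);                  // send it to the network
    }
}
```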

In addition, the game device 3 can receive operation data from the controller 5. That is, the input/output processor 11a receives the operation data transmitted from the controller 5 through the antenna 23 and the controller communication module 19, and stores (temporarily stores) it in the buffer area of the internal main memory 11e or the external main memory 12.

In addition, the game device 3 can transmit and receive data such as images and audio to and from the terminal device 7. When transmitting a game image (terminal game image) to the terminal device 7, the input/output processor 11a outputs the data of the game image generated by the GPU 11b to the codec LSI 27. The codec LSI 27 performs predetermined compression processing on the image data from the input/output processor 11a. The terminal communication module 28 performs wireless communication with the terminal device 7. Therefore, the image data compressed by the codec LSI 27 is transmitted by the terminal communication module 28 to the terminal device 7 via the antenna 29. In the present embodiment, the image data transmitted from the game device 3 to the terminal device 7 is used for a game, and if a delay occurs in the displayed image during a game, the operability of the game is adversely affected. Therefore, it is preferable that as little delay as possible arises in the transmission of the image data from the game device 3 to the terminal device 7. Accordingly, in the present embodiment, the codec LSI 27 compresses the image data using a high-efficiency compression technique such as, for example, the H.264 standard. Another compression technique may be used, or, when the communication speed is sufficient, the image data may be transmitted without compression. The terminal communication module 28 is, for example, a Wi-Fi certified communication module, and may perform high-speed wireless communication with the terminal device 7 using, for example, the MIMO (Multiple Input Multiple Output) technology adopted in the IEEE 802.11n standard, or another communication method may be used.
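As a rough illustration of the transmit path just described (a GPU-generated frame, compression by the codec LSI, wireless transmission by the terminal communication module), here is a minimal sketch. The data structures and function names are assumptions; the placeholder "compression" simply copies bytes rather than performing real H.264 encoding.

```cpp
// Illustrative sketch only: the flow of compressing a terminal game image and
// sending it wirelessly. The codec and radio interfaces are hypothetical
// stand-ins, not the patent's actual implementation.
#include <cstdint>
#include <vector>

struct Frame {                       // raw game image produced by the GPU
    int width;
    int height;
    std::vector<uint8_t> pixels;     // e.g. RGBA bytes
};

// Hypothetical stand-in for the codec LSI: a high-efficiency (H.264-class) encoder.
std::vector<uint8_t> compressFrame(const Frame& frame) {
    // Real hardware would run H.264-style encoding here; this placeholder copies.
    return frame.pixels;
}

// Hypothetical stand-in for the terminal communication module (802.11n-class link).
void wirelessSend(const std::vector<uint8_t>& payload) {
    (void)payload;                   // transmission itself is outside this sketch
}

// One iteration of the transmit path: compress first to keep latency low, then send.
void sendTerminalGameImage(const Frame& frame) {
    wirelessSend(compressFrame(frame));
}

int main() {
    Frame frame{320, 240, std::vector<uint8_t>(320 * 240 * 4, 0)};
    sendTerminalGameImage(frame);
    return 0;
}
```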

The game device 3 also transmits audio data to the terminal device 7 in addition to the image data. That is, the input/output processor 11a outputs the audio data generated by the DSP 11c to the terminal communication module 28 through the codec LSI 27. The codec LSI 27 performs compression processing on the audio data as it does on the image data. Any compression method may be used for the audio data, but a method with a high compression ratio and little audio degradation is preferable. In another embodiment, the audio data may be transmitted without compression. The terminal communication module 28 transmits the compressed image data and audio data to the terminal device 7 via the antenna 29.

In addition to the image data and audio data, the game device 3 transmits various control data to the terminal device 7 as necessary. The control data is data indicating control instructions for the components included in the terminal device 7, for example, an instruction for controlling the lighting of the marker unit (the marker unit 55 shown in FIG. 10), or an instruction for controlling the imaging of the camera (the camera 56). The input/output processor 11a transmits the control data to the terminal device 7 in accordance with instructions from the CPU 10. Regarding this control data, the codec LSI 27 does not perform data compression processing in the present embodiment, but it may perform compression processing in other embodiments. The data transmitted from the game device 3 to the terminal device 7 may or may not be encrypted as necessary.

In addition, the game device 3 can receive various data from the terminal device 7. Although details will be described later, in the present embodiment, the terminal device 7 transmits operation data, image data, and audio data. Each piece of data transmitted from the terminal device 7 is received by the terminal communication module 28 via the antenna 29. Here, the image data and audio data from the terminal device 7 have been subjected to compression processing similar to that performed on the image data and audio data sent from the game device 3 to the terminal device 7. Therefore, this image data and audio data are sent from the terminal communication module 28 to the codec LSI 27, decompressed by the codec LSI 27, and output to the input/output processor 11a. On the other hand, the operation data from the terminal device 7 has a small data amount compared with images and audio, so it does not need to be compressed; it also may or may not be encrypted as necessary. Therefore, the operation data is received by the terminal communication module 28 and then output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily stores) the data received from the terminal device 7 in the buffer area of the internal main memory 11e or the external main memory 12.
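The asymmetry described above, where incoming image and audio data are decompressed while the small operation data is passed through unchanged, might be sketched as follows. The packet structure and names are assumptions for illustration only.

```cpp
// Illustrative sketch only: image/audio data received from the terminal device is
// decompressed by the codec stage, while the small operation data is passed
// through as-is. The packet layout here is an assumption.
#include <cstdint>
#include <vector>

enum class PacketType : uint8_t { Image, Audio, Operation };

struct Packet {
    PacketType type;
    std::vector<uint8_t> payload;
};

// Hypothetical stand-in for the codec LSI's decompression step.
std::vector<uint8_t> decompress(const std::vector<uint8_t>& data) {
    return data;   // placeholder: real hardware would decode here
}

// Route one received packet toward the input/output processor.
std::vector<uint8_t> handleTerminalPacket(const Packet& packet) {
    switch (packet.type) {
    case PacketType::Image:
    case PacketType::Audio:
        return decompress(packet.payload);   // compressed media: decode first
    case PacketType::Operation:
    default:
        return packet.payload;               // small operation data: no decompression
    }
}
```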

In addition, the game device 3 can be connected to other devices and external storage media. That is, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI. A medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector can be connected to the expansion connector 20, so that communication with a network can be performed in place of the network communication module 18. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card. For example, the input/output processor 11a can access an external storage medium through the expansion connector 20 or the memory card connector 21 to store data in or read data from the external storage medium.

The game device 3 is provided with a power button 24, a reset button 25 and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned ON, power is supplied to each component of the game device 3 from an external power supply by an AC adapter (not shown). When the reset button 25 is pressed, the system LSI 11 restarts the start program of the game device 3. The eject button 26 is connected to the disk drive 14. When the eject button 26 is pressed, the optical disk 4 is ejected from the disk drive 14.

In another embodiment, some of the components included in the game device 3 may be configured as an expansion device separate from the game device 3. In this case, the expansion device may be connected to the game device 3 via the expansion connector 20, for example. Specifically, the expansion device may include the codec LSI 27, the terminal communication module 28, and the antenna 29, and may be detachable from the expansion connector 20. With this arrangement, by connecting the expansion device to a game device that does not include these components, that game device can be made capable of communicating with the terminal device 7.

[3. Configuration of Controller 5]

Next, the controller 5 will be described with reference to FIGS. 3 to 7. FIG. 3 and FIG. 4 are perspective views showing the external configuration of the controller 5. FIG. 3 is a perspective view of the controller 5 seen from the upper rear side, and FIG. 4 is a perspective view of the controller 5 seen from the lower front side.

In FIGS. 3 and 4, the controller 5 has a housing 31 formed by, for example, plastic molding. The housing 31 has a substantially rectangular parallelepiped shape whose longitudinal direction is the front-rear direction (the Z-axis direction shown in FIG. 3), and as a whole is a size that can be gripped by one hand of an adult or a child. The user can perform game operations by pressing the buttons provided on the controller 5 and by moving the controller 5 itself to change its position or attitude (tilt).

The housing 31 is provided with a plurality of operation buttons. As shown in FIG. 3, a cross button 32a, a first button 32b, a second button 32c, an A button 32d, a minus button 32e, a home button 32f, a plus button 32g, and a power button 32h are provided on the upper surface of the housing 31. In this specification, the upper surface of the housing 31 on which these buttons 32a to 32h are provided may be called the "button surface". On the other hand, as shown in FIG. 4, a recessed portion is formed in the lower surface of the housing 31, and a B button 32i is provided on the rear-side inclined surface of the recessed portion. To each of these operation buttons 32a to 32i, a function according to the information processing program executed by the game device 3 is appropriately assigned. The power button 32h is for remotely turning the power of the game device 3 main body on and off. The home button 32f and the power button 32h have their top surfaces recessed into the upper surface of the housing 31. This prevents the user from accidentally pressing the home button 32f or the power button 32h.

The connector 33 is provided in the rear surface of the housing 31. The connector 33 is used to connect another device (for example, another sensor unit or controller) to the controller 5. In addition, locking holes 33a are formed on both sides of the connector 33 on the rear surface of the housing 31 to prevent the other device from easily detaching.

A plurality of LEDs 34a to 34d (four in FIG. 3) are provided in the rear part of the upper surface of the housing 31. Here, a controller type (number) is assigned to the controller 5 to distinguish it from other controllers. The LEDs 34a to 34d are used for purposes such as notifying the user of the controller type currently set for the controller 5 and notifying the user of the remaining battery level of the controller 5. Specifically, when a game operation is performed using the controller 5, one of the plurality of LEDs 34a to 34d lights up according to the controller type.

In addition, the controller 5 has an imaging information calculation section 35 (FIG. 6), and as shown in FIG. 4, a light incident surface 35a of the imaging information calculation section 35 is provided on the front surface of the housing 31. The light incident surface 35a is made of a material that transmits at least the infrared light from the markers 6R and 6L.

Between the first button 32b and the home button 32f on the upper surface of the housing 31, a sound emission hole 31a for emitting sound from the speaker 47 (FIG. 5) built into the controller 5 to the outside is formed.

Next, the internal structure of the controller 5 will be described with reference to FIGS. 5 and 6. FIGS. 5 and 6 are diagrams showing the internal structure of the controller 5. FIG. 5 is a perspective view showing a state in which the upper housing (a part of the housing 31) of the controller 5 is removed. FIG. 6 is a perspective view showing a state in which the lower housing (a part of the housing 31) of the controller 5 is removed. The perspective view shown in FIG. 6 shows the substrate 30 shown in FIG. 5 as seen from its back surface.

In FIG. 5, the substrate 30 is fixed inside the housing 31, and the operation buttons 32a to 32h, the LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, a speaker 47, and the like are provided on the upper main surface of the substrate 30. These are connected to a microcomputer 42 (see FIG. 6) by wiring (not shown) formed on the substrate 30 and the like. In the present embodiment, the acceleration sensor 37 is disposed at a position shifted from the center of the controller 5 in the X-axis direction. This makes it easier to calculate the movement of the controller 5 when the controller 5 is rotated about the Z axis. The acceleration sensor 37 is also disposed forward of the center of the controller 5 in the longitudinal direction (Z-axis direction). The controller 5 functions as a wireless controller by means of a wireless module 44 (FIG. 6) and the antenna 45.

In FIG. 6, the imaging information calculation section 35 is provided at the front edge of the lower main surface of the substrate 30. The imaging information calculation section 35 includes an infrared filter 38, a lens 39, an imaging device 40, and an image processing circuit 41 in order from the front of the controller 5. These members 38 to 41 are each mounted on the lower main surface of the substrate 30.

In addition, the microcomputer 42 and a vibrator 46 are provided on the lower main surface of the substrate 30. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 by wiring formed on the substrate 30 and the like. The controller 5 vibrates when the vibrator 46 is operated according to instructions from the microcomputer 42. This makes it possible to realize a so-called vibration-enabled game in which the vibration is transmitted to the hand of the user holding the controller 5. In the present embodiment, the vibrator 46 is disposed slightly toward the front of the housing 31. That is, since the vibrator 46 is disposed toward the end rather than the center of the controller 5, the vibration of the vibrator 46 can vibrate the whole controller 5 strongly. The connector 33 is mounted at the rear edge of the lower main surface of the substrate 30. In addition to what is shown in FIGS. 5 and 6, the controller 5 includes a crystal oscillator for generating the basic clock of the microcomputer 42, an amplifier for outputting an audio signal to the speaker 47, and the like.

Note that the shape of the controller 5 shown in FIGS. 3 to 6, the shapes of the operation buttons, and the numbers and installation positions of the acceleration sensors and vibrators are merely examples, and other shapes, numbers, and installation positions may be used. In the present embodiment, the imaging direction of the imaging means is the Z-axis positive direction, but the imaging direction may be any direction. That is, the imaging information calculation section 35 in the controller 5 (the light incident surface 35a of the imaging information calculation section 35) does not have to be on the front surface of the housing 31, and may be provided on another surface as long as light can enter from outside the housing.

FIG. 7 is a block diagram showing the configuration of the controller 5. As shown in FIG. 7, the controller 5 includes an operation unit 32 (the operation buttons 32a to 32i), the imaging information calculation section 35, a communication unit 36, the acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits data indicating the content of the operations performed on it to the game device 3 as operation data. In the following, the operation data transmitted by the controller 5 may be called "controller operation data", and the operation data transmitted by the terminal device 7 may be called "terminal operation data".

The operation unit 32 includes the operation buttons 32a to 32i described above, and outputs operation button data indicating the input state of each of the operation buttons 32a to 32i (whether each operation button 32a to 32i has been pressed) to the microcomputer 42 of the communication unit 36.
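One simple way to picture the operation button data described above is a bit field with one bit per button 32a to 32i. The encoding below is an illustrative assumption, not the patent's actual data format.

```cpp
// Illustrative sketch: a hypothetical packing of the pressed/not-pressed state
// of operation buttons 32a-32i into button data (names and layout are assumptions).
#include <cstdint>
#include <cstdio>

enum Button : uint16_t {    // one bit per operation button
    BUTTON_CROSS = 1 << 0,  // 32a
    BUTTON_1     = 1 << 1,  // 32b
    BUTTON_2     = 1 << 2,  // 32c
    BUTTON_A     = 1 << 3,  // 32d
    BUTTON_MINUS = 1 << 4,  // 32e
    BUTTON_HOME  = 1 << 5,  // 32f
    BUTTON_PLUS  = 1 << 6,  // 32g
    BUTTON_POWER = 1 << 7,  // 32h
    BUTTON_B     = 1 << 8,  // 32i
};

bool isPressed(uint16_t buttonData, Button b) { return (buttonData & b) != 0; }

int main() {
    uint16_t buttonData = BUTTON_A | BUTTON_B;  // sample state reported by the operation unit
    std::printf("A pressed: %d\n", isPressed(buttonData, BUTTON_A));
    return 0;
}
```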

The imaging information calculation section 35 is a system for analyzing image data captured by the imaging means, determining an area with high brightness in the image, and calculating the center-of-gravity position, size, and the like of that area. Since the imaging information calculation section 35 has a sampling frequency of, for example, up to about 200 frames per second, it can track and analyze even relatively fast movements of the controller 5.

The imaging information calculating unit 35 includes an infrared filter 38, a lens 39, an imaging device 40, and an image processing circuit 41. The infrared filter 38 passes only infrared rays from the light incident from the front of the controller 5. The lens 39 collects the infrared rays transmitted through the infrared filter 38 and makes them enter the imaging device 40. The imaging device 40 is a solid-state imaging device such as, for example, a CMOS sensor or a CCD sensor; it receives the infrared rays collected by the lens 39 and outputs an image signal. Here, the marker portion 55 of the terminal device 7 and the marker device 6, which are the imaging targets, are constituted by markers that output infrared light. Therefore, by providing the infrared filter 38, the imaging device 40 receives only the infrared rays that pass through the infrared filter 38 to generate image data, so that the imaging targets (the marker portion 55 and/or the marker device 6) can be imaged more accurately. Hereinafter, the image picked up by the imaging device 40 is called a picked-up image. The image data generated by the imaging device 40 is processed by the image processing circuit 41. The image processing circuit 41 calculates the position of the imaging target in the picked-up image and outputs coordinates indicating the calculated position to the microcomputer 42 of the communication unit 36. The data of these coordinates is transmitted to the game device 3 as operation data by the microcomputer 42. Hereinafter, these coordinates are called "marker coordinates." Since the marker coordinates change in correspondence with the direction (tilt angle) or position of the controller 5 itself, the game device 3 can calculate the direction or position of the controller 5 using the marker coordinates.
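As a concrete illustration of the processing described above, the sketch below computes the centroid of the high-brightness region of an infrared image, which corresponds to one marker coordinate. This is a minimal sketch and not the actual logic of the image processing circuit 41; the image layout, threshold value, and function names are assumptions for illustration only.

```c
#include <stdint.h>

/* Hypothetical sketch: find the centroid ("marker coordinate") of the
 * high-brightness region in a captured infrared image. */
typedef struct { int x; int y; int found; } MarkerCoord;

MarkerCoord find_marker(const uint8_t *pixels, int width, int height,
                        uint8_t threshold)
{
    long sum_x = 0, sum_y = 0, count = 0;

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            if (pixels[y * width + x] >= threshold) {  /* bright pixel */
                sum_x += x;
                sum_y += y;
                count++;
            }
        }
    }

    MarkerCoord m = { 0, 0, 0 };
    if (count > 0) {                 /* centroid of the bright region */
        m.x = (int)(sum_x / count);
        m.y = (int)(sum_y / count);
        m.found = 1;
    }
    return m;
}
```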

In another embodiment, the controller 5 may not include the image processing circuit 41, and the picked-up image itself may be transmitted from the controller 5 to the game device 3. In this case, the game device 3 may have a circuit or program with the same function as the image processing circuit 41 and calculate the marker coordinates itself.

The acceleration sensor 37 detects the acceleration (including gravitational acceleration) of the controller 5, that is, the force (including gravity) applied to the controller 5. The acceleration sensor 37 detects, among the accelerations applied to its detection unit, the value of the acceleration (linear acceleration) in the linear direction along each sensing axis. For example, in the case of a multi-axis acceleration sensor of two or more axes, the component of acceleration along each axis is detected as the acceleration applied to the detection unit of the acceleration sensor. In this embodiment, the acceleration sensor 37 is a capacitance-type MEMS (Micro Electro Mechanical System) acceleration sensor, but an acceleration sensor of another type may be used.

In this embodiment, the acceleration sensor 37 detects linear acceleration in each of three axis directions with respect to the controller 5: the up-down direction (Y-axis direction shown in FIG. 3), the left-right direction (X-axis direction shown in FIG. 3), and the front-back direction (Z-axis direction shown in FIG. 3). Since the acceleration sensor 37 detects acceleration in the linear direction along each axis, the output from the acceleration sensor 37 represents the value of the linear acceleration for each of the three axes. That is, the detected acceleration is represented as a three-dimensional vector in the XYZ coordinate system (controller coordinate system) set with respect to the controller 5.

Data indicating the acceleration detected by the acceleration sensor 37 (acceleration data) is output to the communication unit 36. Since the acceleration detected by the acceleration sensor 37 changes in correspondence with the direction (tilt angle) or movement of the controller 5 itself, the game device 3 can calculate the direction or movement of the controller 5 using the acquired acceleration data. In the present embodiment, the game device 3 calculates the attitude, tilt angle, and the like of the controller 5 based on the acquired acceleration data.

Those skilled in the art will readily understand from the description in this specification that a computer such as a processor of the game device 3 (for example, the CPU 10) or a processor of the controller 5 (for example, the microcomputer 42) can infer or calculate (determine) additional information about the controller 5 by performing processing based on the acceleration signal output from the acceleration sensor 37 (the same applies to the acceleration sensor 73 described later). For example, when the computer-side processing is executed on the assumption that the controller 5 equipped with the acceleration sensor 37 is in a static state (that is, when processing is executed assuming that the acceleration detected by the acceleration sensor consists only of the gravitational acceleration), and the controller 5 actually is in a static state, it is possible to know whether and how much the attitude of the controller 5 is tilted with respect to the direction of gravity based on the detected acceleration. Specifically, with the state in which the detection axis of the acceleration sensor 37 points vertically downward as a reference, whether the controller 5 is tilted with respect to that reference can be known from whether 1 G (gravitational acceleration) is applied, and how much it is tilted can be known from the magnitude of the detected acceleration. In the case of the multi-axis acceleration sensor 37, it is possible to know in more detail how much the controller 5 is tilted with respect to the direction of gravity by further processing the acceleration signal of each axis. In this case, the processor may calculate the tilt angle of the controller 5 based on the output from the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle. In this manner, by using the acceleration sensor 37 in combination with the processor, the tilt angle or attitude of the controller 5 can be determined.
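By way of illustration only, the sketch below estimates pitch and roll angles from a single acceleration reading under the assumption stated above that the controller is at rest, so that the measured acceleration is essentially the gravity vector. The axis and sign conventions are assumptions and do not necessarily match the actual controller coordinate system.

```c
#include <math.h>

/* Minimal sketch, assuming the controller is at rest so the measured
 * acceleration is approximately the gravity vector. Axis and sign
 * conventions here are illustrative assumptions. */
typedef struct { float x, y, z; } Vec3;

/* Estimated tilt angles (in radians) of the controller relative to gravity. */
void tilt_from_static_accel(Vec3 a, float *pitch, float *roll)
{
    /* Pitch: rotation about the X axis; roll: rotation about the Z axis. */
    *pitch = atan2f(a.z, sqrtf(a.x * a.x + a.y * a.y));
    *roll  = atan2f(a.x, sqrtf(a.y * a.y + a.z * a.z));
}
```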

On the other hand, when it is assumed that the controller 5 is in a dynamic state (a state in which the controller 5 is being moved), the acceleration sensor 37 detects acceleration according to the movement of the controller 5 in addition to the gravitational acceleration, so the direction of movement of the controller 5 can be known by removing the gravitational acceleration component from the detected acceleration by a predetermined process. Even when it is assumed that the controller 5 is in a dynamic state, the tilt of the controller 5 with respect to the direction of gravity can be known by removing the component of acceleration due to the movement of the acceleration sensor from the detected acceleration by a predetermined process. In another embodiment, the acceleration sensor 37 may include an embedded processing device, or another type of dedicated processing device, for performing predetermined processing on the acceleration signal detected by the built-in acceleration detection means before outputting it to the microcomputer 42. The embedded or dedicated processing device may, for example, convert the acceleration signal into a tilt angle (or another desirable variable) when the acceleration sensor 37 is used to detect static acceleration (for example, gravitational acceleration).
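One common form that the "predetermined process" mentioned above could take is a simple low-pass filter: the slowly varying part of the acceleration signal is treated as gravity and subtracted, leaving the motion component. The sketch below is an assumption for illustration, not the device's actual processing; the smoothing constant is arbitrary.

```c
/* Minimal sketch, assuming a first-order low-pass filter is used to
 * estimate the gravity component of the acceleration signal. ALPHA is an
 * arbitrary smoothing constant chosen for illustration. */
typedef struct { float x, y, z; } AccelVec;

#define ALPHA 0.9f

static AccelVec gravity_est = { 0.0f, 0.0f, 0.0f };

/* Returns the acceleration due to the controller's own movement. */
AccelVec remove_gravity(AccelVec a)
{
    AccelVec motion;

    /* Slowly track the gravity component of the signal. */
    gravity_est.x = ALPHA * gravity_est.x + (1.0f - ALPHA) * a.x;
    gravity_est.y = ALPHA * gravity_est.y + (1.0f - ALPHA) * a.y;
    gravity_est.z = ALPHA * gravity_est.z + (1.0f - ALPHA) * a.z;

    /* Subtract the estimated gravity to obtain the motion component. */
    motion.x = a.x - gravity_est.x;
    motion.y = a.y - gravity_est.y;
    motion.z = a.z - gravity_est.z;
    return motion;
}
```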

The gyro sensor 48 detects angular velocities around three axes (in this embodiment, the X, Y, and Z axes). In this specification, with the imaging direction of the controller 5 (the Z-axis positive direction) as a reference, the rotation direction around the X axis is called the pitch direction, the rotation direction around the Y axis the yaw direction, and the rotation direction around the Z axis the roll direction. The gyro sensor 48 only needs to be able to detect angular velocities around three axes, and any number and combination of gyro sensors may be used. For example, the gyro sensor 48 may be a three-axis gyro sensor, or a combination of a two-axis gyro sensor and a one-axis gyro sensor may be used to detect the angular velocities around the three axes. Data representing the angular velocities detected by the gyro sensor 48 is output to the communication unit 36. Alternatively, the gyro sensor 48 may detect the angular velocity around only one or two axes.

The communication unit 36 includes a microcomputer 42, a memory 43, a wireless module 44, and an antenna 45. The microcomputer 42 controls the radio module 44 which wirelessly transmits the data acquired by the microcomputer 42 to the game device 3, using the memory 43 as a storage area when performing the processing.

Data output from the operation unit 32, the imaging information calculating unit 35, the acceleration sensor 37, and the gyro sensor 48 to the microcomputer 42 is temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when the timing for transmission to the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44. The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data using, for example, Bluetooth (registered trademark) technology, and radiates the resulting weak radio signal from the antenna 45. That is, the operation data is modulated onto a weak radio signal by the wireless module 44 and transmitted from the controller 5. The weak radio signal is received by the controller communication module 19 on the game device 3 side. By demodulating or decoding the received weak radio signal, the game device 3 can acquire the operation data, and the CPU 10 of the game device 3 performs game processing using the operation data acquired from the controller 5. The wireless transmission from the communication unit 36 to the controller communication module 19 is performed sequentially at predetermined intervals; since game processing is generally performed in units of 1/60 second (one frame time), it is preferable to perform the transmission at a period shorter than this time. The communication unit 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 at a rate of, for example, once every 1/200 second.

As described above, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation button data as operation data representing the operations performed on it, and the game device 3 executes game processing using that operation data as game input. Therefore, by using the controller 5, the user can perform game operations of moving the controller 5 itself, in addition to the conventional general game operation of pressing operation buttons. For example, operations such as tilting the controller 5 to an arbitrary attitude, pointing at an arbitrary position on the screen with the controller 5, and moving the controller 5 itself become possible.
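To make the data content and timing concrete, the following sketch shows a hypothetical layout for the controller operation data listed above and the rough number of packets that arrive within one game frame at the example rates of a 1/200 second transmission period and 1/60 second frames. The field names, types, and sizes are assumptions for illustration only.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical layout of controller operation data; the actual format used
 * by the controller 5 is not specified here. */
typedef struct {
    uint16_t buttons;        /* operation button data (one bit per button) */
    int16_t  accel[3];       /* acceleration data for the X, Y, Z axes     */
    int16_t  gyro[3];        /* angular velocity data (pitch, yaw, roll)   */
    uint16_t marker[2][2];   /* marker coordinate data (up to two markers) */
} ControllerOperationData;

int main(void)
{
    const double send_period  = 1.0 / 200.0;  /* transmission period [s] */
    const double frame_period = 1.0 / 60.0;   /* one game frame [s]      */

    /* Roughly three packets of fresh operation data arrive per frame. */
    printf("packets per frame: %.2f\n", frame_period / send_period);
    printf("packet size (this sketch): %zu bytes\n",
           sizeof(ControllerOperationData));
    return 0;
}
```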

In this embodiment, the controller 5 does not have display means for displaying a game image, but it may have display means for displaying, for example, an image indicating the remaining battery level.

[4. Configuration of Terminal Device 7]

Next, the configuration of the terminal device 7 will be described with reference to FIGS. 8 to 13. FIG. 8 is a plan view showing the external configuration of the terminal device 7: (a) is a front view of the terminal device 7, (b) a top view, (c) a right side view, and (d) a bottom view. FIG. 9 is a rear view of the terminal device 7. FIGS. 10 and 11 illustrate a state in which the user grips the terminal device 7 horizontally, and FIGS. 12 and 13 illustrate a state in which the user grips the terminal device 7 vertically.

As shown in FIG. 8, the terminal device 7 includes a housing 50 having a generally rectangular plate shape. In other words, the terminal device 7 can be said to be a tablet-type information processing device. The housing 50 may have a curved surface or a projection on a part of it, as long as it is generally plate-shaped. The housing 50 is large enough to be gripped by the user, so the user can move while holding the terminal device 7 or change its placement. The length of the terminal device 7 in the longitudinal direction (z-axis direction) is preferably 100 to 150 [mm], and is 133.5 [mm] in this embodiment. The length in the width direction (x-axis direction) is preferably 200 to 250 [mm], and is 228.26 [mm] in this embodiment. The thickness (length in the y-axis direction) of the terminal device 7 is preferably about 15 to 30 [mm] in the plate-shaped portion and about 30 to 50 [mm] including the thickest portion; in this embodiment, the thickness is 23.6 [mm] (40.26 [mm] at the thickest portion). The weight of the terminal device 7 is about 400 to 600 [g], and is 530 [g] in this embodiment. As will be described in detail later, the terminal device 7 is configured so that it can be easily gripped and operated by the user even though it is a relatively large terminal device (operation device).

The terminal device 7 has an LCD 51 on the surface (front side) of the housing 50. The size of the screen of the LCD 51 is preferably 5 inches or more, and here is 6.2 inches. The operation device 7 of this embodiment is easy to operate even with a large LCD because of its configuration, which makes it easy to hold and operate. In another embodiment, a smaller LCD 51 may be provided so that the operation device 7 is relatively small. The LCD 51 is provided near the center of the surface of the housing 50. Accordingly, the user can hold and move the terminal device 7 while viewing the screen of the LCD 51 by gripping the housing 50 on both sides of the LCD 51, as shown in FIGS. 10 and 11. FIGS. 10 and 11 show an example in which the user grips the terminal device 7 horizontally (with the long side horizontal) by holding the housing 50 on the left and right of the LCD 51, but it is also possible to grip the terminal device 7 vertically (with the long side vertical), as shown in FIGS. 12 and 13.

As shown in FIG. 8(a), the terminal device 7 has a touch panel 52 on the screen of the LCD 51 as an operation means. In this embodiment, the touch panel 52 is a resistive touch panel. However, the touch panel is not limited to the resistive type, and any type of touch panel, such as a capacitive type, may be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In this embodiment, a touch panel having the same resolution (detection accuracy) as the resolution of the LCD 51 is used as the touch panel 52. However, the resolution of the touch panel 52 and the resolution of the LCD 51 do not necessarily need to match. Input to the touch panel 52 is normally performed using the touch pen 60, but it is not limited to the touch pen 60; input to the touch panel 52 with the user's finger is also possible. The housing 50 is provided with an accommodating hole 60a for accommodating the touch pen 60 used for operating the touch panel 52 (see FIG. 8(b)). Here, the accommodating hole 60a is formed in the upper surface of the housing 50 so that the touch pen 60 does not fall out, but it may be formed in a side surface or the lower surface. As described above, since the terminal device 7 includes the touch panel 52, the user can operate the touch panel 52 while moving the terminal device 7. That is, the user can make inputs directly on the screen (via the touch panel 52) while moving the screen of the LCD 51.

As shown in FIG. 8, the terminal device 7 includes two analog sticks 53A and 53B and a plurality of buttons (keys) 54A to 54M as operation means (operation units). Each of the analog sticks 53A and 53B is a device capable of indicating a direction. Each analog stick 53A and 53B is configured such that its movable member (stick portion), operated by the user's finger, can slide in any direction (any angle in the up, down, left, and right directions) with respect to the surface of the housing 50. That is, it is a direction input device that may also be called a slide pad. The movable member of each analog stick 53A and 53B may instead be of a type that tilts in an arbitrary direction with respect to the surface of the housing 50. In this embodiment, since analog sticks of the type whose movable member slides are used, the user can operate each analog stick 53A and 53B without moving the thumb greatly, and can operate while gripping the housing 50 more securely. When analog sticks of the type whose movable member tilts are used as the analog sticks 53A and 53B, the degree of input (degree of inclination) is easy for the user to perceive, and detailed operations can be performed more easily.

The left analog stick 53A is provided on the left side of the screen of the LCD 51, and the right analog stick 53B is provided on the right side of the screen of the LCD 51. Therefore, the user can make a direction input using an analog stick with either the left or right hand. As shown in FIGS. 10 and 11, each of the analog sticks 53A and 53B is provided at a position that can be operated while the user grips the left and right portions of the terminal device 7 (the portions on the left and right of the LCD 51), so the user can easily operate the analog sticks 53A and 53B even when holding and moving the terminal device 7.

Each of the buttons 54A to 54L is an operation means (operation unit) for making a predetermined input, and is a key that can be pressed. As shown below, the buttons 54A to 54L are provided at positions that can be operated while the user grips the left and right portions of the terminal device 7 (see FIGS. 10 and 11). Therefore, the user can easily operate these operation means even when holding and moving the terminal device 7.

As shown in FIG. 8(a), the crisscross button (direction input button) 54A and the buttons 54B to 54H and 54M, among the operation buttons 54A to 54L, are provided on the surface of the housing 50. That is, these buttons 54A to 54H and 54M are disposed at positions that can be operated by the user's thumbs (see FIGS. 10 and 11).

The crisscross button 54A is provided on the left side of the LCD 51, below the left analog stick 53A. That is, the crisscross button 54A is disposed at a position that can be operated by the user's left hand. The crisscross button 54A has a cross shape and is a button capable of indicating at least the up, down, left, and right directions.

In addition, the buttons 54B to 54D are provided below the LCD 51. These three buttons 54B to 54D are disposed at positions that can be operated by either hand. The terminal device 7 also has a power button 54M for turning the power of the terminal device 7 on and off. By operating the power button 54M, the power of the game device 3 can also be turned on and off remotely. The power button 54M is provided below the LCD 51, like the buttons 54B to 54D, and to the right of the buttons 54B to 54D. Thus, the power button 54M is disposed at a position that can be operated (is easy to operate) with the right hand. The four buttons 54E to 54H are provided on the right side of the LCD 51, below the right analog stick 53B. That is, the four buttons 54E to 54H are disposed at positions that can be operated by the user's right hand. Furthermore, the four buttons 54E to 54H are arranged in an up, down, left, and right positional relationship (relative to the center position of the four buttons 54E to 54H). Accordingly, the terminal device 7 can also cause the four buttons 54E to 54H to function as buttons with which the user indicates the up, down, left, and right directions.

In this embodiment, each of the analog sticks 53A and 53B is disposed above the crisscross button 54A and the buttons 54E to 54H. Here, the analog sticks 53A and 53B protrude further in the thickness direction (y-axis direction) than the crisscross button 54A and the buttons 54E to 54H. Therefore, if the arrangement of the analog stick 53A and the crisscross button 54A were reversed, the thumb might touch the analog stick 53A when the user operates the crisscross button 54A with the thumb, causing an erroneous operation. The same problem would also occur if the arrangement of the analog stick 53B and the buttons 54E to 54H were reversed. In contrast, in the present embodiment, the analog sticks 53A and 53B are disposed above the crisscross button 54A and the buttons 54E to 54H, so the likelihood that a finger will touch the crisscross button 54A or the buttons 54E to 54H when the user operates the analog sticks 53A and 53B is lower than in the above case. In this way, the possibility of erroneous operation can be reduced, and the operability of the terminal device 7 can be improved. However, in another embodiment, the analog stick 53A may be disposed opposite the crisscross button 54A, or the analog stick 53B may be disposed opposite the buttons 54E to 54H, as necessary.

Here, in this embodiment, some operation units (the analog sticks 53A and 53B, the crisscross button 54A, and the three buttons 54E to 54G) are provided, on the left and right sides of the display unit (LCD 51), above the center of the housing 50 in the up-down direction (y-axis direction). When operating these operation units, the user mainly grips the terminal device 7 above its center in the up-down direction. If the user grips the lower side of the housing 50, the gripped terminal device 7 becomes unstable (particularly when the terminal device 7 is relatively large, as in this embodiment) and difficult to hold. In contrast, in this embodiment, when operating the operation units, the user mainly grips the terminal device 7 above its center in the up-down direction and can support the housing 50 from the sides with the palms. Therefore, the user can hold the housing 50 in a stable state and can easily grip the terminal device 7, so the operation units are also easy to operate. In another embodiment, at least one operation unit may be provided above the center of the housing 50 on each of the left and right sides of the display unit. For example, only the analog sticks 53A and 53B may be provided above the center of the housing 50. Also, for example, when the crisscross button 54A is provided above the left analog stick 53A and the four buttons 54E to 54H are provided above the right analog stick 53B, the crisscross button 54A and the four buttons 54E to 54H may be provided above the center of the housing 50.

In this embodiment, a projection (the cover portion 59) is formed on the rear side of the housing 50 (the side opposite the surface on which the LCD 51 is provided) (see FIG. 8(c) and FIG. 9). As shown in FIG. 8(c), the cover portion 59 is a mountain-shaped member formed so as to protrude from the rear surface of the generally plate-shaped housing 50. The projection has a height (thickness) such that it can be caught on the fingers of the user gripping the rear surface of the housing 50. The height of the projection is preferably 10 to 25 [mm], and is 16.66 [mm] in this embodiment. It is preferable that the lower surface of the projection have an inclination of 45° or more (more preferably 60° or more) with respect to the rear surface of the housing 50 so that the projection is easily caught on the user's fingers. As shown in FIG. 8(c), the lower surface of the projection may be formed so that its inclination angle is larger than that of the upper surface. As shown in FIGS. 10 and 11, by gripping the device with the fingers hooked on the cover portion 59 (with the cover portion 59 resting on the fingers), the user can grip the terminal device 7 in a stable state without getting tired, even though the terminal device 7 is relatively large. That is, the cover portion 59 can be said to be a support member for supporting the housing 50 with the fingers, and can also be called a finger-hook portion.

The cover portion 59 is provided above the center of the housing 50 in the up-down direction. The cover portion 59 is provided at a position roughly opposite the operation units (the analog sticks 53A and 53B) provided on the surface of the housing 50. That is, the projection is provided in a region including the positions opposite the operation units provided on the left and right of the display unit. Therefore, when operating these operation units, the user can hold the terminal device 7 while supporting the cover portion 59 with the middle fingers or ring fingers (see FIGS. 10 and 11). As a result, the terminal device 7 becomes easier to grip, and the operation units also become easier to operate. In this embodiment, since the projection has an eaves-like shape extending to the left and right, the user can hold the terminal device 7 with the middle fingers or ring fingers placed along the lower surface of the projection, which makes the terminal device 7 easier to hold. The cover portion 59 only needs to be formed so that it (its protruding portion) extends in the left-right direction, and is not limited to the shape extending horizontally as shown in FIG. 9. In another embodiment, the cover portion 59 may extend in a direction slightly inclined from the horizontal. For example, the cover portion 59 may be provided so that it inclines upward (or downward) from the left and right ends toward the center.

In this embodiment, the eaves-shaped cover portion 59 is employed as the projection formed on the rear surface of the housing because engagement holes, described later, are formed in the cover portion 59; however, the projection may have any shape. For example, in another embodiment, two projections may be formed on the left and right sides of the rear surface of the housing 50 (with no projection at the center in the left-right direction) (see FIG. 32). In another embodiment, the cross-sectional shape of the projection (the shape in a cross section perpendicular to the x-axis direction) may be a hook shape (a shape whose lower surface is concave) so that the user's fingers can support the terminal device 7 more securely (so that the projection catches on the fingers more securely).

The projection (cover portion 59) may have any width in the up-down direction. For example, the projection may be formed so as to reach the upper side of the housing 50; that is, the upper surface of the projection may be formed at the same position as the upper side surface of the housing 50. In this case, the housing 50 has a two-stage configuration with a thin lower side and a thick upper side. Thus, it is preferable that the housing 50 have downward-facing surfaces (the lower surfaces of the projection) formed on the left and right sides of its rear surface. This allows the user to grip the operation device easily by placing the fingers on those surfaces. The "downward-facing surface" may be formed at any position on the rear surface of the housing 50, but is preferably located above the center of the housing 50.

As shown in FIGS. 8(a), (b), and (c), a first L button 54I and a first R button 54J are provided on the left and right sides, respectively, of the upper surface of the housing 50. In the present embodiment, the first L button 54I and the first R button 54J are provided on the inclined upper end portions (the upper left end portion and the upper right end portion) of the housing 50. Specifically, the first L button 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 and is exposed from both the upper and left side surfaces (in other words, it can be pressed from both the upper side and the left side). The first R button 54J is provided at the right end of the upper side surface of the housing 50 and is exposed from both the upper and right side surfaces. In this manner, the first L button 54I is disposed at a position that can be operated by the user's left index finger, and the first R button 54J at a position that can be operated by the user's right index finger (see FIG. 10). In another embodiment, the operation units provided on the left and right of the upper surface of the housing 50 need not be provided at the left and right ends, and may be provided at positions other than the ends. The operation units may also be provided on the left and right side surfaces of the housing 50.

As shown in FIG. 8(c) and FIG. 9, a second L button 54K and a second R button 54L are disposed on the projection (cover portion 59). The second L button 54K is provided near the left end of the cover portion 59, and the second R button 54L near its right end. In other words, the second L button 54K is provided slightly above the left side of the rear surface of the housing 50 (the left side when viewed from the front), and the second R button 54L slightly above the right side of the rear surface of the housing 50 (the right side when viewed from the front). That is, the second L button 54K is provided at a position (approximately) opposite the left analog stick 53A provided on the surface, and the second R button 54L at a position (approximately) opposite the right analog stick 53B provided on the surface. In this way, the second L button 54K is disposed at a position that can be operated by the user's left index or middle finger, and the second R button 54L at a position that can be operated by the user's right index or middle finger (see FIGS. 10 and 11). The second L button 54K and the second R button 54L are provided on the upper surface of the cover portion 59, as shown in FIG. 8(c). Thus, the second L button 54K and the second R button 54L have button surfaces facing upward (obliquely upward). Since the index or middle finger is considered to move up and down when the user grips the terminal device 7, the buttons are easier to press when their surfaces face upward.

As described above, in the present embodiment, operation units (the analog sticks 53A and 53B) are provided on the left and right of the display unit (LCD 51) above the center of the housing 50, and other operation units (the second L button 54K and the second R button 54L) are provided on the rear side of the housing 50 at positions opposite those operation units. According to this arrangement, since the operation units and the other operation units are disposed at mutually opposing positions on the front and rear sides of the housing 50, the user can grip the housing 50 from the front and rear sides when operating them. Moreover, when operating these operation units, the user grips the housing 50 above its center in the up-down direction, so the terminal device 7 can be held at its upper portion and supported with the palms (see FIGS. 10 and 11). As a result, the user can stably hold the housing 50 while being able to operate at least four operation units, and an operation device (terminal device 7) that is easy for the user to grip and has good operability can be provided.

As described above, in the present embodiment, the user can grip the terminal device 7 comfortably by holding the terminal device 7 in a state where the finger is placed on the lower surface of the protrusion (cover part 59). Further, since the second L button 54K and the second R button 54L are provided on the upper surface of the projection, the user can easily operate these buttons in the above state. The user can easily hold the terminal device 7 by, for example, the following holding method.

That is, as shown in FIG. 10, the user can grip the terminal device 7 with the ring fingers placed against the lower surface of the cover portion 59 (the single-dashed line shown in FIG. 10), supporting the cover portion 59 with the ring fingers. In this case, the user can operate the four buttons (the first L button 54I, the first R button 54J, the second L button 54K, and the second R button 54L) with the index and middle fingers. For example, when the required game operations use many buttons and are relatively complicated, many buttons can be operated easily by gripping the device as shown in FIG. 10. Furthermore, since the analog sticks 53A and 53B are provided above the crisscross button 54A and the buttons 54E to 54H, the user can conveniently operate the analog sticks 53A and 53B with the thumbs when relatively complicated operations are required. In FIG. 10, the user grips the terminal device 7 with the thumbs on the surface of the housing 50, the index fingers on the upper surface of the housing 50, the middle fingers on the upper surface of the cover portion 59 on the rear surface of the housing 50, the ring fingers on the lower surface of the cover portion 59, and the little fingers on the rear surface of the housing 50. In this way, the user can grip the terminal device 7 securely, wrapping the housing 50 from all directions.

In addition, as shown in FIG. 11, the user can also grip the terminal device 7 with the middle fingers placed against the lower surface of the cover portion 59 (the single-dashed line shown in FIG. 11). In this case, the user can easily operate two buttons (the second L button 54K and the second R button 54L) with the index fingers. For example, when the required game operations use few buttons and are relatively simple, the device may be gripped as shown in FIG. 11. In FIG. 11, since the user can hold the lower side of the housing 50 with two fingers (the ring finger and the little finger), the terminal device 7 can be gripped firmly.

In this embodiment, the lower surface of the cover portion 59 is located, in the up-down direction, between the analog sticks 53A and 53B on the one hand and the crisscross button 54A and the four buttons 54E to 54H on the other (below the analog sticks 53A and 53B, and above the crisscross button 54A and the four buttons 54E to 54H). Therefore, when the terminal device 7 is gripped with the ring fingers against the cover portion 59 (FIG. 10), the analog sticks 53A and 53B can be operated easily with the thumbs, and when the terminal device 7 is gripped with the middle fingers against the cover portion 59 (FIG. 11), the crisscross button 54A and the four buttons 54E to 54H can be operated easily with the thumbs. That is, in either of the two cases, the user can perform direction input operations while gripping the terminal device 7 firmly.

As described above, the user can also grip the terminal device 7 vertically. That is, as shown in FIG. 12, the user can grip the terminal device 7 vertically by holding its upper side with the left hand, and as shown in FIG. 13, by holding its lower side with the left hand. Although FIGS. 12 and 13 illustrate cases in which the terminal device 7 is held with the left hand, it may also be held with the right hand. Since the user can thus hold the terminal device 7 with one hand, it is also possible, for example, to make inputs on the touch panel 52 with the other hand while holding the terminal device 7 with one hand.

When gripping the terminal device 7 by the method shown in FIG. 12, the user can grip the terminal device 7 securely by placing fingers other than the thumb (the middle finger, ring finger, and little finger in FIG. 12) against the lower surface of the cover portion 59 (the dashed-dotted line shown in FIG. 12). In particular, in this embodiment, since the cover portion 59 extends to the left and right (up and down in FIG. 12), fingers other than the thumb can be placed against the cover portion 59 no matter where along the upper side of the terminal device 7 the user grips it, so the terminal device 7 can be gripped securely. That is, when the terminal device 7 is used in a vertical grip, the cover portion 59 can be used as a handle. On the other hand, when gripping the terminal device 7 by the method shown in FIG. 13, the user can operate the buttons 54B to 54D with the left hand. Therefore, for example, while making inputs on the touch panel 52 with one hand, the buttons 54B to 54D can be operated with the hand holding the terminal device 7, and more operations can be performed.

Regarding the terminal device 7 in this embodiment, since the projection (cover portion 59) is formed on the rear surface, the screen is slightly inclined when the terminal device 7 is placed down with the screen of the LCD 51 (the surface of the housing 50) facing up. This makes the screen easier to see while the terminal device 7 is placed down, and also makes it easier to perform input operations on the touch panel 52 in that state. In another embodiment, an additional projection having about the same height as the cover portion 59 may be formed on the rear surface of the housing 50. In that case, with the screen of the LCD 51 facing up, the terminal device 7 can be placed so that the screen is horizontal, with each projection contacting the floor surface. The additional projection may also be detachable (or foldable), so that the terminal device can be placed both with the screen slightly inclined and with the screen horizontal. In other words, when the terminal device 7 is placed down and used, the cover portion 59 can be used as a leg.

To each of the buttons 54A to 54L, a function corresponding to a game program is appropriately assigned. For example, the crisscross buttons 54A and the buttons 54E to 54H may be used for direction indication operations, selection operations, or the like, and the respective buttons 54B to 54E may be used for determination operations, cancellation operations, and the like. In addition, the terminal device 7 may have a button for turning on / off the screen display of the LCD 51 and a button for performing connection setting (pairing) with the game device 3.

As shown in FIG. 8(a), the terminal device 7 includes a marker portion 55, consisting of a marker 55A and a marker 55B, on the surface of the housing 50. The marker portion 55 is provided above the LCD 51. Each of the markers 55A and 55B is composed of one or more infrared LEDs, like the markers 6R and 6L of the marker device 6. The infrared LEDs constituting the markers 55A and 55B are disposed inside a window portion that transmits infrared light. The marker portion 55 is used by the game device 3 to calculate the movement of the controller 5 and the like, like the marker device 6 described above. The game device 3 can also control the lighting of each infrared LED included in the marker portion 55.

The terminal device 7 is equipped with the camera 56 which is an imaging means. The camera 56 includes an imaging device (eg, a CCD image sensor, a CMOS image sensor, etc.) having a predetermined resolution, and a lens. As shown in FIG. 8, in the present embodiment, the camera 56 is provided on the surface of the housing 50. Therefore, the camera 56 can image the face of the user holding the terminal device 7, and can image the user when playing the game while watching the LCD 51, for example. In this embodiment, the camera 56 is arrange | positioned between two markers 55A and 55B.

The terminal device 7 also includes a microphone 69 as a voice input means. A microphone hole 50c is formed in the surface of the housing 50, and the microphone 69 is provided inside the housing 50 behind the microphone hole 50c. The microphone 69 detects sounds around the terminal device 7, such as the user's voice.

The terminal device 7 includes a speaker 77 as a sound output means. As shown in FIG. 8(d), speaker holes 57 are formed in the lower portion of the surface of the housing 50, and the output sound of the speaker 77 is output from these speaker holes 57. In this embodiment, the terminal device 7 includes two speakers, and a speaker hole 57 is formed at the position of each of the left speaker and the right speaker. The terminal device 7 also has a knob 64 for adjusting the volume of the speaker 77, as well as an audio output terminal 62 for connecting an audio output unit such as earphones. Here, the audio output terminal 62 and the knob 64 are provided on the upper side surface of the housing 50 in consideration of the fact that an additional device is connected to the lower side surface of the housing, but they may be provided on another side surface.

In addition, the housing 50 is provided with a window 63 for emitting infrared signals from the infrared communication module 82 to the outside of the terminal device 7. Here, the window 63 is provided on the upper side of the housing 50 so that an infrared signal is emitted to the front of the user when both sides of the LCD 51 are gripped. However, in another embodiment, the window 63 may be provided in any position, such as the back surface of the housing 50, for example.

In addition, the terminal device 7 is provided with an expansion connector 58 for connecting another device to the terminal device 7. The expansion connector 58 is a communication terminal for transmitting and receiving data (information) with another device connected to the terminal device 7. In the present embodiment, as shown in FIG. 8D, the expansion connector 58 is provided on the lower side surface of the housing 50. The other additional devices connected to the expansion connector 58 may be any type, for example, an input device such as a controller (gun controller or the like) or a keyboard used for a specific game. If it is not necessary to connect an additional device, the expansion connector 58 may not be provided. The expansion connector 58 may include a terminal for supplying power to the additional device and a terminal for charging.

The terminal device 7 also has a charging terminal 66 for acquiring power from an additional device, separately from the expansion connector 58. When the charging terminal 66 is connected to the stand 210 described later, power is supplied from the stand 210 to the terminal device 7. In this embodiment, the charging terminal 66 is provided on the lower side surface of the housing 50. Therefore, when the terminal device 7 and an additional device (for example, the input device 200 shown in FIG. 15 or the input device 220 shown in FIG. 17) are connected, it is possible not only to transmit and receive information via the expansion connector 58 but also to supply power from one to the other. Thus, by providing the charging terminal 66 around the expansion connector 58 (on its left and right sides), power can be supplied together with the transmission and reception of information when the terminal device 7 and an additional device are connected. The terminal device 7 also has a charging connector, and the housing 50 has a cover portion 61 for protecting the charging connector. The charging connector is connectable to the charger 86 described later, and when the charging connector is connected to the charger, power is supplied from the charger 86 to the terminal device 7. In this embodiment, the charging connector (cover portion 61) is provided on the upper side surface of the housing 50 in consideration of the fact that an additional device is connected to the lower side surface of the housing, but it may be provided on another side surface.

In addition, the terminal device 7 has a battery cover 67 that can be attached to and detached from the housing 50. Inside the battery cover 67, a battery (battery 85 shown in FIG. 14) is disposed. In this embodiment, the battery cover 67 is provided on the back surface side of the housing 50, and is provided below the protrusion part (cover part 59).

In addition, holes 65a and 65b through which a strap cord can be tied are formed in the housing 50 of the terminal device 7. As shown in FIG. 8(d), in this embodiment the holes 65a and 65b are formed in the lower surface of the housing 50, one on the left side and one on the right side. That is, the hole 65a is formed to the left of the center of the lower surface of the housing 50, and the hole 65b to the right of the center. The user may tie a strap to either of the holes 65a and 65b and fasten the strap to the wrist. Then, even if the user drops the terminal device 7 or it slips out of the hand, the terminal device 7 can be prevented from falling or striking another object. Moreover, since a hole is formed on each of the left and right sides in this embodiment, the user can conveniently attach the strap for either hand.

With respect to the terminal device 7 shown in FIGS. 8 to 13, the shapes of the operation buttons and the housing 50, the number of each component, the installation positions, and the like are merely examples; other shapes, numbers, and installation positions may be used.

Next, the internal configuration of the terminal device 7 will be described with reference to FIG. 14. FIG. 14 is a block diagram showing the internal configuration of the terminal device 7. As shown in FIG. 14, the terminal device 7 includes, in addition to the configuration shown in FIG. 8, a touch panel controller 71, a magnetic sensor 72, an acceleration sensor 73, a gyro sensor 74, a user interface controller (UI controller) 75, a codec LSI 76, a speaker 77, a sound IC 78, a microphone 79, a wireless module 80, an antenna 81, an infrared communication module 82, a flash memory 83, a power supply IC 84, and a battery 85. These electronic components are mounted on an electronic circuit board and accommodated in the housing 50.

The UI controller 75 is a circuit for controlling input and output of data to various input / output units. The UI controller 75 includes a touch panel controller 71, an analog stick 53 (analog sticks 53A and 53B), operation buttons 54 (each operation button 54A to 54L), a marker portion 55, It is connected to the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74. The UI controller 75 is also connected to the codec LSI 76 and the expansion connector 58. In addition, the power supply IC 84 is connected to the UI controller 75, and electric power is supplied to each unit through the UI controller 75. The built-in battery 85 is connected to the power supply IC 84, and electric power is supplied. In addition, it is possible to connect a charger 86 or a cable that can acquire power from an external power source to the power supply IC 84 via a charging connector, and the terminal device 7 uses an external power supply using the charger 86 or a cable. Power supply and charging from can be performed. The terminal device 7 can also be charged by mounting the terminal device 7 to a cradle having a charging function (not shown). That is, although not shown, it is possible to connect the cradle (stand 210 shown in FIG. 20), which can acquire power from an external power source, to the power supply IC 84 via the charging terminal 66, and the terminal device 7. ) Can use a cradle to perform power supply and charging from an external power source.

The touch panel controller 71 is connected to the touch panel 52 and controls the touch panel 52. The touch panel controller 71 generates touch position data in a predetermined format based on the signal from the touch panel 52 and outputs it to the UI controller 75. The touch position data indicates the coordinates of the position at which an input was made on the input surface of the touch panel 52. The touch panel controller 71 reads the signal from the touch panel 52 and generates the touch position data once every predetermined period of time. Various control instructions for the touch panel 52 are output from the UI controller 75 to the touch panel controller 71.
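As an illustration only, the sketch below shows one possible "predetermined format" for touch position data: raw panel readings scaled to LCD pixel coordinates. The raw value range, the resolutions, and the field and function names are assumptions, not the actual format used by the touch panel controller 71.

```c
#include <stdint.h>

/* Hypothetical format for touch position data. */
typedef struct { uint16_t x; uint16_t y; uint8_t touched; } TouchPositionData;

TouchPositionData make_touch_data(uint16_t raw_x, uint16_t raw_y, int touched,
                                  uint16_t raw_max,   /* full-scale raw value */
                                  uint16_t lcd_w, uint16_t lcd_h)
{
    TouchPositionData d = { 0, 0, 0 };
    d.touched = touched ? 1 : 0;
    if (raw_max > 0) {
        /* Scale raw panel coordinates into LCD pixel coordinates. */
        d.x = (uint16_t)((uint32_t)raw_x * (lcd_w - 1) / raw_max);
        d.y = (uint16_t)((uint32_t)raw_y * (lcd_h - 1) / raw_max);
    }
    return d;
}
```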

The analog stick 53 outputs, to the UI controller 75, stick data indicating the direction and amount by which the stick portion operated by the user's finger has slid (or tilted). The operation buttons 54 output, to the UI controller 75, operation button data indicating the input state of each of the operation buttons 54A to 54L (whether it is pressed).

The magnetic sensor 72 detects the orientation by sensing the magnitude and direction of the magnetic field. Orientation data indicating the detected orientation is output to the UI controller 75, and control instructions for the magnetic sensor 72 are output from the UI controller 75 to the magnetic sensor 72. Sensors using an MI (magnetic impedance) element, a fluxgate sensor, a Hall element, a GMR (giant magnetoresistance) element, a TMR (tunnel magnetoresistance) element, an AMR (anisotropic magnetoresistance) element, and the like exist as magnetic sensors, but any sensor may be used as the magnetic sensor 72 as long as the orientation can be detected. Strictly speaking, in a place where a magnetic field other than the geomagnetic field is generated, the obtained orientation data does not indicate the true orientation; even in such a case, however, the orientation data changes when the terminal device 7 moves, so a change in the attitude of the terminal device 7 can be calculated.
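For illustration only, the sketch below derives a compass heading from the horizontal components of the measured magnetic field. Which two axes are horizontal depends on how the terminal device is held, so this assumes the device is level; the axis choice and sign convention are assumptions, not the device's actual processing.

```c
#include <math.h>

#define PI_F 3.14159265f

/* Heading in degrees, estimated from two horizontal magnetic components. */
float heading_degrees(float mag_x, float mag_z)
{
    float heading = atan2f(mag_x, mag_z) * (180.0f / PI_F);
    if (heading < 0.0f)
        heading += 360.0f;    /* normalize to the range [0, 360) degrees */
    return heading;
}
```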

The acceleration sensor 73 is installed inside the housing 50, and detects the magnitude of the linear acceleration along the three axes (xyz axis shown in Fig. 8A). Specifically, the acceleration sensor 73 has the x-axis as the long side direction of the housing 50, the y-axis as the direction perpendicular to the surface of the housing 50, and the z-axis as the short side direction of the housing 50. The magnitude of linear acceleration of each axis is detected. Acceleration data indicative of the detected acceleration is output to the UI controller 75. In addition, a control instruction to the acceleration sensor 73 is output from the UI controller 75 to the acceleration sensor 73. In the present embodiment, the acceleration sensor 73 is, for example, a capacitive MEMS type acceleration sensor, but in another embodiment, an acceleration sensor of another method may be used. The acceleration sensor 73 may be an acceleration sensor that detects one or two axis directions.

The gyro sensor 74 is provided inside the housing 50 and detects the angular velocities around the three axes: the x-axis, the y-axis, and the z-axis. Angular velocity data representing the detected angular velocities is output to the UI controller 75, and control instructions for the gyro sensor 74 are output from the UI controller 75 to the gyro sensor 74. The number and combination of gyro sensors used to detect the angular velocities around the three axes may be arbitrary; like the gyro sensor 48, the gyro sensor 74 may be composed of a two-axis gyro sensor and a one-axis gyro sensor. The gyro sensor 74 may also be a gyro sensor that detects the angular velocity around one or two axes.

The UI controller 75 outputs operation data including the touch position data, the stick data, the operation button data, the orientation data, the acceleration data, and the angular velocity data received from the above components to the codec LSI 76. In addition, when another device is connected to the terminal device 7 via the expansion connector 58, the data indicating the operation for the other device may be further included in the operation data.
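For illustration, the sketch below gathers the items listed above into a single record of the kind the UI controller 75 might assemble each cycle. The field names, types, and sizes are assumptions and do not represent the actual data format.

```c
#include <stdint.h>

/* Hypothetical per-cycle record of terminal operation data. */
typedef struct {
    uint16_t touch_x, touch_y;   /* touch position data                  */
    uint8_t  touch_on;
    int8_t   stick[2][2];        /* stick data for the sticks 53A, 53B   */
    uint16_t buttons;            /* operation button data (bit per key)  */
    float    orientation[3];     /* orientation data (magnetic sensor)   */
    float    accel[3];           /* acceleration data (x, y, z axes)     */
    float    gyro[3];            /* angular velocity data (x, y, z axes) */
} TerminalOperationData;
```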

The codec LSI 76 is a circuit that performs compression processing on data transmitted to the game device 3 and decompression processing on data transmitted from the game device 3. The LCD 51, the camera 56, the sound IC 78, the wireless module 80, the flash memory 83, and the infrared communication module 82 are connected to the codec LSI 76. The codec LSI 76 also includes a CPU 87 and an internal memory 88. Although the terminal device 7 is configured not to perform game processing itself, it needs to execute a minimal program for its own management and communication. When the power is turned on, the program stored in the flash memory 83 is read into the internal memory 88 and executed by the CPU 87, whereby the terminal device 7 starts up. A part of the internal memory 88 is used as VRAM for the LCD 51.

The camera 56 captures an image in accordance with an instruction from the game device 3 and outputs the captured image data to the codec LSI 76. Control instructions for the camera 56, such as an image capturing instruction, are output from the codec LSI 76 to the camera 56. The camera 56 can also capture moving images; that is, the camera 56 can capture images repeatedly and repeatedly output the image data to the codec LSI 76.

The sound IC 78 is a circuit that is connected to the speaker 77 and the microphone 79 and controls the input and output of audio data to and from the speaker 77 and the microphone 79. That is, when audio data is received from the codec LSI 76, the sound IC 78 outputs an audio signal obtained by D/A conversion of that data to the speaker 77 and causes the speaker 77 to output sound. The microphone 79 detects sound transmitted to the terminal device 7 (such as the user's voice) and outputs an audio signal representing that sound to the sound IC 78. The sound IC 78 performs A/D conversion on the audio signal from the microphone 79 and outputs audio data in a predetermined format to the codec LSI 76.

The codec LSI 76 transmits the image data from the camera 56, the audio data from the microphone 79, and the operation data from the UI controller 75 (terminal operation data) to the game device 3 via the wireless module 80. In this embodiment, the codec LSI 76 performs compression processing, similar to that of the codec LSI 27, on the image data and the audio data. The terminal operation data and the compressed image data and audio data are output to the wireless module 80 as transmission data. The antenna 81 is connected to the wireless module 80, and the wireless module 80 transmits the transmission data to the game device 3 via the antenna 81. The wireless module 80 has the same function as the terminal communication module 28 of the game device 3; that is, it has a function of connecting to a wireless LAN by, for example, a system conforming to the IEEE 802.11n standard. The transmitted data may or may not be encrypted as necessary.

As described above, the transmission data transmitted from the terminal device 7 to the game device 3 includes the operation data (terminal operation data), the image data, and the audio data. When another device is connected to the terminal device 7 via the expansion connector 58, data received from that device may also be included in the transmission data. The infrared communication module 82 performs infrared communication with other devices in accordance with, for example, the IrDA standard. The codec LSI 76 may include data received by infrared communication in the transmission data and transmit it to the game device 3 as necessary.

In addition, as described above, compressed image data and sound data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI 76 via the antenna 81 and the wireless module 80. The codec LSI 76 decompresses the received image data and sound data. The decompressed image data is output to the LCD 51, and the image is displayed on the LCD 51; that is, the codec LSI 76 (CPU 87) causes the display unit to display the received image data. The decompressed sound data is output to the sound IC 78, and the sound IC 78 outputs sound from the speaker 77.

When control data is included in the data received from the game device 3, the codec LSI 76 and the UI controller 75 give control instructions to the respective parts in accordance with the control data. As described above, the control data represents control instructions for the components included in the terminal device 7 (in this embodiment, the camera 56, the touch panel controller 71, the marker unit 55, the sensors 62 to 64, and the infrared communication module 82). In this embodiment, the control instructions represented by the control data are instructions to make each of the above components operate or to pause (stop) its operation. That is, components that are not used in the game may be paused in order to suppress power consumption, and in that case data from the paused components need not be included in the transmission data transmitted from the terminal device 7 to the game device 3. Since the marker unit 55 is an infrared LED, its control simply amounts to turning the power supply on or off.
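The control data can be pictured as a small per-component instruction table. The following sketch is only an illustration under assumptions (the enum values and field names are invented for explanation, not taken from the patent) of how each component could be told to operate or to pause.

```c
#include <stdint.h>

/* Hypothetical encoding of the control data sent from the game device 3
 * to the terminal device 7 (names and values are illustrative only). */
typedef enum {
    COMPONENT_KEEP  = 0,   /* leave the component in its current state  */
    COMPONENT_RUN   = 1,   /* make the component operate                */
    COMPONENT_PAUSE = 2    /* pause the component to save power         */
} ComponentInstruction;

typedef struct {
    uint8_t camera;        /* camera 56                                  */
    uint8_t touch_panel;   /* touch panel controller 71                  */
    uint8_t marker;        /* marker unit 55 (infrared LED on/off)       */
    uint8_t sensors;       /* sensors 62 to 64                           */
    uint8_t infrared;      /* infrared communication module 82           */
} ControlData;
```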

As described above, the terminal device 7 includes operation means such as the touch panel 52, the analog sticks 53, and the operation buttons 54; however, in other embodiments, other operation means may be provided instead of, or in addition to, these operation means.

The terminal device 7 also includes the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74 as sensors for calculating the movement of the terminal device 7 (including its position and posture, or changes in its position and posture); however, in other embodiments, only one or two of these sensors may be provided. In still other embodiments, other sensors may be provided instead of, or in addition to, these sensors.

Although the terminal device 7 is configured to include the camera 56 and the microphone 79, in other embodiments it need not include the camera 56 and the microphone 79, or may include only one of them.

In addition, the terminal device 7 includes the marker unit 55 as a configuration for calculating the positional relationship between the terminal device 7 and the controller 5 (such as the position and/or posture of the terminal device 7 as seen from the controller 5); however, in other embodiments, the marker unit 55 need not be provided. In other embodiments, the terminal device 7 may include other means as a configuration for calculating the positional relationship. For example, in another embodiment, the controller 5 may be provided with a marker unit and the terminal device 7 may be provided with an imaging element; in that case, the marker device 6 may be configured to include an imaging element instead of infrared LEDs.

(Configuration of Additional Device)

Next, with reference to FIGS. 15 to 20, examples of additional devices which can be attached (connected) to the terminal device 7 will be described. The additional device may have any function; for example, it may be an additional operation device attached to the terminal device 7 in order to perform a predetermined operation, a charger which supplies power to the terminal device 7, or a stand for holding the terminal device 7 in a predetermined posture.

As shown in FIG. 8(d) and FIG. 9, locking holes 59a and 59b, with which hook portions of an additional device can be engaged, are formed in the lower surface of the projecting portion (the cover portion 59). The locking holes 59a and 59b are used when connecting another additional device to the terminal device 7. That is, the additional device has hook portions which can be engaged with the locking holes 59a and 59b, and when the additional device is connected to the terminal device 7, the hook portions are engaged with the locking holes 59a and 59b, whereby the terminal device 7 and the additional device are fixed to each other. Screw holes may further be formed inside the locking holes 59a and 59b so that the additional device can be firmly fixed with screws. Here, the projection provided on the back surface of the terminal device 7 is the cover portion 59, which has an eaves-like shape; that is, the cover portion 59 extends in the left-right direction. As shown in FIG. 9, the locking holes 59a and 59b are provided near the center (with respect to the left-right direction) of the lower surface of the cover portion 59. The number of locking holes provided in the lower surface of the cover portion 59 may be any number; when there is one, it is preferably formed at the center of the cover portion 59, and when there are a plurality, they are preferably arranged in left-right symmetry. In this way, the left-right balance can be maintained evenly and the additional device can be connected stably. Furthermore, when the locking holes are formed near the center, the size of the additional device can be reduced compared with the case where they are formed at the left and right ends. That is, the cover portion 59 can be used as a locking member for the additional device.

In this embodiment, as shown in FIG. 8(d), locking holes 50a and 50b are also formed in the lower surface of the housing 50. Therefore, when an additional device is connected to the terminal device 7, four hook portions are engaged with the four locking holes, respectively, whereby the terminal device 7 and the additional device are fixed to each other. Thereby, the additional device can be firmly connected to the terminal device 7. Screw holes may also be formed inside the locking holes 50a and 50b so that the additional device can be fixed with screws. In other embodiments, the locking holes formed in the housing may be in any arrangement.

FIGS. 15 and 16 are diagrams showing an example in which an additional device is attached to the terminal device 7. FIG. 15 is a view of the terminal device 7 and an input device 200 as seen from the front side of the terminal device 7, and FIG. 16 is a view of the terminal device 7 and the input device 200 as seen from the back side of the terminal device 7. In FIGS. 15 and 16, the input device 200, which is an example of an additional device, is attached to the terminal device 7.

The input device 200 includes a first grip part 200a and a second grip part 200b. Each of the grip parts 200a and 200b has a rod-like (columnar) shape and can be held with one hand. The user can use the input device 200 (and the terminal device 7) while holding only one of the grip parts 200a and 200b, or while holding both. The input device 200 may also be configured to include only one grip part. The input device 200 further includes a support part 205. In this embodiment, the support part 205 supports the back surface of the terminal device 7. Specifically, the support part 205 has four hook portions (convex portions), and the four hook portions can be engaged with the locking holes 50a, 50b, 59a, and 59b, respectively.

As shown in FIG. 15, when the input device 200 is connected to the terminal device 7, the four hook portions are engaged with the four locking holes 50a, 50b, 59a, and 59b, respectively, whereby the terminal device 7 and the additional device are fixed to each other. Thereby, the input device 200 can be firmly fixed to the terminal device 7. In another embodiment, in addition to (or instead of) the engagement of the hook portions and the locking holes, the input device 200 and the terminal device 7 may be fixed with screws, whereby the input device 200 can be fixed to the terminal device 7 more firmly. The position of the screw fixing may be anywhere; for example, the support part 205 of the input device 200, which contacts the back surface of the housing 50, and the cover portion 59 may be fixed to each other with screws.

In this way, in this embodiment, the additional device can be reliably fixed to the terminal device 7 by means of the locking holes 59a and 59b. In addition, since the terminal device 7 has sensors for detecting its movement and tilt (the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74), the terminal device 7 itself can also be used by being moved. For example, when the input device 200 shown in FIGS. 15 and 16 is connected to the terminal device 7, the user can hold the grip part 200a and/or 200b of the input device 200 and operate the input device 200 by moving it like a gun. When it is assumed that the terminal device 7 itself is moved and used, as in this embodiment, securing the additional device reliably by means of the locking holes 59a and 59b is particularly effective.

In this embodiment, the support part 205 detachably supports the terminal device 7 so that the screen of the LCD 51 is in a substantially vertical orientation when the first grip part 200a (or the second grip part 200b) is oriented vertically. Each of the grip parts 200a and 200b is formed so as to be substantially parallel to the display unit (the front surface of the housing 50) of the terminal device 7 connected to the input device 200; in other words, each of the grip parts 200a and 200b is oriented along the up-down direction of the display unit of the terminal device 7 connected to the input device 200. In this way, the input device 200 is connected to the terminal device 7 in a posture in which the display unit of the terminal device 7 faces the user (when the user holds the input device 200). By holding the grip parts 200a and 200b (at least one of them) substantially vertically, the user can direct the screen of the display unit toward himself or herself, and can therefore operate the input device 200 while viewing the screen of the display unit. In this embodiment, the second grip part 200b is oriented in a direction substantially parallel to the first grip part 200a, but in other embodiments it is sufficient that at least one grip part is formed in a direction substantially parallel to the screen of the LCD 51. In that case as well, the user can easily hold the input device 200 (and the terminal device 7), by means of that grip part, with the LCD 51 directed toward himself or herself.

In this embodiment, the support part 205 is provided on the connection member 206 which connects the first grip part 200a and the second grip part 200b. That is, since the support part 205 is provided between the two grip parts 200a and 200b, the terminal device 7 connected to the input device 200 is arranged between the two grip parts 200a and 200b. In this case, the center of gravity of the operating device (operating system) consisting of the terminal device 7 and the input device 200 lies between the two grip parts 200a and 200b, so the user can easily hold the operating device by gripping the two grip parts 200a and 200b. In this embodiment, one grip part (the first grip part 200a) is provided at a position on the front side of the screen of the terminal device 7 attached to the input device 200, and the other grip part (the second grip part 200b) is provided at a position on the rear side of the screen. Therefore, the user can easily hold the operating device by gripping the two grip parts in the same manner as holding a gun, with one hand in front of the screen and the other hand behind the screen. The operating device is therefore particularly suitable for, for example, a shooting game in which the game operation is performed while treating the operating device as a gun.

The input device 200 also includes, as operation units, a first button 201, a second button 202, a third button 203, and a stick 204. Each of the buttons 201 to 203 is a button (key) that can be pressed by the user, and the stick 204 is a device with which a direction can be indicated. The operation units are preferably provided at positions that can be operated by the fingers of the hand gripping a grip part. In this embodiment, the first button 201, the second button 202, and the stick 204 are provided at positions that can be operated by the thumb of the hand gripping the first grip part 200a, and the third button 203 is provided at a position that can be operated by the index finger of the hand gripping the second grip part 200b.

The input device 200 may also include an imaging device (imaging unit). For example, the input device 200 may have the same configuration as the imaging information calculating section 35 included in the controller 5. In that case, the imaging element of the imaging information calculating section may be provided so as to image the area in front of the input device 200 (behind the screen of the terminal device 7). For example, an infrared filter may be arranged at the position of the third button 203 instead of the third button 203, with the imaging element arranged inside it. With this arrangement, when the user uses the input device 200 with its front directed toward the television 2 (the marker device 6), the game device 3 can calculate the direction or position of the input device 200. Therefore, the user can perform an operation of pointing the input device 200 in a desired direction, and can perform intuitive and easy operations using the input device 200. The input device 200 may also be configured to include a camera, such as the camera 56, instead of the imaging information calculating section. In that case, as with the imaging element, the camera may be provided so as to image the area in front of the input device 200. Then, when the user uses the input device 200 with its front directed toward the television 2 (the marker device 6), the camera can capture an image in an imaging direction opposite to that of the camera 56 of the terminal device 7.

The input device 200 also includes a connector (not shown), and the connector is connected to the expansion connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. This makes it possible to exchange data between the input device 200 and the terminal device 7. For example, data indicating an operation on the input device 200 or data indicating an imaging result of the imaging device may be transmitted to the terminal device 7. In that case, the terminal device 7 may wirelessly transmit, to the game device 3, operation data indicating the operation performed on the terminal device 7 together with the data transmitted from the input device. The input device 200 may also include a charging terminal which is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. In this case, when the terminal device 7 is attached to the input device 200, power can be supplied from one device to the other. For example, the input device 200 may be connected to a charger, and the terminal device 7 may then be charged by obtaining power from the charger via the input device 200.

The input device may also be configured as follows, for example. FIG. 17 is a diagram showing another example of the input device, and FIGS. 18 and 19 show a state in which the input device 220 shown in FIG. 17 is attached to the terminal device 7. FIG. 18 is a view of the terminal device 7 and the input device 220 as seen from the back side of the terminal device 7, and FIG. 19 is a view of the terminal device 7 and the input device 220 as seen from the front side of the terminal device 7. The input device 220 shown in FIG. 17 can also be attached to the terminal device 7, for example. The input device 220 will be described below. In FIGS. 17 to 20, components corresponding to those of the input device 200 shown in FIGS. 15 and 16 are denoted by the same reference numerals as in FIGS. 15 and 16, and their detailed description is omitted.

As shown in FIG. 17, the input device 220 includes a first grip part 200a and a second grip part 200b similar to those of the input device 200. Therefore, the user can use the input device 220 (and the terminal device 7) while holding only one of the grip parts 200a and 200b, or while holding both.

The input device 220 also includes a support part 205 similar to that of the input device 200. The support part 205 has four hook portions (only three hook portions 205a to 205c are shown in FIG. 17), like the support part of the input device 200. The upper two hook portions 205a and 205b can be engaged with the locking holes 59a and 59b of the terminal device 7, respectively, and the remaining two hook portions can be engaged with the locking holes 50a and 50b of the terminal device 7, respectively. The hook portion which is not shown is provided at a position symmetrical to the hook portion 205c with respect to the left-right direction (the left-right direction of the terminal device 7 attached to the support part 205).

As shown in FIGS. 18 and 19, when the input device 220 is connected to the terminal device 7, the four hook portions are engaged with the four locking holes 50a, 50b, 59a, and 59b, respectively, whereby the terminal device 7 and the input device 220 are fixed to each other. Thereby, the input device 220 can be firmly fixed to the terminal device 7. In another embodiment, in addition to (or instead of) the engagement of the hook portions and the locking holes, the input device 220 and the terminal device 7 may be fixed with screws, whereby the input device 220 can be fixed to the terminal device 7 more firmly. For example, screw holes may be formed inside the locking holes 50a and 50b, and the two lower hook portions may be fixed to the locking holes 50a and 50b with screws. The position of the screw fixing may be anywhere.

As described above, the input device 220 can also be reliably fixed to the terminal device 7 similarly to the input device 200.

With respect to the input device 220 as well, similarly to the input device 200, the support part 205 detachably supports the terminal device 7 so that the screen of the LCD 51 is in a substantially vertical orientation when the first grip part 200a (or the second grip part 200b) is oriented vertically. Each of the grip parts 200a and 200b is formed so as to be substantially parallel to the display unit (the front surface of the housing 50) of the terminal device 7 connected to the input device 220. Therefore, the user can hold the grip parts 200a and 200b (at least one of them) substantially vertically so that the screen of the display unit faces the user, and can operate the input device 220 while viewing the screen of the display unit. Like the input device 200, the input device 220 supports the terminal device 7 above the grip parts, so that holding the grip parts results in an arrangement in which the screen is easy for the user to view. In other embodiments, it is sufficient that at least one grip part is formed in a direction substantially parallel to the screen of the LCD 51.

The input device 220 differs from the input device 200 in the shape of its connection part. The connection part 209 shown in FIG. 17 is connected to two locations, the upper side and the lower side, of the first grip part 200a, and is connected to the upper side (upper end) of the second grip part 200b. As in the input device 200, the connection part 209 is formed so as to protrude forward of the second grip part 200b. Also in the input device 220, as in the input device 200, the support part 205 is provided on the connection part 209 which connects the first grip part 200a and the second grip part 200b. Therefore, the user can easily hold the operating device by gripping the two grip parts 200a and 200b.

The connection part 209 has a member extending downward from its connection with the support part 205. This member extends in a substantially vertical direction when the screen of the LCD 51 of the terminal device 7 connected to the support part 205 is in a substantially vertical orientation; that is, the member is substantially parallel to the grip parts 200a and 200b. Therefore, even when the user grips this member as a grip part, the user can hold it substantially vertically and perform operations using the input device 220 while viewing the screen of the LCD 51. Further, since the member is arranged below the support part 205, gripping the member also results in an arrangement in which the screen is easy for the user to view.

Similarly to the input device 200, in the input device 220 one grip part (the first grip part 200a) is provided at a position on the front side of the screen of the terminal device 7 mounted on the input device 220, and the other grip part (the second grip part 200b) is provided at a position on the rear side of the screen. Therefore, like the input device 200, the input device 220 can easily be held in a gun-like grip, and is particularly suitable for a shooting game or the like in which the game operation is performed while treating the operating device as a gun.

The input device 220 includes, as operation units, a fourth button 207 in addition to a second button 202 and a stick 204 similar to those of the input device 200. As in the input device 200, the second button 202 and the stick 204 are provided on the upper side of the first grip part 200a. The fourth button 207 is a button (key) that can be pressed by the user and is provided on the upper side of the second grip part 200b; that is, the fourth button 207 is provided at a position that can be operated by the index finger of the hand gripping the second grip part 200b.

The input device 220 includes an imaging element (imaging device). Here, the input device 220 has the same configuration as the imaging information calculating section 35 included in the controller 5. The imaging element of the imaging information calculating section is provided so as to image the area in front of the input device 220 (behind the screen of the terminal device 7). Specifically, a window portion (infrared filter) 208 is provided at the front end of the input device 220 (the front end of the connection portion 206), and the imaging element is provided inside the window portion 208 so as to image the area in front of the window portion 208. With this arrangement, when the user uses the input device 220 with its front directed toward the television 2 (the marker device 6), the game device 3 can calculate the direction or position of the input device 220. Therefore, the user can perform an operation of pointing the input device 220 in a desired direction, and can perform intuitive and easy operations using the input device 220.

The input device 220 may be configured to include a camera, such as the camera 56, instead of the imaging information calculating section. In that case, when the user uses the input device 220 with its front directed toward the television 2 (the marker device 6), the camera can capture an image in an imaging direction opposite to that of the camera 56 of the terminal device 7.

The input device 220 includes a connector (not shown), similarly to the input device 200, and the connector is connected to the expansion connector 58 of the terminal device 7 when the terminal device 7 is mounted on the input device 220. This makes it possible to exchange data between the input device 220 and the terminal device 7. Therefore, data indicating an operation on the input device 220 and data indicating an imaging result of the imaging device may be transmitted to the game device 3 via the terminal device 7. In another embodiment, the input device 220 may be configured to communicate directly with the game device 3; that is, data indicating an operation on the input device 220 may be transmitted directly from the input device 220 to the game device 3 using, for example, Bluetooth (registered trademark) technology, similarly to the wireless communication between the controller 5 and the game device 3. In that case, operation data indicating the operation performed on the terminal device 7 is transmitted from the terminal device 7 to the game device 3. The input device 220 may also include a charging terminal which is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 220.

In another embodiment, an operating device in which the terminal device 7 and the input device 200 (or the input device 220) are integrated may be provided. In this case, a mechanism for detachably connecting the terminal device 7 and the input device 200, such as the locking holes 50a, 50b, 59a, and 59b of the terminal device 7 and the hook portions of the input device 200, is unnecessary.

FIG. 20 is a diagram showing an example in which an additional device is connected to the terminal device 7. In FIG. 20, the terminal device 7 is connected to (placed on) a stand 210, which is an example of an additional device. The stand 210 is a support device on which the terminal device 7 can be placed (supported) at a predetermined angle. The stand 210 includes a support member 211, a charging terminal 212, and guide members 213a and 213b.

In this embodiment, the stand 210 also functions as a charger and includes the charging terminal 212. The charging terminal 212 is a terminal that can be connected to the charging terminal 66 of the terminal device 7. In this embodiment, the charging terminals 66 and 212 are both metal terminals, but one of them may instead be a connector having a shape that can be connected to the other. When the terminal device 7 is connected to the stand 210, the charging terminal 212 of the stand 210 and the charging terminal 66 of the terminal device 7 come into contact with each other, power is supplied from the stand 210 to the terminal device 7, and charging can be performed.

The support member 211 supports the back surface of the terminal device 7 at a predetermined angle. The support member 211 supports a predetermined surface (here, the back surface) of the housing 50 when the terminal (charging terminal 66) of the terminal device 7 and the terminal (charging terminal 212) of the stand 210 are connected. As shown in FIG. 20, the support member 211 has a wall portion 211a and a groove portion 211b. The support member 211 supports the housing 50 by means of the wall portion 211a so that the back surface of the housing 50 rests along a predetermined support surface (here, the surface formed by the wall portion 211a). The groove portion 211b is the part into which a part (the lower portion) of the housing 50 is inserted when the terminal device 7 and the stand 210 are connected; it is therefore formed so as to approximately match the shape of that part of the housing 50, and extends in a direction parallel to the support surface.

The guide members 213a and 213b can be inserted into the locking holes 50a and 50b of the terminal device 7 and serve to determine the position at which the terminal device 7 is connected to the stand 210. The guide members 213a and 213b are provided at positions corresponding to the locking holes 50a and 50b of the terminal device 7; that is, they are provided at positions where they are inserted into the locking holes 50a and 50b when the terminal device 7 and the stand 210 are connected correctly. The terminal device 7 and the stand 210 are connected correctly when the charging terminal 212 of the stand 210 and the charging terminal 66 of the terminal device 7 are connected. The guide members 213a and 213b are provided so that parts of them protrude from the bottom surface of the groove portion 211b; that is, they are provided so that parts of them protrude upward from the surface of the support member 211. When the terminal device 7 is connected to the stand 210, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b, respectively.

In this embodiment, each of the guide members 213a and 213b is a rotatable wheel member (roller), rotatable in a predetermined direction. Here, the predetermined direction is a direction parallel to the support surface (and horizontal); in other words, it is the left-right direction of the terminal device 7 when the terminal device 7 is connected to the stand 210. The guide member may be any rotating member that can rotate at least in the predetermined direction; for example, in another embodiment, the guide member may be a sphere rotatably supported in a spherical recess. Although the number of guide members is two in this embodiment, guide members may be provided in a number corresponding to the number of locking holes formed in the lower surface of the terminal device 7, and the stand 210 may be provided with only one guide member or with three or more.

When the terminal device 7 is connected to the stand 210, the back surface of the terminal device 7 contacts the support member 211, whereby the terminal device 7 is placed on the stand 210 at a predetermined angle. That is, the lower portion of the housing 50 is inserted into the groove portion 211b and the wall portion 211a supports the back surface of the housing 50, so that the terminal device 7 is placed on the stand 210 at a predetermined angle. Therefore, in this embodiment, the position of the terminal device 7 with respect to the direction perpendicular to the predetermined direction is set to the correct position by the support member 211.

When the terminal device 7 is connected to the stand 210 but the terminal device 7 and the stand 210 are not in the correct positional relationship, the position of the terminal device 7 is corrected by the guide members 213a and 213b before the connection is completed. That is, when the locking holes 50a and 50b are displaced from the guide members 213a and 213b with respect to the predetermined direction, the guide members 213a and 213b come into contact with the housing 50 around the locking holes 50a and 50b. As a result, the terminal device 7 slides in the predetermined direction as the guide members 213a and 213b rotate. In this embodiment, since the two guide members 213a and 213b are arranged side by side in the predetermined direction, the lower surface of the terminal device 7 can be brought into contact with only the guide members 213a and 213b, so that the terminal device 7 can be moved more smoothly. Further, if a slope (a recessed slope) is formed around the locking holes 50a and 50b, the terminal device 7 can be moved even more smoothly. As a result of the terminal device 7 sliding in this way, parts of the guide members 213a and 213b are inserted into the locking holes 50a and 50b. Thereby, the charging terminal 212 of the stand 210 and the charging terminal 66 of the terminal device 7 come into contact with each other, and charging is reliably performed.

As described above, the user can easily connect the terminal device 7 to the stand 210 without placing the terminal device 7 at the exact position. According to this embodiment, since the positioning of the terminal device 7 with respect to the stand 210 can be performed by the simple configuration of the locking holes of the terminal device 7 and the guide members of the stand 210, the stand 210 can be made small and simple. In this embodiment, the terminal device 7 is a comparatively large portable device, but even with such a large portable device the stand 210 itself can be made small, as shown in FIG. 20. Moreover, since terminal devices of various shapes and sizes can be connected to the stand 210, a support device with high versatility can be provided.

Furthermore, in this embodiment, the locking holes 50a and 50b are used both as holes with which the hook portions of an additional device are engaged and as holes into which the guide members are inserted. Therefore, the number of holes formed in the housing 50 of the terminal device 7 can be reduced, and the shape of the housing 50 can be kept simple.

In the above embodiment, the holes into which the guide members of the stand 210 are inserted are holes formed in the lower side surface of the housing 50 (the locking holes 50a and 50b), but the holes may be located anywhere. For example, holes may be formed in another side surface of the housing 50, or in the front or back surface of the housing 50. Since the guide portions must be provided at positions corresponding to the positions of the holes, when the holes are formed in the front or back surface of the housing 50, the guide portions of the stand 210 may be provided, for example, at the position of the wall portion 211a. Holes may also be formed in a plurality of surfaces of the housing 50, in which case the terminal device 7 can be placed on the stand 210 in various orientations.

[5. Game processing]

Next, the details of the game processing executed in the game system will be described. First, various data used in the game processing will be described. FIG. 21 is a diagram showing the various data used in the game processing; it shows the main data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game device 3. As shown in FIG. 21, the game program 90, reception data 91, and processing data 106 are stored in the main memory of the game device 3. In addition to the data shown in FIG. 21, the main memory stores data necessary for the game, such as image data of various objects appearing in the game and sound data used in the game.

Part or all of the game program 90 is read from the optical disc 4 at an appropriate time after the power of the game device 3 is turned on, and is stored in the main memory. The game program 90 may instead be obtained from the flash memory 17 or from a device external to the game device 3 (for example, via the Internet) rather than from the optical disc 4. A part of the game program 90 (for example, a program for calculating the attitude of the controller 5 and/or the terminal device 7) may be stored in the game device 3 in advance.

The reception data 91 is the various data received from the controller 5 and the terminal device 7. The reception data 91 includes controller operation data 92, terminal operation data 97, camera image data 104, and microphone sound data 105. When a plurality of controllers 5 are connected, a plurality of sets of controller operation data 92 are stored; when a plurality of terminal devices 7 are connected, a plurality of sets of terminal operation data 97, camera image data 104, and microphone sound data 105 are stored.

The controller operation data 92 is data indicating the operation of the user (player) with respect to the controller 5. The controller operation data 92 is transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 includes first operation button data 93, first acceleration data 94, first angular velocity data 95, and marker coordinate data 96. In addition, a predetermined number of controller operation data may be stored in main memory in order from the latest (last acquired).

The first operation button data 93 is data indicating the input state of each of the operation buttons 32a to 32i provided on the controller 5. Specifically, the first operation button data 93 indicates whether or not each of the operation buttons 32a to 32i is pressed.

The first acceleration data 94 is data indicating the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 represents three-dimensional acceleration whose components are the accelerations in the three axial directions X, Y, and Z shown in FIG. 3; in other embodiments, however, it may represent acceleration in any one or more directions.

The first angular velocity data 95 is data indicating the angular velocity detected by the gyro sensor 48 of the controller 5. Here, the first angular velocity data 95 represents the angular velocities about the three axes X, Y, and Z shown in FIG. 3; in other embodiments, however, it may represent the angular velocity about any one or more axes.

The marker coordinate data 96 is data indicating the coordinates calculated by the image processing circuit 41 of the imaging information calculating section 35, that is, the marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system for indicating a position on a plane corresponding to the captured image, and the marker coordinate data 96 indicates coordinate values in that two-dimensional coordinate system.

The controller operation data 92 only needs to represent the operation of the user operating the controller 5, and may include only some of the above data 93 to 96. When the controller 5 has other input means (for example, a touch panel or an analog stick), the controller operation data 92 may include data indicating operations on those other input means. When the movement of the controller 5 itself is used as a game operation, as in this embodiment, the controller operation data 92 includes data whose values change according to the movement of the controller 5 itself, such as the first acceleration data 94, the first angular velocity data 95, or the marker coordinate data 96.
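Taken together, one sample of the controller operation data 92 can be pictured as a record such as the following sketch. The field names and types are assumptions introduced here for illustration; the patent does not specify a concrete memory layout.

```c
#include <stdint.h>

/* Hypothetical in-memory form of one sample of controller operation data 92
 * (names, types, and units are illustrative assumptions). */
typedef struct {
    uint16_t buttons;        /* first operation button data 93: one bit per
                                operation button 32a to 32i (1 = pressed)   */
    float    accel[3];       /* first acceleration data 94: X, Y, Z         */
    float    angular_vel[3]; /* first angular velocity data 95: about X,Y,Z */
    float    marker[2][2];   /* marker coordinate data 96: up to two marker
                                positions in captured-image coordinates     */
    uint8_t  marker_count;   /* number of valid marker coordinates          */
} ControllerOperationData;
```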

The terminal operation data 97 is data indicating the user's operation on the terminal device 7. The terminal operation data 97 is transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 includes second operation button data 98, stick data 99, touch position data 100, second acceleration data 101, second angular velocity data 102, and azimuth data 103. A predetermined number of sets of terminal operation data may be stored in the main memory in order from the latest (most recently acquired).

The second operation button data 98 is data indicating an input state to each operation button 54A to 54L provided in the terminal device 7. Specifically, the second operation button data 98 indicates whether or not each operation button 54A to 54L is pressed.

The stick data 99 is data indicating the direction and amount by which the stick portion of each analog stick 53 (the analog sticks 53A and 53B) has been slid (or tilted). The direction and amount may be expressed, for example, as two-dimensional coordinates or a two-dimensional vector.

The touch position data 100 is data indicating a position (touch position) at which an input is made on the input surface of the touch panel 52. In this embodiment, the touch position data 100 represents the coordinate value of the two-dimensional coordinate system for indicating the position on the input surface. In addition, when the touch panel 52 is a multi-touch system, the touch position data 100 may show several touch positions.

The second acceleration data 101 is data indicating the acceleration (acceleration vector) detected by the acceleration sensor 73. In this embodiment, the second acceleration data 101 represents three-dimensional acceleration whose components are the accelerations in the three axial directions x, y, and z shown in FIG. 8; in other embodiments, however, it may represent acceleration in any one or more directions.

The second angular velocity data 102 is data indicating the angular velocity detected by the gyro sensor 74. In this embodiment, the second angular velocity data 102 represents the angular velocities about the three axes x, y, and z shown in FIG. 8; in other embodiments, however, it may represent the angular velocity about any one or more axes.

The azimuth data 103 is data indicating the azimuth detected by the magnetic sensor 72. In this embodiment, the azimuth data 103 indicates the direction of a predetermined azimuth (for example, north) with respect to the terminal device 7. In a place where a magnetic field other than the geomagnetic field is present, the azimuth data 103 does not strictly represent an absolute azimuth (such as north); however, since it indicates the direction of the terminal device 7 relative to the magnetic field direction at that place, changes in the attitude of the terminal device 7 can still be calculated even in such a case.

The terminal operation data 97 only needs to represent the operation of the user operating the terminal device 7, and may include only one of the above data 98 to 103. When the terminal device 7 has other input means (for example, a touch pad or imaging means like that of the controller 5), the terminal operation data 97 may include data indicating operations on those other input means. When the movement of the terminal device 7 itself is used as a game operation, as in this embodiment, the terminal operation data 97 includes data whose values change according to the movement of the terminal device 7 itself, such as the second acceleration data 101, the second angular velocity data 102, or the azimuth data 103.
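Analogously, one sample of the terminal operation data 97 can be pictured as a record such as the following sketch. Again, the field names, types, and the assumed upper bound on simultaneous touches are illustrative assumptions, not specifications from the patent.

```c
#include <stdint.h>

#define MAX_TOUCHES 4  /* assumed upper bound for a multi-touch panel */

/* Hypothetical in-memory form of one sample of terminal operation data 97
 * (names, types, and units are illustrative assumptions). */
typedef struct {
    uint16_t buttons;               /* second operation button data 98: one
                                       bit per operation button 54A to 54L  */
    float    stick[2][2];           /* stick data 99: direction and amount
                                       of analog sticks 53A and 53B (x, y)  */
    float    touch[MAX_TOUCHES][2]; /* touch position data 100              */
    uint8_t  touch_count;           /* number of valid touch positions      */
    float    accel[3];              /* second acceleration data 101 (x,y,z) */
    float    angular_vel[3];        /* second angular velocity data 102     */
    float    azimuth[3];            /* azimuth data 103: detected direction */
} TerminalOperationData;
```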

The camera image data 104 is data representing the image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 is image data obtained by the codec LSI 27 decompressing the compressed image data from the terminal device 7, and is stored in the main memory by the input/output processor 11a. A predetermined number of camera images may be stored in the main memory in order from the latest (most recently acquired).

The microphone sound data 105 is data indicating the sound (microphone sound) detected by the microphone 79 of the terminal device 7. The microphone sound data 105 is sound data obtained by the codec LSI 27 decompressing the compressed sound data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a.

The processing data 106 is data used in the game processing (FIG. 22) described later. The processing data 106 includes control data 107, controller attitude data 108, terminal attitude data 109, image recognition data 110, and voice recognition data 111. In addition to the data shown in FIG. 21, the processing data 106 includes various data used in the game processing, such as data representing various variables set for various objects appearing in the game.

The control data 107 is data indicating control instructions for the components included in the terminal device 7. The control data 107 indicates, for example, an instruction for controlling the lighting of the marker unit 55, an instruction for controlling the imaging of the camera 56, and the like. The control data 107 is transmitted to the terminal device 7 at an appropriate timing.

The controller attitude data 108 is data representing the attitude of the controller 5. In this embodiment, the controller attitude data 108 is calculated on the basis of the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92. The method of calculating the controller attitude data 108 is described later in connection with step S23.

The terminal attitude data 109 is data representing the attitude of the terminal device 7. In this embodiment, the terminal attitude data 109 is calculated on the basis of the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103 included in the terminal operation data 97. The method of calculating the terminal attitude data 109 is described later in connection with step S24.

The image recognition data 110 is data indicating the result of a predetermined image recognition process applied to the camera image. The image recognition process may be any process that detects some feature from the camera image and outputs the result; for example, it may be a process of extracting a predetermined object (for example, the user's face or a marker) from the camera image and calculating information about the extracted object.

The voice recognition data 111 is data indicating the result of a predetermined voice recognition process applied to the microphone sound. The voice recognition process may be any process that detects some characteristic from the microphone sound and outputs the result; for example, it may be a process of detecting the user's spoken words, or simply a process of detecting the sound volume.

Next, with reference to FIG. 22, the details of the game processing executed in the game device 3 will be described. FIG. 22 is a main flowchart showing the flow of the game processing executed in the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 executes a startup program stored in a boot ROM (not shown), whereby each unit such as the main memory is initialized. The game program stored in the optical disc 4 is then read into the main memory, and execution of the game program is started by the CPU 10. The game device 3 may be configured to execute the game program stored in the optical disc 4 immediately after power-on, or to first execute a built-in program that displays a predetermined menu screen after power-on and then execute the game program stored in the optical disc 4 when the start of the game is instructed. The flowchart shown in FIG. 22 shows the processing performed after the above processing is completed.

The processing of each step of the flowchart shown in FIG. 22 is merely an example, and the order of the steps may be changed as long as the same result is obtained. The values of the variables and the thresholds used in the determination steps are also merely examples, and other values may be used as needed. Although this embodiment is described on the assumption that the CPU 10 performs the processing of each step of the flowchart, a processor other than the CPU 10 or a dedicated circuit may perform the processing of some of the steps.

First, in step S1, the CPU 10 executes initial processing. The initial processing is, for example, a process of establishing a virtual game space, placing each object appearing in the game space at an initial position, or setting initial values of various variables used in the game processing.

In this embodiment, in the initial processing, the CPU 10 controls the lighting of the marker device 6 and the marker unit 55 on the basis of the type of the game program. Here, the game system 1 has two possible imaging targets for the imaging means (the imaging information calculating section 35) of the controller 5: the marker device 6 and the marker unit 55 of the terminal device 7. Depending on the game content (the type of the game program), either one of the marker device 6 and the marker unit 55 is used, or both are used. The game program 90 includes data indicating whether each of the marker device 6 and the marker unit 55 should be lit. The CPU 10 reads this data and determines whether or not to light them. When the marker device 6 and/or the marker unit 55 are to be lit, the following processing is executed.

That is, when the marker device 6 is to be lit, the CPU 10 transmits to the marker device 6 a control signal for lighting the infrared LEDs included in the marker device 6. The transmission of this control signal may simply be the supply of power. In response, the infrared LEDs of the marker device 6 are lit. On the other hand, when the marker unit 55 is to be lit, the CPU 10 generates control data indicating an instruction to light the marker unit 55 and stores it in the main memory. The generated control data is transmitted to the terminal device 7 in step S10 described later. The control data received by the wireless module 80 of the terminal device 7 is sent to the UI controller 75 via the codec LSI 76, and the UI controller 75 instructs the marker unit 55 to light up. As a result, the infrared LEDs of the marker unit 55 are lit. Although the case where the marker device 6 and the marker unit 55 are lit has been described above, they can be turned off by processing similar to that for lighting them.

After step S1, the processing of step S2 is executed. Thereafter, a processing loop consisting of the series of processes in steps S2 to S11 is repeatedly executed once every predetermined time (one frame time).
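As a rough outline of this processing loop, the following C sketch shows how the steps described in this passage could be sequenced per frame. The function names are placeholders invented for illustration, and steps S5 to S9 (generation and output of game images and sounds) are not detailed in this passage, so they appear only as a comment.

```c
#include <stdio.h>

/* Placeholder stubs for the per-frame steps (names are illustrative only). */
static void acquire_controller_operation_data(void) { /* step S2  */ }
static void acquire_terminal_data(void)             { /* step S3  */ }
static void game_control_process(void)              { /* step S4  */ }
static void send_control_data_to_terminal(void)     { /* step S10 */ }

int main(void)
{
    /* Steps S2 to S11 are repeated once every frame time until the game
     * ends; here the loop simply stops after 60 frames for the example.   */
    for (int frame = 0; frame < 60; frame++) {
        acquire_controller_operation_data();
        acquire_terminal_data();
        game_control_process();
        /* ... generation and output of images and sounds (steps S5-S9) ... */
        send_control_data_to_terminal();
    }
    printf("processing loop finished\n");
    return 0;
}
```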

In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5. Since the controller 5 repeatedly transmits the controller operation data to the game device 3, in the game device 3 the controller communication module 19 sequentially receives the controller operation data, and the received controller operation data is sequentially stored in the main memory by the input/output processor 11a. The transmission/reception interval is preferably shorter than the processing time of the game, and is, for example, 1/200 second. In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory. After step S2, the processing of step S3 is executed.

In step S3, the CPU 10 acquires the various data transmitted from the terminal device 7. Since the terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, the game device 3 sequentially receives these data. In the game device 3, the terminal communication module 28 sequentially receives these data, and the camera image data and the microphone sound data are sequentially decompressed by the codec LSI 27. The input/output processor 11a sequentially stores the terminal operation data, the camera image data, and the microphone sound data in the main memory. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory. After step S3, the processing of step S4 is executed.

In step S4, the CPU 10 executes the game control process. The game control process is a process of advancing the game by, for example, moving objects in the game space in accordance with the user's game operations. In this embodiment, the user can play various games using the controller 5 and/or the terminal device 7. The game control process will be described below with reference to FIG. 23.

FIG. 23 is a flowchart showing the detailed flow of the game control process. The series of processes shown in FIG. 23 are various processes that can be executed when the controller 5 and the terminal device 7 are used as operating devices; however, not all of these processes need to be executed, and depending on the type and content of the game, only some of them may be executed.

In the game control process, first, in step S21, the CPU 10 determines whether or not to change the marker to be used. As described above, in this embodiment, a process of controlling the lighting of the marker device 6 and the marker unit 55 is executed at the start of the game processing (step S1). Depending on the game, the target to be used (lit) may be changed between the marker device 6 and the marker unit 55 during the game. It is also conceivable, depending on the game, to use both the marker device 6 and the marker unit 55; however, if both are lit, there is a risk that one marker is mistakenly detected as the other. Therefore, during the game, it may be preferable to switch the lighting so that only one of them is lit. In consideration of such cases, the process of step S21 determines whether the target to be lit should be changed during the game.

The determination in step S21 can be made, for example, by the following methods. The CPU 10 can make the determination according to whether or not the game situation (the game stage, the operation target, and the like) has changed. This is because, when the game situation changes, it is conceivable that the operation method switches between an operation method in which the controller 5 is pointed at the marker device 6 and an operation method in which the controller 5 is pointed at the marker unit 55. The CPU 10 can also make the determination on the basis of the attitude of the controller 5; that is, the determination can be made according to whether the controller 5 is facing the marker device 6 or facing the marker unit 55. The attitude of the controller 5 can be calculated, for example, on the basis of the detection results of the acceleration sensor 37 and the gyro sensor 48 (see step S23 described later). The CPU 10 may also make the determination on the basis of whether or not a change instruction has been given by the user.
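One conceivable way to realize the attitude-based variant of this determination is sketched below. The vector representation, the "face whichever marker the controller points at more directly" criterion, and all names are assumptions introduced here for illustration; the patent does not specify this particular test.

```c
/* Returns 1 if the marker device 6 (at the television) should be lit,
 * or 0 if the marker unit 55 of the terminal device 7 should be lit.
 * 'forward' is the controller's pointing direction derived from its
 * attitude; 'to_tv' and 'to_terminal' are assumed unit vectors from the
 * player toward the television and toward the terminal device.          */
int choose_marker_by_attitude(const float forward[3],
                              const float to_tv[3],
                              const float to_terminal[3])
{
    float dot_tv = 0.0f, dot_term = 0.0f;
    for (int i = 0; i < 3; i++) {
        dot_tv   += forward[i] * to_tv[i];
        dot_term += forward[i] * to_terminal[i];
    }
    /* Light whichever marker the controller is pointed at more directly. */
    return dot_tv >= dot_term;
}
```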

If the determination result of step S21 is affirmative, the process of step S22 is executed. On the other hand, when the determination result of step S21 is negative, the process of step S22 is skipped and the process of step S23 is performed.

In step S22, the CPU 10 controls the lighting of the marker device 6 and the marker unit 55; that is, the lighting state of the marker device 6 and/or the marker unit 55 is changed. The specific process of lighting or extinguishing the marker device 6 and/or the marker unit 55 can be performed in the same way as in step S1. After step S22, the processing of step S23 is executed.

As described above, according to this embodiment, the light emission (lighting) of the marker device 6 and the marker unit 55 can be controlled according to the type of the game program by the processing of step S1, and can be controlled according to the game situation by the processing of steps S21 and S22.

In step S23, the CPU 10 calculates the attitude of the controller 5. In this embodiment, the attitude of the controller 5 is calculated on the basis of the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. The method of calculating the attitude of the controller 5 is described below.

First, the CPU 10 calculates the attitude of the controller 5 on the basis of the first angular velocity data 95 stored in the main memory. Any method may be used to calculate the attitude of the controller 5 from the angular velocity; here, the attitude is calculated using the previous attitude (the attitude calculated last time) and the current angular velocity (the angular velocity acquired in step S2 of the current processing loop). Specifically, the CPU 10 calculates the attitude by rotating the previous attitude by the current angular velocity for a unit time. The previous attitude is represented by the controller attitude data 108 stored in the main memory, and the current angular velocity is represented by the first angular velocity data 95 stored in the main memory. Therefore, the CPU 10 reads the controller attitude data 108 and the first angular velocity data 95 from the main memory and calculates the attitude of the controller 5. Data indicating the "attitude based on the angular velocity" calculated in this way is stored in the main memory.
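A minimal sketch of this rotation step is shown below. The patent does not specify how the attitude is represented; here it is assumed to be a unit quaternion, and the angular velocity is assumed to be given in radians per second about the controller's own X, Y, and Z axes. All names are illustrative.

```c
#include <math.h>

typedef struct { float w, x, y, z; } Quat;   /* attitude as a unit quaternion */

/* Hamilton product of two quaternions. */
static Quat quat_mul(Quat a, Quat b)
{
    Quat r = {
        a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
        a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
        a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
        a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w
    };
    return r;
}

/* Rotate the previous attitude by the current angular velocity (rad/s about
 * the controller's axes) for one unit time dt, as described for step S23.  */
Quat attitude_from_angular_velocity(Quat prev, const float w[3], float dt)
{
    float mag = sqrtf(w[0]*w[0] + w[1]*w[1] + w[2]*w[2]);
    if (mag < 1e-6f)
        return prev;                       /* effectively no rotation        */
    float half = 0.5f * mag * dt;
    float s    = sinf(half) / mag;
    Quat  dq   = { cosf(half), w[0]*s, w[1]*s, w[2]*s };
    return quat_mul(prev, dq);             /* body-frame angular velocity    */
}
```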

When calculating an attitude from an angular velocity, an initial attitude should be set. That is, when calculating the attitude of the controller 5 from the angular velocity, the CPU 10 first calculates the initial attitude of the controller 5. The initial attitude of the controller 5 may be calculated based on the acceleration data, or the player may be asked to perform a predetermined operation with the controller 5 held in a specific attitude, and that specific attitude at the time the predetermined operation is performed may be used as the initial attitude. The initial attitude should be calculated when the attitude of the controller 5 is calculated as an absolute attitude with respect to a predetermined direction in space; it need not be calculated when the attitude of the controller 5 is calculated as a relative attitude with respect to, for example, the attitude of the controller 5 at the start of the game.

Next, the CPU 10 corrects the attitude of the controller 5 calculated on the basis of the angular velocity using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the attitude of the controller 5 based on the first acceleration data 94. Here, in a state where the controller 5 is almost stationary, the acceleration applied to the controller 5 is the gravitational acceleration. Therefore, in this state, the direction of the gravitational acceleration (the gravity direction) can be calculated using the first acceleration data 94 output by the acceleration sensor 37, so the orientation (attitude) of the controller 5 with respect to the gravity direction can be calculated based on the first acceleration data 94. Data indicating the "attitude based on the acceleration" calculated as described above is stored in the main memory.

After calculating the attitude based on the acceleration, the CPU 10 next corrects the attitude based on the angular velocity using the attitude based on the acceleration. Specifically, the CPU 10 reads the data indicating the attitude based on the angular velocity and the data indicating the attitude based on the acceleration from the main memory, and performs a correction that brings the attitude based on the angular velocity data closer to the attitude based on the acceleration data at a predetermined ratio. The predetermined ratio may be a fixed value set in advance, or may be set in accordance with the acceleration indicated by the first acceleration data 94 or the like. In addition, since the attitude based on the acceleration cannot be calculated for the rotational direction about the gravity direction, the CPU 10 may omit the correction for that rotational direction. In the present embodiment, data indicating the corrected attitude obtained as described above is stored in the main memory.
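
As a rough illustration of the "attitude based on the acceleration" and the predetermined-ratio correction (not the actual firmware), the sketch below represents the attitude by pitch and roll angles and blends the gyro-based values toward the accelerometer-based values; the axis convention and the ratio value are assumptions.

```python
import numpy as np

def _wrap(angle):
    """Wrap an angle difference to (-pi, pi]."""
    return (angle + np.pi) % (2 * np.pi) - np.pi

def attitude_from_acceleration(accel):
    """Pitch and roll (rad) from the accelerometer, valid while the
    controller is nearly stationary so the reading is gravity only."""
    ax, ay, az = accel
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    roll = np.arctan2(ay, az)
    return pitch, roll

def correct_attitude(gyro_pitch, gyro_roll, accel, ratio=0.02):
    """Bring the gyro-based pitch/roll closer to the acceleration-based
    pitch/roll at a predetermined ratio; yaw is left untouched because
    gravity gives no information about rotation around the vertical."""
    acc_pitch, acc_roll = attitude_from_acceleration(accel)
    pitch = gyro_pitch + ratio * _wrap(acc_pitch - gyro_pitch)
    roll = gyro_roll + ratio * _wrap(acc_roll - gyro_roll)
    return pitch, roll
```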

After correcting the attitude based on the angular velocity as described above, the CPU 10 further corrects the corrected attitude using the marker coordinate data 96. First, the CPU 10 calculates the attitude of the controller 5 based on the marker coordinate data 96 (the attitude based on the marker coordinates). Since the marker coordinate data 96 indicates the positions of the markers 6R and 6L in the captured image, the attitude of the controller 5 with respect to the roll direction (the rotational direction around the Z axis) can be calculated from these positions. That is, the attitude of the controller 5 in the roll direction can be calculated from the inclination of the straight line connecting the position of the marker 6R and the position of the marker 6L in the captured image. In addition, when the position of the controller 5 with respect to the marker device 6 can be specified (for example, when the controller 5 can be assumed to be located directly in front of the marker device 6), the attitude of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the position of the marker device 6 in the captured image. For example, when the positions of the markers 6R and 6L move to the left in the captured image, it can be determined that the controller 5 has changed its orientation (attitude) to the right. In this way, the attitude of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the positions of the markers 6R and 6L. As described above, the attitude of the controller 5 can be calculated based on the marker coordinate data 96.

When the attitude based on the marker coordinates has been calculated, the CPU 10 next corrects the corrected attitude (the attitude corrected using the attitude based on the acceleration) with the attitude based on the marker coordinates. That is, the CPU 10 performs a correction that brings the corrected attitude closer to the attitude based on the marker coordinates at a predetermined ratio. This predetermined ratio may be a fixed value set in advance. In addition, the correction by the attitude based on the marker coordinates may be performed for only one or two of the roll direction, the pitch direction, and the yaw direction. For example, when the marker coordinate data 96 is used, the attitude can be calculated with high accuracy with respect to the roll direction, so the CPU 10 may use the attitude based on the marker coordinate data 96 for correction of the roll direction only. In addition, when the marker device 6 or the marker portion 55 is not imaged by the imaging element 40 of the controller 5, the attitude based on the marker coordinate data 96 cannot be calculated, and in that case the correction process using the marker coordinate data 96 need not be executed.
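
A minimal sketch of the roll-direction use of the marker coordinates described above, assuming the marker coordinate data 96 supplies the two marker positions as (x, y) pixel coordinates in the captured image; the blending ratio is an illustrative value, not one from the patent.

```python
import numpy as np

def roll_from_marker_coordinates(left_marker, right_marker):
    """Roll angle (rad) of the controller from the inclination of the line
    connecting the two marker positions in the captured image. Image
    coordinates are assumed to have x to the right and y downward."""
    dx = right_marker[0] - left_marker[0]
    dy = right_marker[1] - left_marker[1]
    return np.arctan2(dy, dx)

def correct_roll(current_roll, marker_roll, ratio=0.5):
    """Bring the current roll estimate closer to the marker-based roll at a
    predetermined ratio (correction applied to the roll direction only)."""
    diff = (marker_roll - current_roll + np.pi) % (2 * np.pi) - np.pi
    return current_roll + ratio * diff
```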

As described above, the CPU 10 corrects the attitude of the controller 5 calculated on the basis of the first angular velocity data 95 using the first acceleration data 94 and the marker coordinate data 96. Of the methods for calculating the attitude of the controller 5, the method using the angular velocity can calculate the attitude even while the controller 5 is moving. On the other hand, in the method using the angular velocity, the attitude is calculated by cumulatively adding and subtracting angular velocities detected in succession, so there is a concern that accuracy may deteriorate because of accumulated errors. The method using the acceleration does not accumulate errors, but cannot calculate the attitude with high accuracy while the controller 5 is being moved violently (because the gravity direction cannot be detected accurately). The method using the marker coordinates can calculate the attitude with high accuracy (particularly with respect to the roll direction), but cannot calculate the attitude when the marker portion 55 cannot be imaged. In contrast, according to the present embodiment, since three methods with different characteristics are used as described above, the attitude of the controller 5 can be calculated more accurately. In another embodiment, the attitude may be calculated using one or two of the three methods. When the lighting control of the markers is performed in the processing of step S1 or S22, it is preferable that the CPU 10 calculate the attitude of the controller 5 using at least the marker coordinates.

After step S23, the process of step S24 is executed. In step S24, the CPU 10 calculates the attitude of the terminal device 7. Since the terminal operation data 97 obtained from the terminal device 7 includes the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103, the CPU 10 calculates the attitude of the terminal device 7 based on these data. Here, the CPU 10 can know the amount of rotation (amount of change in attitude) of the terminal device 7 per unit time from the second angular velocity data 102. In addition, since the acceleration applied to the terminal device 7 is the gravitational acceleration while the terminal device 7 is almost stationary, the direction of gravity applied to the terminal device 7 (that is, the attitude of the terminal device 7 with respect to the gravity direction) can be known from the second acceleration data 101. Furthermore, the predetermined azimuth with respect to the terminal device 7 (that is, the attitude of the terminal device 7 with respect to the predetermined azimuth) can be known from the azimuth data 103. Even in a place where a magnetic field other than geomagnetism is generated, the amount of rotation of the terminal device 7 can be known. Therefore, the CPU 10 can calculate the attitude of the terminal device 7 based on the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103. In the present embodiment, the attitude of the terminal device 7 is calculated based on these three data; in other embodiments, the attitude may be calculated based on one or two of the three data.

Although any specific method may be used to calculate the attitude of the terminal device 7, one conceivable method is to correct the attitude calculated based on the angular velocity indicated by the second angular velocity data 102 using the second acceleration data 101 and the azimuth data 103. Specifically, the CPU 10 first calculates the attitude of the terminal device 7 based on the second angular velocity data 102. The method of calculating the attitude based on the angular velocity may be the same as the method in step S23. Next, at an appropriate timing (for example, when the terminal device 7 is close to a stationary state), the CPU 10 corrects the attitude calculated based on the angular velocity with the attitude calculated based on the second acceleration data 101 and/or the attitude calculated based on the azimuth data 103. The method of correcting the attitude based on the angular velocity with the attitude based on the acceleration may be the same as that used when calculating the attitude of the controller 5 described above. When correcting the attitude based on the angular velocity with the attitude based on the azimuth data, the CPU 10 may bring the attitude based on the angular velocity closer to the attitude based on the azimuth data at a predetermined ratio. As described above, the CPU 10 can accurately calculate the attitude of the terminal device 7.
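
A hedged sketch of the azimuth-based correction, assuming the azimuth data 103 is a raw magnetometer vector and that the terminal device 7 is held roughly level; a real implementation would first tilt-compensate the reading using the pitch and roll already estimated from the other sensors. The ratio and axis conventions are assumptions.

```python
import numpy as np

def yaw_from_azimuth(mag):
    """Heading (rad) from a magnetometer reading, assuming the terminal is
    held roughly level; a full implementation would tilt-compensate the
    reading with the estimated pitch and roll before taking the angle."""
    mx, my, _ = mag
    return np.arctan2(-my, mx)

def correct_yaw(gyro_yaw, mag, ratio=0.02):
    """Bring the gyro-based yaw closer to the azimuth-based yaw at a
    predetermined ratio, analogous to the gravity correction for pitch/roll."""
    diff = (yaw_from_azimuth(mag) - gyro_yaw + np.pi) % (2 * np.pi) - np.pi
    return gyro_yaw + ratio * diff
```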

Moreover, since the controller 5 includes the imaging information calculation section 35, which is an infrared detection means, the game device 3 can acquire the marker coordinate data 96. Therefore, for the controller 5, the game device 3 can know from the marker coordinate data 96 the absolute attitude in real space (that is, what attitude the controller 5 has in the coordinate system set in real space). On the other hand, the terminal device 7 is not equipped with an infrared detection means such as the imaging information calculation section 35. Therefore, the game device 3 cannot know, from the second acceleration data 101 and the second angular velocity data 102 alone, the absolute attitude in real space with respect to the rotational direction about the gravity direction. For this reason, in the present embodiment the terminal device 7 is equipped with the magnetic sensor 72, and the game device 3 acquires the azimuth data 103. The game device 3 can thereby calculate from the azimuth data 103 the absolute attitude in real space with respect to the rotational direction about the gravity direction, so the attitude of the terminal device 7 can be calculated more accurately.

As the specific processing of step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103 from the main memory and calculates the attitude of the terminal device 7 based on these data. The calculated data indicating the attitude of the terminal device 7 is stored in the main memory as the terminal attitude data 109. After step S24, the process of step S25 is executed.

In step S25, the CPU 10 executes a recognition process on the camera image. That is, the CPU 10 performs a predetermined recognition process on the camera image data 104. This recognition process may be of any type as long as it detects some feature from the camera image and outputs the result. For example, when the face of the player is included in the camera image, it may be a face recognition process. Specifically, it may be a process of detecting parts of the face (eyes, nose, mouth, etc.) or a process of detecting a facial expression. The data indicating the result of the recognition process is stored in the main memory as the image recognition data 110. After step S25, the process of step S26 is executed.
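
The patent leaves the recognition process open; purely as one example of the face-detection variant mentioned above, the following sketch uses OpenCV's stock Haar cascade (an external library not referenced in the original disclosure) to return face rectangles that could be stored as the image recognition data 110.

```python
import cv2

def recognize_camera_image(camera_image_bgr):
    """Minimal face-detection pass over one camera frame; returns a list of
    (x, y, w, h) rectangles that could serve as the image recognition data."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(camera_image_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [tuple(f) for f in faces]
```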

In step S26, the CPU 10 executes a recognition process on the microphone sound. That is, the CPU 10 performs a predetermined recognition process on the microphone sound data 105. This recognition process may be of any type as long as it detects some characteristic from the microphone sound and outputs the result. For example, it may be a process of detecting an instruction from the player from the microphone sound, or it may simply be a process of detecting the volume of the microphone sound. The data indicating the result of the recognition process is stored in the main memory as the voice recognition data 111. After step S26, the process of step S27 is executed.
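
The simplest variant mentioned above, detecting the volume of the microphone sound, could look like the following sketch; the PCM sample format is an assumption.

```python
import numpy as np

def microphone_volume(samples):
    """Volume of one frame of microphone sound as the RMS of its PCM
    samples (assumed to be floats in [-1, 1])."""
    samples = np.asarray(samples, dtype=np.float64)
    return float(np.sqrt(np.mean(samples ** 2))) if samples.size else 0.0
```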

In step S27, the CPU 10 executes the game processing in accordance with the game input. The game input may be any data transmitted from the controller 5 or the terminal device 7, or any data obtained from such data. Specifically, the game input includes, in addition to the data included in the controller operation data 92 and the terminal operation data 97, the data obtained from those data (the controller attitude data 108, the terminal attitude data 109, the image recognition data 110, and the voice recognition data 111). The content of the game processing in step S27 may be anything, for example a process of moving an object (character) appearing in the game, a process of controlling a virtual camera, or a process of moving a cursor displayed on the screen. It may also be a process of using the camera image (or a part of it) as a game image, a process of using the microphone sound as a game sound, or the like. Examples of this game processing are described later. In step S27, data indicating the result of the game control processing, such as the data of various variables set for the character (object) appearing in the game, data of variables relating to the virtual camera arranged in the game space, and score data, is stored in the main memory. After step S27, the CPU 10 ends the game control processing of step S4.

Returning to the description of FIG. 22, in step S5, the television game image to be displayed on the television 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read the data indicating the result of the game control processing of step S4 from the main memory, read the data necessary for generating the game image from the VRAM 11d, and generate the game image. The game image only has to represent the result of the game control processing of step S4 and may be generated by any method. For example, the method of generating the game image may be a method of generating a three-dimensional CG image by arranging a virtual camera in the virtual game space and calculating the game space as viewed from the virtual camera, or a method of generating a two-dimensional image (without using a virtual camera). The generated television game image is stored in the VRAM 11d. After step S5, the process of step S6 is executed.

In step S6, the terminal game image to be displayed on the terminal device 7 is generated by the CPU 10 and the GPU 11b. Like the television game image, the terminal game image only has to represent the result of the game control processing of step S4 and may be generated by any method. The terminal game image may be generated by the same method as the television game image or by a different method. The generated terminal game image is stored in the VRAM 11d. Depending on the content of the game, the television game image and the terminal game image may be the same, and in that case the process of generating a game image need not be executed in step S6. After step S6, the process of step S7 is executed.

In step S7, the television game sound to be output to the speaker 2a of the television 2 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound in accordance with the result of the game control processing of step S4. The generated game sound may be, for example, a sound effect of the game, the voice of a character appearing in the game, BGM, or the like. After step S7, the process of step S8 is executed.

In step S8, the terminal game sound to be output to the speaker 77 of the terminal device 7 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound in accordance with the result of the game control processing of step S4. The terminal game sound may be the same as or different from the television game sound. The two sounds may also differ only in part, for example with the same BGM but different sound effects. When the television game sound and the terminal game sound are the same, the process of generating a game sound need not be executed in step S8. After step S8, the process of step S9 is executed.

In step S9, the CPU 10 outputs the game image and the game sound to the television 2. Specifically, the CPU 10 sends the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. The AV-IC 15 then outputs the image and audio data to the television 2 via the AV connector 16. The television game image is thereby displayed on the television 2, and the television game sound is output from the speaker 2a. After step S9, the process of step S10 is executed.

In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the image data of the terminal game image stored in the VRAM 11d and the audio data generated by the DSP 11c in step S8 are sent by the CPU 10 to the codec LSI 27, which performs a predetermined compression process on them. The compressed image and audio data are transmitted to the terminal device 7 by the terminal communication module 28 via the antenna 29. The terminal device 7 receives the image and audio data transmitted from the game device 3 with the radio module 80, and a predetermined decompression process is performed by the codec LSI 76. The decompressed image data is output to the LCD 51, and the decompressed audio data is output to the sound IC 78. The terminal game image is thereby displayed on the LCD 51, and the terminal game sound is output from the speaker 77. After step S10, the process of step S11 is executed.

In step S11, the CPU 10 determines whether or not to end the game. The determination of step S11 is made, for example, based on whether or not the game is over, or whether or not the user has given an instruction to stop the game. If the determination result of step S11 is negative, the process of step S2 is executed again; the series of processes of steps S2 to S11 is thus repeatedly executed until it is determined in step S11 that the game is to be ended. On the other hand, if the determination result of step S11 is affirmative, the CPU 10 ends the game processing.

As described above, in the present embodiment, the terminal device 7 includes the touch panel 52 and inertial sensors such as the acceleration sensor 73 and the gyro sensor 74, and the outputs of the touch panel 52 and the inertial sensors are transmitted to the game device 3 as operation data and used as game input (steps S3 and S4). In addition, the terminal device 7 includes a display device (the LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6 and S10). Therefore, the user can perform an operation of directly touching the game image using the touch panel 52, and can also perform an operation of moving the LCD 51 itself, on which the game image is displayed (since the movement of the terminal device 7 is detected by the inertial sensors). Since these operations give the user the feeling of directly operating the game image, a game with a new feeling of operation can be provided, as in the first and second game examples described later.

In addition, in the present embodiment, the terminal device 7 includes the analog stick 53 and the operation buttons 54 that can be operated while the terminal device 7 is held, and the game device 3 can use operations on the analog stick 53 and the operation buttons 54 as game input (steps S3 and S4). Therefore, even when the operation is performed directly on the game image as described above, the user can perform more detailed game operations by button operation or stick operation.

In addition, in the present embodiment, the terminal device 7 includes the camera 56 and the microphone 79, and the data of the camera image captured by the camera 56 and the data of the microphone sound detected by the microphone 79 are transmitted to the game device 3 (step S3). Therefore, since the game device 3 can use the camera image and/or the microphone sound as game input, the user can also perform game operations by capturing an image with the camera 56 or inputting sound into the microphone 79. Since these operations can be performed while the terminal device 7 is held, the user can perform a greater variety of game operations by performing them while directly operating on the game image as described above.

In addition, in the present embodiment, since the game image is displayed on the LCD 51 of the portable terminal device 7 (steps S6 and S10), the user can place the terminal device 7 freely. Therefore, when operating with the controller 5 pointed toward a marker, the user can play the game with the controller 5 pointed in a free direction by placing the terminal device 7 at a free position, which improves the degree of freedom of operation of the controller 5. In addition, since the terminal device 7 can be placed at any position, a more realistic game can be provided by placing the terminal device 7 at a position suited to the game content, as in the fifth game example described later.

In addition, according to the present embodiment, the game device 3 acquires operation data and the like from the controller 5 and the terminal device 7 (steps S2 and S3), so the user can use both the controller 5 and the terminal device 7 as operation means. Therefore, in the game system 1, a plurality of users may play a game with each user using one of the devices, or one user may play a game using both devices.

In addition, according to the present embodiment, the game device 3 can generate two kinds of game images (steps S5 and S6) and display them on the television 2 and the terminal device 7 (steps S9 and S10). By displaying the two kinds of game images on different devices, a game image that is easier for the user to see can be provided, and the operability of the game can be improved. For example, when two people play a game, a game image from a viewpoint that is easy for one user to see can be displayed on the television 2 and a game image from a viewpoint that is easy for the other user to see can be displayed on the terminal device 7, as in the third or fourth game example described later, so that each player can play the game from his or her own viewpoint. Even when one person plays the game, displaying two kinds of game images from two different viewpoints, as in the first, second, and fifth game examples described later, makes it easier for the player to grasp the state of the game space and improves the operability of the game.

[6. Game example]

Next, specific examples of games played in the game system 1 will be described. In the game examples described below, some of the components of each device in the game system 1 may not be used, and some of the series of processes shown in FIGS. 22 and 23 may not be executed. That is, the game system 1 need not include all of the components described above, and the game device 3 need not execute all of the series of processes shown in FIGS. 22 and 23.

(First game example)

The first game example is a game in which an object (a shuriken) is thrown in the game space by operating the terminal device 7. The player can instruct the direction in which the shuriken is thrown by an operation of changing the attitude of the terminal device 7 and an operation of drawing a line on the touch panel 52.

FIG. 24 is a diagram showing the screens of the television 2 and the terminal device 7 in the first game example. In FIG. 24, a game image representing the game space is displayed on the television 2 and on the LCD 51 of the terminal device 7. On the television 2, a shuriken 121, a control surface 122, and a target 123 are displayed. On the LCD 51, the control surface 122 (and the shuriken 121) is displayed. In the first game example, the player plays by throwing the shuriken 121 at the target 123 through operations using the terminal device 7.

When throwing the shuriken 121, the player first changes the attitude of the control surface 122 arranged in the virtual game space to a desired attitude by operating the attitude of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and changes the attitude of the control surface 122 based on the calculated attitude (step S27). In the first game example, the attitude of the control surface 122 is controlled to be an attitude corresponding to the attitude of the terminal device 7 in real space. That is, the player can change the attitude of the control surface 122 in the game space (the control surface 122 displayed on the terminal device 7) by changing the attitude of the terminal device 7. In the first game example, the position of the control surface 122 is fixed at a predetermined position in the game space.

Next, the player performs an operation of drawing a line on the touch panel 52 using the touch pen 124 or the like (see the arrow shown in FIG. 24). Here, in the first game example, the control surface 122 is displayed on the LCD 51 of the terminal device 7 so that the input surface of the touch panel 52 and the control surface 122 correspond to each other. Therefore, from the line drawn on the touch panel 52, the direction on the control surface 122 (the direction indicated by the line) can be calculated. The shuriken 121 is fired in the direction determined in this way. In other words, the CPU 10 performs a process of calculating the direction on the control surface 122 from the touch position data 100 of the touch panel 52 and moving the shuriken 121 in the calculated direction (step S27). The CPU 10 may also control the speed of the shuriken 121 in accordance with, for example, the length of the line or the speed at which the line is drawn.
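
The patent does not fix an implementation of this mapping; the following sketch assumes the pose of the control surface 122 is available as a 3x3 basis matrix whose first two columns span the surface in world coordinates, and that the touch panel resolution matches the LCD. The function name, screen size, and coordinate conventions are illustrative.

```python
import numpy as np

def touch_line_to_world_direction(touch_start, touch_end, control_surface_basis,
                                  screen_size=(854, 480)):
    """Map a line drawn on the touch panel to a direction on the control
    surface in the game space. control_surface_basis[:, 0] and [:, 1] are
    the surface's right and up directions in world coordinates."""
    (sx, sy), (ex, ey) = touch_start, touch_end
    w, h = screen_size
    # Touch-panel delta in normalized surface coordinates (screen y grows
    # downward, surface y grows upward, hence the sign flip)
    du = (ex - sx) / w
    dv = -(ey - sy) / h
    direction = control_surface_basis[:, 0] * du + control_surface_basis[:, 1] * dv
    n = np.linalg.norm(direction)
    return direction / n if n > 0 else direction
```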

As described above, according to the first game example, the game device 3 can move the control surface 122 in accordance with the movement (attitude) of the terminal device 7 by using the output of the inertial sensors as game input, and can specify a direction on the control surface 122 by using the output of the touch panel 52 as game input. According to this, since the player can move the game image displayed on the terminal device 7 (the image of the control surface 122) and perform touch operations on that game image, the player can play the game with the new feeling of operating directly on the game image.

In the first game example, a direction in the three-dimensional space can be easily indicated by using the sensor outputs of the inertial sensors and the touch panel 52 as game input. That is, since the player actually adjusts the attitude of the terminal device 7 with one hand and inputs a direction with a line on the touch panel 52 with the other hand, the player can easily indicate the direction by an intuitive operation, as if actually inputting the direction in space. Furthermore, since the player can simultaneously perform the attitude operation of the terminal device 7 and the input operation on the touch panel 52, the operation of indicating a direction in the three-dimensional space can be performed quickly.

In addition, according to the first game example, the control surface 122 is displayed on the entire screen of the terminal device 7 in order to make the touch-input operation on the control surface 122 easier. On the other hand, the television 2 displays an image of the game space that includes the entire control surface 122 and the target 123, so that the attitude of the control surface 122 is easy to grasp and the target 123 is easy to aim at (see FIG. 24). That is, in step S27, the first virtual camera for generating the television game image is set so that the entire control surface 122 and the target 123 are included in the viewing range, and the second virtual camera for generating the terminal game image is set so that the screen of the LCD 51 (the input surface of the touch panel 52) and the control surface 122 coincide on the screen. Therefore, in the first game example, the game operation is made easier by displaying images of the game space viewed from different viewpoints on the television 2 and the terminal device 7.

(Second game example)

A game that uses the sensor outputs of the inertial sensors and the touch panel 52 as game input is not limited to the first game example, and various other game examples are conceivable. Like the first game example, the second game example is a game in which an object (a bullet fired from a cannon) is launched in the game space by operating the terminal device 7. The player can instruct the direction in which the bullet is fired by an operation of changing the attitude of the terminal device 7 and an operation of designating a position on the touch panel 52.

FIG. 25 is a diagram showing the screens of the television 2 and the terminal device 7 in the second game example. In FIG. 25, a cannon 131, a bullet 132, and a target 133 are displayed on the television 2. The bullet 132 and the target 133 are displayed on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image of the game space viewed from the position of the cannon 131.

In the second game example, the player can change the display range displayed on the terminal device 7 as the terminal game image by operating the attitude of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and controls the position and attitude of the second virtual camera for generating the terminal game image based on the calculated attitude (step S27). Specifically, the second virtual camera is placed at the position of the cannon 131, and its orientation (attitude) is controlled in accordance with the attitude of the terminal device 7. In this way, the player can change the range of the game space displayed on the terminal device 7 by changing the attitude of the terminal device 7.

In the second game example, the player designates the firing direction of the bullet 132 by an operation of inputting a point on the touch panel 52 (a touch operation). Specifically, as the processing of step S27, the CPU 10 calculates the position in the game space (control position) corresponding to the touch position, and calculates, as the firing direction, the direction from a predetermined position in the game space (for example, the position of the cannon 131) to the control position. Then, a process of moving the bullet 132 in the firing direction is performed. Thus, whereas the player performs an operation of drawing a line on the touch panel 52 in the first game example, the player performs an operation of designating a point on the touch panel 52 in the second game example. The control position can be calculated by setting a control surface similar to that of the first game example (in the second game example, however, the control surface is not displayed). That is, the control surface is arranged in accordance with the attitude of the second virtual camera so as to correspond to the display range of the terminal device 7 (specifically, the control surface rotates around the position of the cannon 131 in accordance with the change in attitude of the terminal device 7), and the position on the control surface corresponding to the touch position can be calculated as the control position.
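
As an illustrative sketch only (the patent fixes no projection model), the touch position can be turned into a control position by casting a ray from the second virtual camera, placed at the cannon 131, through the touched pixel onto the invisible control surface; the field of view, screen size, and surface distance below are assumptions.

```python
import numpy as np

def firing_direction(touch_pos, camera_rotation, cannon_pos,
                     surface_distance=10.0, screen_size=(854, 480), fov_y_deg=60.0):
    """Firing direction from one touch: project the touch through the second
    virtual camera (located at the cannon) onto a control surface placed
    surface_distance in front of it, then aim from the cannon at that point."""
    w, h = screen_size
    tx, ty = touch_pos
    # Touch position in normalized device coordinates (-1..1, y up)
    ndx = 2.0 * tx / w - 1.0
    ndy = 1.0 - 2.0 * ty / h
    half_h = np.tan(np.radians(fov_y_deg) / 2.0)
    half_w = half_h * w / h
    # Ray through the touched pixel in camera coordinates, then in world space
    ray_cam = np.array([ndx * half_w, ndy * half_h, -1.0])
    ray_world = camera_rotation @ ray_cam
    control_pos = cannon_pos + surface_distance * ray_world / np.linalg.norm(ray_world)
    direction = control_pos - cannon_pos
    return direction / np.linalg.norm(direction)
```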

According to the second game example, the game device 3 changes the display range of the terminal game image in accordance with the movement (attitude) of the terminal device 7 by using the output of the inertial sensors as game input, and can specify a direction in the game space (the firing direction of the bullet 132) by using, as game input, a touch input that designates a position within that display range. Therefore, in the second game example, as in the first game example, the player can move the game image displayed on the terminal device 7 and perform touch operations on that game image, so the player can play the game with the new feeling of operating directly on the game image.

Also in the second game example, as in the first game example, the player actually adjusts the attitude of the terminal device 7 with one hand while performing a touch input on the touch panel 52 with the other hand, so the direction can be easily indicated by an intuitive operation, as if actually inputting the direction in space. Furthermore, since the player can simultaneously perform the attitude operation of the terminal device 7 and the input operation on the touch panel 52, the operation of indicating a direction in the three-dimensional space can be performed quickly.

In the second game example, the image displayed on the television 2 may be an image from the same viewpoint as that of the terminal device 7, but in FIG. 25 the game device 3 displays an image from a different viewpoint. That is, whereas the second virtual camera for generating the terminal game image is set at the position of the cannon 131, the first virtual camera for generating the television game image is set at a position behind the cannon 131. Here, for example, by displaying on the television 2 a range that is not visible on the screen of the terminal device 7, it is possible to realize a way of playing in which the player aims at a target 133 that cannot be seen on the screen of the terminal device 7 while watching the television 2. In this way, by making the display ranges of the television 2 and the terminal device 7 different, it is possible not only to make the state of the game space easier to grasp but also to further enhance the interest of the game.

As described above, according to the present embodiment, since the terminal device 7, which includes the touch panel 52 and inertial sensors, can be used as an operation device, it is possible to realize a game with the feeling of operating directly on the game image, as in the first and second game examples.

(Third game example)

The third game example will be described below with reference to FIGS. 26 and 27. The third game example is a baseball game in which two players compete. That is, the first player operates a batter using the controller 5, and the second player operates a pitcher using the terminal device 7. The television 2 and the terminal device 7 display game images that make the game operation easy for each player.

FIG. 26 is a diagram showing an example of the television game image displayed on the television 2 in the third game example. The television game image shown in FIG. 26 is mainly an image for the first player. That is, the television game image represents the game space with the pitcher (pitcher object) 142, which is the operation target of the second player, viewed from the side of the batter (batter object) 141, which is the operation target of the first player. The first virtual camera for generating the television game image is arranged at a position behind the batter 141 so as to face the pitcher 142 from the batter 141.

FIG. 27 is a diagram showing an example of the terminal game image displayed on the terminal device 7 in the third game example. The terminal game image shown in FIG. 27 is mainly an image for the second player. That is, the terminal game image represents the game space with the batter 141, which is the operation target of the first player, viewed from the side of the pitcher 142, which is the operation target of the second player. Specifically, in step S27, the CPU 10 controls the second virtual camera used to generate the terminal game image based on the attitude of the terminal device 7. The attitude of the second virtual camera is calculated so as to correspond to the attitude of the terminal device 7, as in the second game example described above. The position of the second virtual camera is fixed at a predetermined position. The terminal game image also includes a cursor 143 for indicating the direction in which the pitcher 142 throws the ball.

Any method may be used for the first player to operate the batter 141 and for the second player to operate the pitcher 142. For example, the CPU 10 may detect a swing operation of the controller 5 based on the output data of the inertial sensor of the controller 5 and cause the batter 141 to swing the bat in accordance with the swing operation. Further, for example, the CPU 10 may move the cursor 143 in accordance with an operation on the analog stick 53, and cause the pitcher 142 to perform an action of throwing the ball toward the position indicated by the cursor 143 when a predetermined button among the operation buttons 54 is pressed. The cursor 143 may also be moved in accordance with the attitude of the terminal device 7 instead of the operation on the analog stick 53.
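
A simple form of the swing detection mentioned above could threshold the magnitude of the controller's acceleration, as in the following sketch; the threshold value and the idea of a short sample window are illustrative assumptions, not values from the patent.

```python
import numpy as np

def detect_swing(accel_history, threshold_g=3.0):
    """Report a swing when the magnitude of the controller's acceleration
    (in units of g) exceeds a threshold within the recent sample window."""
    magnitudes = [np.linalg.norm(a) for a in accel_history]
    return max(magnitudes, default=0.0) >= threshold_g
```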

As described above, in the third game example, by generating game images from different viewpoints for the television 2 and the terminal device 7, a game image that is easy to see and easy to operate with can be provided to each player.

In addition, in the third game example, two virtual cameras are set in a single game space, and two kinds of game images, in which the game space is viewed from each virtual camera, are displayed (FIGS. 26 and 27). Therefore, for the two kinds of game images generated in the third game example, most of the game processing on the game space (control of objects in the game space, etc.) is common, and each game image can be generated simply by performing the drawing process twice on the common game space, so there is an advantage that the processing efficiency is higher than when that game processing is performed separately for each image.

In the third game example, the cursor 143 indicating the pitching direction is displayed only on the terminal device 7 side, so the first player cannot see the position indicated by the cursor 143. Therefore, the game problem of the pitching direction being known to the first player, to the disadvantage of the second player, does not occur. Thus, in the present embodiment, when a game problem would arise for one player if the other player were to see a certain game image, that game image can be displayed on the terminal device 7. As a result, problems such as a loss of strategic depth in the game can be prevented. In another embodiment, depending on the game content (for example, when the above problem does not occur even if the terminal game image is shown to the first player), the game device 3 may display the terminal game image on the television 2 together with the television game image.

(Fourth game example)

The fourth game example will be described below with reference to FIGS. 28 and 29. The fourth game example is a shooting game in which two players cooperate. That is, the first player performs an operation of moving an airplane using the controller 5, and the second player performs an operation of controlling the firing direction of the airplane's cannon using the terminal device 7. In the fourth game example, as in the third game example, game images that make the game operation easy for each player are displayed on the television 2 and the terminal device 7.

FIG. 28 is a diagram showing an example of the television game image displayed on the television 2 in the fourth game example. FIG. 29 is a diagram showing an example of the terminal game image displayed on the terminal device 7 in the fourth game example. As shown in FIG. 28, in the fourth game example, an airplane (airplane object) 151 and a target (balloon object) 153 appear in the virtual game space. The airplane 151 has a cannon (cannon object) 152.

As shown in FIG. 28, an image of the game space including the airplane 151 is displayed as the television game image. The first virtual camera for generating the television game image is set so as to generate an image of the game space as viewed from behind the airplane 151. That is, the first virtual camera is placed at a position behind the airplane 151 such that the airplane 151 is included in the imaging range (viewing range). The first virtual camera is also controlled so as to move with the movement of the airplane 151. That is, in the processing of step S27, the CPU 10 controls the movement of the airplane 151 based on the controller operation data, and also controls the position and attitude of the first virtual camera. In this way, the position and attitude of the first virtual camera are controlled in accordance with the operation of the first player.

On the other hand, as shown in FIG. 29, an image of the game space as viewed from the airplane 151 (more specifically, from the cannon 152) is displayed as the terminal game image. Therefore, the second virtual camera for generating the terminal game image is placed at the position of the airplane 151 (more specifically, the position of the cannon 152). In the processing of step S27, the CPU 10 controls the movement of the airplane 151 based on the controller operation data, and also controls the position of the second virtual camera. The second virtual camera may also be placed at a position near the airplane 151 or the cannon 152 (for example, a position slightly behind the cannon 152). As described above, the position of the second virtual camera is controlled by the operation of the first player (who operates the movement of the airplane 151). Therefore, in the fourth game example, the first virtual camera and the second virtual camera move in conjunction with each other.

In addition, an image of the game space as viewed in the firing direction of the cannon 152 is displayed as the terminal game image. Here, the firing direction of the cannon 152 is controlled so as to correspond to the attitude of the terminal device 7. That is, in the present embodiment, the attitude of the second virtual camera is controlled so that the viewing direction of the second virtual camera coincides with the firing direction of the cannon 152. In the processing of step S27, the CPU 10 controls the orientation of the cannon 152 and the attitude of the second virtual camera in accordance with the attitude of the terminal device 7 calculated in step S24. In this way, the attitude of the second virtual camera is controlled by the operation of the second player. The second player can thus change the firing direction of the cannon 152 by changing the attitude of the terminal device 7.
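
A minimal sketch of this coupling, assuming the attitude of the terminal device 7 is available as a 3x3 rotation matrix (device to world) and taking its forward axis as the firing direction; the axis convention and the camera representation are assumptions, not part of the original disclosure.

```python
import numpy as np

def update_cannon_and_camera(terminal_attitude, cannon_pos):
    """Make the cannon's firing direction, and hence the second virtual
    camera's viewing direction, follow the terminal device's attitude. The
    device's -Z axis is taken (arbitrarily) as its forward direction."""
    forward = terminal_attitude @ np.array([0.0, 0.0, -1.0])
    firing_direction = forward / np.linalg.norm(forward)
    camera = {
        "position": np.asarray(cannon_pos, dtype=float),   # moves with the airplane
        "orientation": terminal_attitude,                  # view matches firing direction
    }
    return firing_direction, camera
```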

When firing a bullet from the cannon 152, the second player presses a predetermined button of the terminal device 7. When the predetermined button is pressed, a bullet is fired in the direction in which the cannon 152 is facing. In the terminal game image, a sight (aim) 154 is displayed at the center of the screen of the LCD 51, and the bullet is fired in the direction indicated by the sight 154.

As described above, in the fourth game example, the first player operates the airplane 151 (for example, moving it in the direction of a desired target 153) while mainly looking at the television game image (FIG. 28), which represents the game space as viewed in the direction of travel of the airplane 151. On the other hand, the second player operates the cannon 152 while mainly looking at the terminal game image (FIG. 29), which represents the game space as viewed in the firing direction of the cannon 152. Thus, in the fourth game example, in a game in which two players cooperate, game images that are easy to see and easy to operate with can be displayed on the television 2 and the terminal device 7, respectively, for each player.

In addition, in the fourth game example, the positions of the first virtual camera and the second virtual camera are controlled by the operation of the first player, and the attitude of the second virtual camera is controlled by the operation of the second player. That is, in the present embodiment, the position or attitude of a virtual camera changes in accordance with each player's game operation, and as a result the display range of the game space displayed on each display device changes. Since the display range of the game space displayed on the display device changes according to each player's operation, each player can feel that his or her game operation is sufficiently reflected in the progress of the game, and can thus fully enjoy the game.

In the fourth game example, a game image viewed from behind the airplane 151 is displayed on the television 2, and a game image viewed from the position of the cannon of the airplane 151 is displayed on the terminal device 7. In another game example, the game device 3 may display the game image viewed from behind the airplane 151 on the terminal device 7 and the game image viewed from the position of the cannon 152 of the airplane 151 on the television 2. In that case, the roles of the players are swapped from those in the fourth game example: the first player operates the cannon 152 using the controller 5, and the second player operates the airplane 151 using the terminal device 7.

(Fifth game example)

The fifth game example will be described below with reference to FIG. 30. The fifth game example is a game in which the player performs operations using the controller 5, and the terminal device 7 is used as a display device rather than as an operation device. Specifically, the fifth game example is a golf game, and in accordance with an operation (swing operation) in which the player swings the controller 5 like a golf club, the game device 3 causes the player character in the virtual game space to perform a golf swing.

FIG. 30 is a diagram showing how the game system 1 is used in the fifth game example. In FIG. 30, an image of the game space including a player character (object) 161 and a golf club (object) 162 is displayed on the screen of the television 2. A ball (object) 163 placed in the game space is also displayed on the television 2, although in FIG. 30 it is hidden behind the golf club 162 and therefore not visible. On the other hand, as shown in FIG. 30, the terminal device 7 is placed on the floor in front of the television 2 so that the screen of the LCD 51 faces vertically upward. The terminal device 7 displays an image representing the ball 163, an image representing a part of the golf club 162 (specifically, the head 162a of the golf club), and an image representing the ground of the game space. The terminal game image is an image of the area around the ball as viewed from above.

When playing the game, the player 160 stands near the terminal device 7 and performs a swing operation, swinging the controller 5 like a golf club. At this time, in step S27, the CPU 10 controls the position and attitude of the golf club 162 in the game space in accordance with the attitude of the controller 5 calculated in the processing of step S23. Specifically, the golf club 162 in the game space is controlled so that it touches the ball 163 when the tip direction of the controller 5 (the Z-axis forward direction shown in FIG. 3) points at the image of the ball 163 displayed on the LCD 51.

In addition, when the tip direction of the controller 5 faces the LCD 51 side, an image (head image) 164 representing a part of the golf club 162 is displayed on the LCD 51 (see FIG. 30). In the terminal game image, the image of the ball 163 may be displayed at actual size to increase the sense of realism, and the orientation of the head image 164 may be displayed so as to rotate in accordance with the rotation of the controller 5 around the Z axis. The terminal game image may be generated using a virtual camera set in the game space, or may be generated using image data prepared in advance. When it is generated using image data prepared in advance, a detailed and realistic image can be generated with a small processing load, without constructing a detailed terrain model of the golf course.

When the golf club 162 is swung by the player 160 performing the swing operation and the golf club 162 touches the ball 163, the ball 163 moves (flies off). That is, in step S27, the CPU 10 determines whether the golf club 162 and the ball 163 have come into contact, and moves the ball 163 when they have. Here, the television game image is generated so that the ball 163 after the movement is included. That is, the CPU 10 controls the position and attitude of the first virtual camera for generating the television game image so that the moving ball is included in its imaging range. On the other hand, on the terminal device 7, when the golf club 162 touches the ball 163, the image of the ball 163 moves and soon disappears from the screen. Therefore, in the fifth game example, the movement of the ball is displayed mainly on the television 2, and the player 160 can confirm with the television game image where the ball sent flying by the swing operation has gone.

As described above, in the fifth game example, the player 160 can swing the golf club 162 by swinging the controller 5 (can cause the player character 161 to swing the golf club 162). Here, in the fifth game example, the golf club 162 in the game space is controlled so that it touches the ball 163 when the tip direction of the controller 5 points at the image of the ball 163 displayed on the LCD 51. Therefore, the player can get the feeling of actually hitting a ball with a golf club through the swing operation, and the swing operation can be made more realistic.

In the fifth game example, the head image 164 is displayed on the LCD 51 when the tip direction of the controller 5 faces the terminal device 7. Therefore, by pointing the tip of the controller 5 toward the terminal device 7, the player can get the feeling that the attitude of the golf club 162 in the virtual space corresponds to the attitude of the controller 5 in real space, and the swing operation can be made more realistic.

As described above, in the fifth game example, when the terminal device 7 is used as a display device, the operation using the controller 5 can be made more realistic by placing the terminal device 7 at an appropriate position.

In the fifth game example, the terminal device 7 is placed on the floor, and the terminal device 7 displays an image representing only the game space around the ball 163. Therefore, the terminal device 7 cannot display the position and attitude of the entire golf club 162 in the game space, and cannot display the ball 163 moving after the swing operation. Therefore, in the fifth game example, the entire golf club 162 is displayed on the television 2 before the ball 163 moves, and the movement of the ball 163 is displayed on the television 2 after the ball 163 starts to move. In this way, according to the fifth game example, a realistic operation can be provided to the player, and easy-to-see game images can be presented to the player by using the two screens of the television 2 and the terminal device 7.

In the fifth game example, the marker portion 55 of the terminal device 7 is used to calculate the attitude of the controller 5. That is, the CPU 10 turns on the marker portion 55 (and does not turn on the marker device 6) in the initial processing of step S1, and in step S23 the CPU 10 calculates the attitude of the controller 5 based on the marker coordinate data 96. According to this, it can be accurately determined whether or not the tip direction of the controller 5 is in an attitude facing the marker portion 55 side. In the fifth game example, the above steps S21 and S22 need not be executed, but in another game example the marker to be lit may be changed during the game by executing the processing of steps S21 and S22. For example, in step S21, the CPU 10 may determine, based on the first acceleration data 94, whether or not the tip direction of the controller 5 faces the gravity direction, and in step S22 the CPU 10 may turn on the marker portion 55 when it faces the gravity direction and turn on the marker device 6 when it does not. According to this, when the tip direction of the controller 5 faces the gravity direction, the attitude of the controller 5 can be calculated with high accuracy by acquiring the marker coordinate data of the marker portion 55, and when the tip of the controller 5 faces the television 2 side, the attitude of the controller 5 can be calculated with high accuracy by acquiring the marker coordinate data of the marker device 6.
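
The marker-switching rule described above could be sketched as follows, assuming the gravity direction in the controller's coordinate system has already been derived from the first acceleration data 94 (as in step S23) and that the controller's Z axis is its tip direction; the threshold value is illustrative, not from the patent.

```python
import numpy as np

def select_marker(gravity_dir_in_controller, threshold=0.8):
    """Choose which marker to light based on whether the controller's tip
    (assumed to be its Z axis) points in the gravity direction; the terminal
    device lies on the floor in this game example, so its marker portion is
    the one the controller sees when pointing downward."""
    tip_axis = np.array([0.0, 0.0, 1.0])
    facing_gravity = float(np.dot(tip_axis, gravity_dir_in_controller)) > threshold
    return "marker_portion_55" if facing_gravity else "marker_device_6"
```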

As described in the fifth game example, the terminal device 7 can be installed at a free position and used as a display device. According to this, when the marker coordinate data is used as game input, the controller 5 can be used facing a free direction by setting the terminal device 7 at a desired position, in addition to being used facing the television 2. That is, according to the present embodiment, since the direction in which the controller 5 can be used is not restricted, the degree of freedom of operation of the controller 5 can be improved.

[7. Other operation example of game system]

As described above, the game system 1 can perform operations for playing various games. The terminal device 7 can be used as a portable display or a second display, and can also be used as a controller for performing touch input or motion input, so a wide variety of games can be played with the game system 1. The following operations, including applications other than games, are also possible.

(Operation example in which the player plays the game using only the terminal device 7)

In the present embodiment, the terminal device 7 functions as a display device and also as an operation device. Therefore, by using the terminal device 7 as display means and operation means without using the television 2 and the controller 5, the terminal device 7 can also be used as a portable game device.

Specifically, in step S3 the CPU 10 obtains the terminal operation data 97 from the terminal device 7, and in step S4 the CPU 10 executes game processing using only the terminal operation data 97 as game input (without using controller operation data). Then, in step S6 a game image is generated, and in step S10 the game image is transmitted to the terminal device 7. At this time, steps S2, S5, and S9 need not be executed. In this way, game processing is performed in accordance with the operation on the terminal device 7, and a game image representing the result of the game processing is displayed on the terminal device 7. Thus, it is also possible to use the terminal device 7 as a portable game device (although the game processing is actually performed by the game device). Therefore, according to the present embodiment, even when a game image cannot be displayed on the television 2 because the television 2 is in use (for example, another person is watching a television broadcast), the user can play the game using the terminal device 7.
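
As an illustration only, one frame of this terminal-only mode can be sketched as follows; the helper callables stand in for the processing of steps S3, S4, S6, and S10 and are not names used in the specification.

def terminal_only_frame(receive_terminal_operation_data, run_game_processing,
                        render_game_image, send_image_to_terminal, game_state):
    op_data = receive_terminal_operation_data()              # step S3
    game_state = run_game_processing(game_state, op_data)    # step S4 (terminal input only)
    image = render_game_image(game_state)                    # step S6
    send_image_to_terminal(image)                            # step S10
    return game_state

# Trivial usage with stub callables, showing only the data flow.
state = terminal_only_frame(
    lambda: {"touch": None},
    lambda s, op: {**s, "frames": s["frames"] + 1},
    lambda s: "frame %d" % s["frames"],
    print,
    {"frames": 0},
)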

In addition, the CPU 10 may transmit not only the game image but also, for example, the image of the menu screen described above, which is displayed after the power is turned on, to the terminal device 7 for display. This is convenient because the player can play a game without using the television 2 from the beginning.

In the above case, it is also possible to change the display device for displaying the game image from the terminal device 7 to the television 2 during the game. Specifically, the CPU 10 may execute step S9 again to output the game image to the television 2. The image output to the television 2 in step S9 is the same as the game image transmitted to the terminal device 7 in step S10. Accordingly, by switching the input of the television 2 so that the input from the game device 3 is displayed, the same game image as that on the terminal device 7 is displayed on the television 2, and the display device for displaying the game image can thus be changed to the television 2. After the game image is displayed on the television 2, the screen display of the terminal device 7 may be turned off.
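
A hedged sketch of this display switch is shown below; the interfaces are assumed, and the only point illustrated is that the same image sent to the terminal device in step S10 is also output to the television in step S9, after which the terminal's screen display may be turned off.

def switch_display_to_television(game_image, output_to_television, send_control_to_terminal):
    output_to_television(game_image)                   # step S9, same image as step S10
    send_control_to_terminal({"screen_on": False})     # optionally blank the terminal's LCD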

In the game system 1, an infrared remote control signal for the television 2 may be output from infrared output means (the marker device 6, the marker portion 55, or the infrared communication module 82). In this way, the game device 3 can perform an operation on the television 2 by outputting the infrared remote control signal from the infrared output means in accordance with an operation on the terminal device 7. In this case, the user can operate the television 2 using the terminal device 7 without operating the remote control of the television 2, which is convenient, for example, when switching the input of the television 2 as described above.
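
The following sketch illustrates, under assumed names, how an operation on the terminal device might be translated into an infrared remote control command; the command mapping and emitter interface are hypothetical and not defined by the specification.

# Hypothetical mapping from terminal-device buttons to television commands.
TV_REMOTE_COMMANDS = {
    "button_up": "volume_up",
    "button_down": "volume_down",
    "button_a": "input_switch",
}

def handle_tv_remote(pressed_button, emit_infrared):
    """emit_infrared(command) stands in for driving the marker device 6,
    the marker portion 55, or the infrared communication module 82."""
    command = TV_REMOTE_COMMANDS.get(pressed_button)
    if command is not None:
        emit_infrared(command)

handle_tv_remote("button_a", print)   # stub emitter simply prints "input_switch"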

(Operation example of communicating with another device via a network)

As described above, since the game device 3 has a function of connecting to a network, the game system 1 can also be used to communicate with an external device via a network. FIG. 31 is a diagram showing the connection relationship between the devices included in the game system 1 in the case of connecting to an external device via a network. As shown in FIG. 31, the game device 3 can communicate with an external device 191 through a network 190.

As described above, when the external device 191 and the game device 3 can communicate with each other, the game system 1 can communicate with the external device 191 using the terminal device 7 as an interface. For example, the game system 1 can be used as a videophone by transmitting and receiving images and audio between the external device 191 and the terminal device 7. Specifically, the game device 3 receives images and voices (the images and voices of the call partner) from the external device 191 through the network 190, and transmits the received images and voices to the terminal device 7. As a result, the terminal device 7 displays the image from the external device 191 on the LCD 51 and outputs the voice from the external device 191 from the speaker 77. In addition, the game device 3 receives, from the terminal device 7, the camera image captured by the camera 56 and the microphone sound detected by the microphone 79, and transmits the camera image and the microphone sound to the external device 191 via the network 190. By repeating such transmission and reception of images and sound with the external device 191, the game device 3 can cause the game system 1 to be used as a videophone.
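
One relay cycle of this videophone use can be sketched as follows; the device objects and their methods are assumed interfaces, not part of the specification, and repeating the call corresponds to the repeated transmission and reception described above.

def relay_call_once(external_device, terminal_device):
    # Downstream: the call partner's image and voice, received via the network 190,
    # are forwarded to the terminal device 7 (LCD 51 and speaker 77).
    partner_image, partner_voice = external_device.receive_av()
    terminal_device.send_av(partner_image, partner_voice)

    # Upstream: the camera 56 image and the microphone 79 sound from the terminal
    # device 7 are forwarded to the external device 191.
    camera_image, mic_voice = terminal_device.receive_av()
    external_device.send_av(camera_image, mic_voice)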

In addition, in this embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at a free position and point the camera 56 in a free direction. Furthermore, in this embodiment, since the terminal device 7 is provided with the touch panel 52, the game device 3 can also transmit input information on the touch panel 52 (the touch position data 100) to the external device 191. For example, by outputting images and audio from the external device 191 on the terminal device 7 and transmitting characters or the like written on the touch panel 52 to the external device 191, it is also possible to use the game system 1 as a so-called e-learning system.

(Operation example in conjunction with TV broadcasting)

In addition, when a television broadcast is being watched on the television 2, the game system 1 can also operate in conjunction with the television broadcast. In other words, when a television program is being watched on the television 2, the game system 1 can cause the terminal device 7 to output information about that television program and the like. An operation example in which the game system 1 operates in conjunction with television broadcasting will be described below.

In the above operation example, the game device 3 can communicate with a server via a network (in other words, the external device 191 shown in FIG. 31 is a server). The server stores various pieces of information related to television broadcasting (television information) for each channel of television broadcasting. The television information may be information about a program, such as subtitles or performer information, or may be EPG (electronic program guide) information or information broadcast as a data broadcast. The television information may be images, audio, text, or a combination thereof. The server need not be a single server; a server may be provided for each channel or each program of television broadcasting, and the game device 3 may communicate with each server.

When the video and audio of a television broadcast are being output on the television 2, the game device 3 causes the user to input, using the terminal device 7, the channel of the television broadcast being watched. The game device 3 then requests the server through the network to transmit television information corresponding to the input channel. In response, the server transmits data of the television information corresponding to that channel. On receiving the data transmitted from the server, the game device 3 outputs the received data to the terminal device 7. The terminal device 7 displays the image and text data among the received data on the LCD 51 and outputs the audio data from the speaker. In this way, the user can enjoy information about the television program currently being watched and the like using the terminal device 7.
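
A sketch of this exchange, under assumed interfaces, is given below: the channel input by the user is sent to the server, and the returned television information is forwarded to the terminal device 7 for display and audio output. The data layout and function names are illustrative only.

def show_program_info(channel, request_tv_info, terminal_device):
    """request_tv_info(channel) stands in for the request to the server over the
    network; it is assumed to return a dict that may contain image, text, and
    audio data for the input channel."""
    tv_info = request_tv_info(channel)
    terminal_device.display(images=tv_info.get("images"), text=tv_info.get("text"))
    if tv_info.get("audio") is not None:
        terminal_device.play_audio(tv_info["audio"])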

As described above, by communicating with an external device (server) via a network, the game system 1 can provide the user, via the terminal device 7, with information linked with television broadcasting. In particular, in this embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at a free position, which is highly convenient.

As described above, in the present embodiment, the user can use the terminal device 7 for various uses and in various forms in addition to game play.

[8. Modification]

The above embodiment is one example of carrying out the present invention; in other embodiments, the present invention can also be carried out with, for example, the configurations described below.

(Modified example having a plurality of terminal devices)

In the above embodiment, the game system 1 is configured to have only one terminal device, but the game system 1 may be configured to have a plurality of terminal devices. That is, the game device 3 may be capable of wireless communication with each of a plurality of terminal devices, transmitting game image data, game voice data, and control data to each terminal device and receiving operation data, camera image data, and microphone sound data from each terminal device. When the game device 3 performs wireless communication with each of the plurality of terminal devices, it may do so by time division, or it may divide the frequency band.
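
By way of illustration only, communication with several terminal devices by time division might be organized as in the sketch below; the slot scheduling and the terminal interface are assumptions, and the frequency-band division mentioned above would replace the slot wait.

def serve_terminals_one_frame(terminals, outgoing, wait_for_slot):
    """terminals: list of terminal-device handles; outgoing: dict mapping a
    terminal id to the (image, voice, control) data to send this frame."""
    received = {}
    for slot, terminal in enumerate(terminals):
        wait_for_slot(slot)                           # wait for this terminal's time slot
        terminal.send(*outgoing[terminal.id])         # game image, game voice, control data
        received[terminal.id] = terminal.receive()    # operation, camera image, microphone data
    return received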

In the case of having a plurality of terminal devices as described above, more types of games can be played using the game system. For example, if the game system 1 has two terminal devices, the game system 1 has three display devices, so game images for each of three players can be generated and displayed on the respective display devices. In addition, when the game system 1 has two terminal devices, two players can simultaneously play a game (for example, the fifth game example) that uses a controller and a terminal device as one set. Furthermore, when the game process of step S27 is performed based on the marker coordinate data output from two controllers, two players can each perform the game operation of pointing a controller toward a marker (the marker device 6 or the marker portion 55). That is, one player can play the game with the controller facing the marker device 6, and the other player can play the game with the controller facing the marker portion 55.

(Modifications Regarding Functions of Terminal Devices)

In the above embodiment, the terminal device 7 functions as a so-called thin client terminal that does not execute game processing. However, in another embodiment, part of the series of game processes executed by the game device 3 in the above embodiment may be executed by another device such as the terminal device 7. For example, the terminal device 7 may execute some of the processing (for example, the process of generating the terminal game image). That is, the terminal device may function as a portable game device that performs game processing based on operations on its operation unit, generates a game image based on the game processing, and displays it on its display unit. Further, for example, in a game system having a plurality of information processing apparatuses (game apparatuses) that can communicate with each other, the plurality of information processing apparatuses may share the game processing to be executed.
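
The following is a rough sketch, with all names assumed, of such a work split: the game device keeps the game processing, while the terminal device generates its own terminal game image locally from a compact state update rather than receiving a compressed image.

def game_device_step(game_state, step_game, terminal_link):
    op_data = terminal_link.receive_operation_data()
    game_state = step_game(game_state, op_data)
    terminal_link.send_state_update(game_state["terminal_view"])   # small state, not pixels
    return game_state

def terminal_device_step(terminal_link, render_terminal_image, display):
    view_state = terminal_link.receive_state_update()
    display(render_terminal_image(view_state))    # the image is generated on the terminal itself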

(Modifications Regarding Configuration of Terminal Device)

The terminal device in the above embodiment is merely an example; the shapes of the operation buttons and the housing 50, the number of components, their installation positions, and the like are only examples, and other shapes, numbers, and installation positions may be used. For example, the terminal device may have the configuration shown below. Hereinafter, modifications of the terminal device will be described with reference to FIGS. 32 to 35.

FIG. 32 is a diagram showing the external configuration of a terminal device according to a modification of the above embodiment. In FIG. 32, (a) is a front view of the terminal device, (b) is a top view, (c) is a right side view, and (d) is a bottom view. FIG. 33 is a diagram illustrating a state in which a user grips the terminal device shown in FIG. 32. In FIG. 32 and FIG. 33, components corresponding to the components of the terminal device 7 in the above embodiment are given the same reference numerals as in FIG. 8, but they need not be configured identically.

As shown in FIG. 32, the terminal device 8 includes a generally plate-shaped housing 50 that is horizontally long and substantially rectangular. The housing 50 is of a size that can be gripped by the user. Therefore, the user can hold and move the terminal device 8 and can change its arrangement position.

The terminal device 8 has an LCD 51 on the front surface of the housing 50. The LCD 51 is provided near the center of the front surface of the housing 50. Therefore, the user can hold and move the terminal device while viewing the screen of the LCD 51 by holding the housing 50 on both sides of the LCD 51, as shown in FIG. 33. Although FIG. 33 shows an example in which the user holds the terminal device 8 horizontally (in a landscape orientation) by holding the housing 50 on the left and right sides of the LCD 51, the terminal device 8 can also be held vertically (in a portrait orientation).

As shown in FIG. 32(a), the terminal device 8 has a touch panel 52 on the screen of the LCD 51 as an operation means (operation unit). In this modification, the touch panel 52 is a resistive touch panel, but the touch panel is not limited to the resistive type; any type of touch panel, such as a capacitive type, can be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In this modification, the touch panel 52 has the same resolution (detection accuracy) as the LCD 51, but the resolution of the touch panel 52 and the resolution of the LCD 51 do not necessarily need to match. Input to the touch panel 52 is usually performed using a touch pen, but input is not limited to the touch pen and can also be performed with the user's finger. The housing 50 may be provided with an accommodating hole for accommodating the touch pen used for operating the touch panel 52. Since the terminal device 8 includes the touch panel 52 in this way, the user can operate the touch panel 52 while moving the terminal device 8. That is, the user can directly make inputs to the screen (by means of the touch panel 52) while moving the screen of the LCD 51.

As shown in FIG. 32, the terminal device 8 includes two analog sticks 53A and 53B and a plurality of buttons 54A to 54L as operation means (operation units). The analog sticks 53A and 53B are devices for indicating a direction. Each of the analog sticks 53A and 53B is configured so that its stick portion, operated by the user's finger, can be slid or tilted in any direction (at any angle in the up, down, left, right, and oblique directions) with respect to the front surface of the housing 50. The left analog stick 53A is provided on the left side of the screen of the LCD 51, and the right analog stick 53B is provided on the right side of the screen of the LCD 51. Thus, the user can make an input indicating a direction using an analog stick with either the left or the right hand. Further, as shown in FIG. 33, each analog stick 53A and 53B is provided at a position where it can be operated while the user grips the left and right portions of the terminal device 8, so the user can easily operate the analog sticks 53A and 53B even when holding and moving the terminal device 8.

Each of the buttons 54A to 54L is an operation means for performing a predetermined input. As described below, each of the buttons 54A to 54L is provided at a position where it can be operated while the user grips the left and right portions of the terminal device 8 (see FIG. 33). Therefore, the user can easily operate these operation means even when holding and moving the terminal device 8.

As shown in FIG. 32(a), among the operation buttons 54A to 54L, a cross button (direction input button) 54A and buttons 54B to 54H are provided on the front surface of the housing 50. That is, these buttons 54A to 54H are arranged at positions operable by the thumbs of the user (see FIG. 33).

The cross button 54A is provided on the left side of the LCD 51 and below the left analog stick 53A. That is, the cross button 54A is arranged at a position operable by the user's left hand. The cross button 54A is cross-shaped and is a button with which the up, down, left, and right directions can be indicated. The buttons 54B to 54D are provided below the LCD 51. These three buttons 54B to 54D are arranged at positions operable with either the left or the right hand. The four buttons 54E to 54H are provided on the right side of the LCD 51 and below the right analog stick 53B. That is, the four buttons 54E to 54H are arranged at positions operable by the user's right hand. Furthermore, the four buttons 54E to 54H are arranged so as to be in an up, down, left, and right positional relationship (with respect to the center position of the four buttons 54E to 54H). Accordingly, the terminal device 8 can also cause the four buttons 54E to 54H to function as buttons with which the user indicates the up, down, left, and right directions.

In addition, as shown in (a), (b), and (c) of FIG. 32, the first L button 54I and the first R button 54J are provided on obliquely upper portions (the upper left portion and the upper right portion) of the housing 50. Specifically, the first L button 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 and is exposed from the upper side surface and the left side surface. The first R button 54J is provided at the right end of the upper side surface of the housing 50 and is exposed from the upper side surface and the right side surface. In this manner, the first L button 54I is arranged at a position operable by the index finger of the user's left hand, and the first R button 54J is arranged at a position operable by the index finger of the user's right hand (see FIG. 9).

In addition, as shown in FIG. 32(b) and FIG. 32(c), the second L button 54K and the second R button 54L are provided on foot portions 59A and 59B formed so as to protrude from the rear surface of the plate-shaped housing 50 (that is, the surface opposite to the surface on which the LCD 51 is provided). Like the cover part 59 of the above embodiment, each of the foot portions 59A and 59B is provided in a region including the position opposite to the operation unit (the analog stick 53A or 53B) provided on the left or right of the display unit. The second L button 54K is provided somewhat toward the upper part of the left side of the rear surface (the left side as viewed from the front surface side) of the housing 50, and the second R button 54L is provided somewhat toward the upper part of the right side of the rear surface (the right side as viewed from the front surface side). In other words, the second L button 54K is provided at a position roughly opposite the left analog stick 53A provided on the front surface, and the second R button 54L is provided at a position roughly opposite the right analog stick 53B provided on the front surface. In this way, the second L button 54K is arranged at a position operable by a finger of the user's left hand, and the second R button 54L is arranged at a position operable by a finger of the user's right hand (see FIG. 9). Further, as shown in FIG. 32(c), the second L button 54K and the second R button 54L are provided on the obliquely upward-facing surfaces of the foot portions 59A and 59B and have button surfaces facing obliquely upward. Since it is considered that the middle fingers move in the up-and-down direction when the user grips the terminal device 8, turning the button surfaces upward makes it easier for the user to press the second L button 54K and the second R button 54L. In addition, providing the foot portions on the rear surface of the housing 50 makes it easier for the user to grip the housing 50, and providing buttons on the foot portions makes it easier to operate while gripping the housing 50.

In addition, in the terminal device 8 shown in FIG. 32, since the second L button 54K and the second R button 54L are provided on the rear surface, when the terminal device 8 is placed with the screen of the LCD 51 (the front surface of the housing 50) facing upward, the screen may not be completely horizontal. Therefore, in another embodiment, three or more foot portions may be formed on the rear surface of the housing 50. In that case, in a state where the screen of the LCD 51 faces upward, the foot portions contact the floor surface, so the terminal device 8 can be placed so that the screen is horizontal. The terminal device 8 may also be placed horizontally by attaching a detachable foot portion.

To each of the buttons 54A to 54L, a function corresponding to the game program is appropriately assigned. For example, the cross button 54A and the buttons 54E to 54H may be used for direction indication operations, selection operations, and the like, and the buttons 54B to 54E may be used for decision operations, cancellation operations, and the like.

In addition, although not shown, the terminal device 8 has a power button for turning the power of the terminal device 8 on and off. The terminal device 8 may also have a button for turning the screen display of the LCD 51 on and off, a button for setting the connection (pairing) with the game device 3, and a button for adjusting the volume of the speaker (the speaker 77 shown in FIG. 10).

As shown in FIG. 32(a), the terminal device 8 includes, on the front surface of the housing 50, a marker portion (the marker portion 55 shown in FIG. 10) consisting of a marker 55A and a marker 55B. The marker portion 55 is provided above the LCD 51. The markers 55A and 55B are each composed of one or more infrared LEDs, like the markers 6R and 6L of the marker device 6. The marker portion 55 is used by the game device 3 to calculate the movement of the controller 5 and the like, in the same manner as the marker device 6 described above. The game device 3 can also control the lighting of each infrared LED included in the marker portion 55.

The terminal device 8 includes the camera 56, which is an imaging means. The camera 56 includes an imaging element (for example, a CCD image sensor or a CMOS image sensor) having a predetermined resolution, and a lens. As shown in FIG. 32, in this modification, the camera 56 is provided on the front surface of the housing 50. Therefore, the camera 56 can image the face of the user holding the terminal device 8; for example, it can image the user playing a game while looking at the LCD 51.

In addition, the terminal device 8 includes a microphone (the microphone 79 shown in FIG. 10), which is a voice input means. A microphone hole 50c is formed in the front surface of the housing 50. The microphone 79 is provided inside the housing 50 behind the microphone hole 50c. The microphone detects sounds around the terminal device 8, such as the user's voice.

The terminal device 8 includes a speaker (the speaker 77 shown in FIG. 10), which is an audio output means. As shown in FIG. 32(d), a speaker hole 57 is formed in the lower side surface of the housing 50. The output sound of the speaker 77 is output from this speaker hole 57. In this modification, the terminal device 8 includes two speakers, and a speaker hole 57 is formed at the position of each of the left speaker and the right speaker.

The terminal device 8 also includes an expansion connector 58 for connecting another device to the terminal device 8. In the present modification, as shown in FIG. 32(d), the expansion connector 58 is provided on the lower side surface of the housing 50. The other device connected to the expansion connector 58 may be of any type, for example, an input device such as a controller (gun controller or the like) or a keyboard used for a specific game. If it is not necessary to connect another device, the expansion connector 58 need not be provided.

In addition, with respect to the terminal device 8 shown in FIG. 32, the shapes of the operation buttons and the housing 50, the number of components, the installation positions, and the like are merely examples, and other shapes, numbers, and installation positions may be used.

As described above, in this modification, the two foot portions 59A and 59B provided at positions on both the left and right sides of the rear surface of the housing 50 are provided as projections. In this case as well, as in the above embodiment, the user can comfortably grip the terminal device 8 by holding it with the ring finger or middle finger resting on the lower surfaces of the projections (see FIG. 33). Furthermore, as in the above embodiment, since the second L button 54K and the second R button 54L are provided on the upper surfaces of the projections, the user can easily operate these buttons in that state.

As in the above-described embodiment and modifications, the projection is preferably formed on the rear surface side of the housing so as to protrude at a position above the center of the housing and at least at positions on both the left and right sides. In that case, when the user grips both the left and right sides of the housing, the projection can be caught on the fingers, so the terminal device can be gripped comfortably. Furthermore, since the projection is formed on the upper side, the user can also support the housing with the palm (see FIG. 10 and the like), so the terminal device can be gripped reliably.

In addition, the projection need not be formed above the center of the housing. For example, in the case where operation units are provided on the left and right sides of the display unit, respectively, the projection may be provided at a position where it can be caught by any finger other than the thumbs while the user grips the housing in such a way that each operation unit can be operated by the thumbs of both hands. In this way, too, the user can comfortably grip the terminal device by catching a finger on the projection.

FIGS. 34 and 35 are diagrams showing the external configuration of a terminal device according to another modification of the above embodiment. FIG. 34 is a right side view of the terminal device, and FIG. 35 is a bottom view. The terminal device 9 shown in FIGS. 34 and 35 is the same as the terminal device 7 in the above embodiment except that it includes convex portions 230a and 230b. Hereinafter, the configuration of the terminal device 9 in this modification will be described focusing on the differences from the above embodiment.

The convex portions 230a and 230b have a convex cross section and are formed on both the left and right sides of the rear surface side of the housing 50. Here, the convex portion 230a is formed on the left side of the housing 50 (the left side as viewed from the front surface side), and the convex portion 230b is formed on the right side of the housing 50 (the right side as viewed from the front surface side). As shown in FIG. 35, the convex portions 230a and 230b are formed on both the left and right sides (both end portions) of the housing 50. Each of the convex portions 230a and 230b is formed below the projection (the cover part 59), at an interval from the projection. That is, in the housing 50, the portion between each convex portion 230a or 230b and the projection is thinner than these portions. Each of the convex portions 230a and 230b has a shape extending in the vertical direction, with a cross section perpendicular to the vertical direction that is convex.

In this modification, the user can grip the terminal device 9 more reliably by gripping it with the little finger (and the ring finger) wrapped around the convex portions 230a and 230b. That is, the convex portions 230a and 230b have the function of grip portions. The convex portion (grip portion) may have any shape, but it is preferably formed so as to extend in the vertical direction, because this makes the terminal device 9 easy to grip. The height of each convex portion 230a and 230b may be any height, but they may be formed lower than the projection. In that case, when the terminal device 9 is placed with the screen of the LCD 51 facing upward, the lower side of the screen is lower than the upper side, so the terminal device 9 can be placed in a state that is easy to view. In addition, since the convex portions 230a and 230b are formed at an interval from the projection, the user can hold the terminal device 9 with a finger placed on the lower surface of the projection, and the convex portions do not interfere with that finger. As described above, according to this modification, by forming the convex portions below the projection, the user can grip the terminal device more reliably. In another embodiment, the projection may not be formed on the rear surface of the housing 50; in that case, the user can reliably grip the housing 50 by means of the convex portions (grip portions). The surfaces of the convex portions (grip portions) may also be made of a non-slip material to further enhance the grip function. Even when there are no convex portions, a non-slip material may be used on the rear surface of the housing.

(Modifications Regarding Apparatus Applying This Configuration)

In the above embodiment, a terminal device used together with a stationary game device was described as an example, but the configuration of the operation device described in this specification can be applied to any device that a user grips and uses. For example, the operation device may be realized as an information terminal such as a portable game machine, a mobile phone, a smartphone, or an electronic book terminal.

As described above, the present invention can be used, for example, as an operation device (terminal device) or the like in a game system, for the purpose of enabling the user to easily grip the same.

1: game system
2: television
3: game device
4: optical disc
5: controller
6: marker device
7 to 9: terminal device
10: CPU
11e: internal main memory
12: external main memory
51: LCD
52: touch panel
53: analog stick
54: operation button
55: marker portion
56: camera
59: cover part
62: magnetic sensor
63: acceleration sensor
64: Gyro sensor
200: input device
210: stand
230: convex portion

Claims (19)

  1. An operating device comprising:
    a plate-shaped housing;
    a display unit provided on the front surface side of the housing;
    a first operation unit and a second operation unit which are provided on the left and right sides of the display unit, respectively, above the center of the housing; and
    a third operation unit and a fourth operation unit which are provided on the rear surface side of the housing at positions opposite to the first operation unit and the second operation unit, respectively.
  2. The operating device according to claim 1, further comprising a projection which is provided on the rear surface side of the housing and protrudes at least at positions on both the left and right sides,
    wherein the third operation unit and the fourth operation unit are arranged on an upper surface of the projection.
  3. The operating device according to claim 2, wherein a first locking hole in which an additional device different from the operating device can be engaged is provided in a lower surface of the projection.
  4. The operating device according to claim 3, wherein a second locking hole in which the additional device can be engaged is provided in a lower surface of the housing.
  5. The operating device according to claim 2, further comprising convex portions which have a convex cross section and are provided below the projection on both the left and right sides of the rear surface of the housing.
  6. The operating device according to claim 5, wherein the projection and the convex portions are formed at an interval from each other.
  7. The operating device according to claim 1, further comprising grip portions provided on both left and right sides of the rear surface of the housing.
  8. The operating device according to claim 1, wherein the first operation unit and the second operation unit are each a direction input unit having a movable member that can be slid or tilted.
  9. The operating device according to claim 1, wherein the third operation unit and the fourth operation unit are pressable keys, respectively.
  10. The operating device according to claim 1, further comprising a fifth operation unit arranged below the first operation unit on the front surface of the housing,
    and a sixth operation unit arranged below the second operation unit on the front surface of the housing.
  11. The operating device according to claim 10, wherein the fifth operation unit is a key with which at least the four directions of up, down, left, and right can be input,
    and the sixth operation unit includes a plurality of pressable keys.
  12. The operation device according to claim 1, further comprising a touch panel provided on a screen of the display unit.
  13. The operating device according to claim 1, further comprising an inertial sensor inside the housing.
  14. The operating device according to claim 1, further comprising a communication unit which wirelessly transmits operation data representing an operation performed on the operating device itself to a game device.
  15. The operating device according to claim 14, wherein the communication unit receives image data transmitted from the game device,
    the operating device further comprising a display control unit which displays the received image data on the display unit.
  16. The operating device according to claim 1, further comprising a game processing unit which executes game processing based on an operation performed on the operating device itself,
    and a display control unit which generates a game image based on the game processing and displays the game image on the display unit.
  17. The operating device according to claim 1, wherein the display portion has a screen of 5 inches or more.
  18. An operating system comprising the operating device according to claim 4,
    and an additional device having hook portions which can be locked in the first locking hole and the second locking hole, respectively, the additional device being connected to the operating device by the hook portions being locked in the first and second locking holes.
  19. An operating system comprising the operating device according to claim 4,
    and a support device having a guide member insertable into the second locking hole and a support member which supports the rear surface of the housing at a predetermined angle when the guide member is inserted into the second locking hole.