JP5622372B2 - Image display system and lighting device - Google Patents

Image display system and lighting device

Info

Publication number
JP5622372B2
JP5622372B2
Authority
JP
Japan
Prior art keywords
light
means
image
game
unit
Legal status
Active
Application number
JP2009209319A
Other languages
Japanese (ja)
Other versions
JP2011056061A (en)
Inventor
誠宏 近藤
武 流田
英次 川井
Original Assignee
任天堂株式会社
Application filed by 任天堂株式会社
Priority to JP2009209319A
Priority claimed from US12/877,547
Publication of JP2011056061A
Application granted
Publication of JP5622372B2
Status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25: Output arrangements for video game devices
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHTING NOT OTHERWISE PROVIDED FOR
    • H05B37/00: Circuit arrangements for electric light sources in general
    • H05B37/02: Controlling

Description

  The present invention relates to an image display system that displays an image on a display screen, and more particularly to an image display system that provides a user with a further visual effect in addition to an image.

  Conventionally, there are game systems that display a game image or the like on a television receiver (hereinafter simply referred to as “television”). For example, Patent Document 1 discloses a game system in which a game operation is performed by moving a controller held in the user's hand, and a game image that changes according to the game operation is displayed on the television. In this game system, a device (marker unit) that emits infrared light is installed around the television as an accessory of the game device body. The marker unit is used by the game device to calculate the movement of the controller: the game device calculates the movement of the controller based on the position of the marker unit (its infrared light) in the image captured by the camera of the controller.

JP 2008-125614 A

  In the conventional game system, the result of the game process is presented only as a game image on the screen of the television. For this reason, only the game image displayed on the screen can give a visual effect to the user, and there is a limit to the realism and impact of the game. Further, although accessory devices are installed around the television, as in the game system of Patent Document 1, there is no accessory device that gives a visual effect to the user together with the television.

  Therefore, an object of the present invention is to provide an image display system capable of giving a user a further visual effect in addition to an image displayed on a screen.

  The present invention employs the following configurations (1) to (21) in order to solve the above problems.

(1)
The present invention is an image display system that displays an image on a display screen. The image display system includes light emitting means, light projecting means, display control means, and light projection control means. The light emitting means emits infrared light. The light projecting means projects visible light. The display control means executes predetermined information processing based on the detection result of the infrared light, and controls the display of an image on the display screen. The light projection control means controls light projection by the light projecting means.

Here, the “image display system” is composed of one or more devices; it has a function of displaying an image on a display screen (display device) but need not include the display device itself. The “display screen” is the display screen of the television 2 connected to the game apparatus 3 in the embodiment described later, but may be, for example, the monitor of a personal computer. The “image” includes both still images and moving images.
The “light emitting means” is a concept that includes, in addition to the infrared LEDs (markers 6R and 6L) of the embodiment described later, various light emitting devices that emit infrared light.
The “light projecting means” is a concept that includes, in addition to the optical modules of the embodiment described later, various light emitting devices that project visible light onto an arbitrary place.
The “display control means” is the CPU 10 (and the GPU 11b) of the game apparatus 3 in the embodiment described later, but may be any device that has a function of controlling display on the display device. The “predetermined information processing” is a concept that includes, in addition to the game processing of the embodiment described later, arbitrary processing that uses an infrared light detection result as an input.
The “light projection control means” is the CPU 10 or the input/output processor 11a of the game apparatus 3, or the microcomputer 28 of the lighting device 9, in the embodiment described later, but may be anything that controls light emission by the light projecting means.

  According to the configuration of (1), the image display system displays an image on the display screen and projects light (visible light) onto the surroundings with the light projecting means. Therefore, a lighting effect by the light projected onto the surroundings can be added to the image, and a further visual effect can be given to the user in addition to the image.
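  As an illustration only, the division of roles in configuration (1) can be sketched as follows. The class and method names are assumptions made for this sketch; the patent defines the means functionally, not as a concrete API.

    class ImageDisplaySystem:
        """Illustrative sketch of the four means of configuration (1)."""

        def emit_infrared(self):
            # Light emitting means: emits infrared light (e.g. the markers 6R and 6L).
            return {"kind": "infrared"}

        def display_control(self, infrared_detection):
            # Display control means: predetermined information processing based on
            # the infrared detection result, producing the image for the screen.
            game_state = {"pointer": infrared_detection}
            return game_state

        def light_projection_control(self, game_state):
            # Light projection control means: decides the visible light that the
            # light projecting means projects onto the surroundings.
            return "warm white" if game_state.get("pointer") else "off"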

(2)
The image display system may further include an input device including imaging means capable of detecting infrared light. The display control means executes the predetermined information processing based on the position of the infrared light imaged by the imaging means.

  Here, the “input device” is the controller 5 in the embodiment described later, but it need not include the operation buttons or the acceleration sensor 37 as the controller 5 does; it suffices for it to include at least imaging means. The “imaging means” is a solid-state imaging device such as a CMOS sensor or a CCD sensor in the embodiment described later, but may be any imaging device capable of detecting infrared light.

  According to the configuration of (2), since the predetermined information processing is executed with the position of the infrared light imaged by the imaging means as an input, the user can perform input by operating the input device including the imaging means.

(3)
The display control means may display, on the display screen, a game image obtained as a result of executing game processing as the predetermined information processing. At this time, the light projection control means controls light projection by the light projecting means in accordance with the game processing.

  According to the configuration of (3), the image display system can add a lighting effect by the light projecting means to the game image, so the realism and impact of the game can be further improved.

(4)
The light projection control unit may change the light projection by the light projection unit according to the image on the display screen.

  According to the configuration of (4), since the image display system can change the light projected by the light projecting unit according to the display image, it is possible to add a lighting effect suitable for the image.

(5)
The image display system may further include an operation reception unit that receives a user operation. At this time, the light projection control unit changes the light projection by the light projection unit according to the user operation received by the operation reception unit.

  The “operation reception means” is a concept that includes, in addition to the controller 5 of the embodiment described later, any input device that receives user operations.

  According to the configuration of (5), the image display system can change the light projected by the light projecting means according to the user's operation, so a lighting effect suitable for the user's operation can be added.

(6)
The image display system may further include audio output means for outputting sound based on predetermined information processing. At this time, the light projection control means changes the light projection by the light projection means according to the sound output by the sound output means.

  In the embodiment described later, the “sound output means” is the speaker 2a built into the display device (television 2). However, the sound output means may be anything that has a function of outputting sound, and may be configured separately from the display device.

  According to the configuration of (6), since the image display system can change the light projected by the light projecting means according to the output sound, it is possible to add an illumination effect that matches the output sound.

(7)
The light emitting means may be installed so as to emit infrared light toward the front of the display screen. At this time, the light projecting means is installed so as to emit visible light toward the rear of the display screen.

  Note that the above “emitted toward the front of the display screen” does not mean “emitted in a direction strictly perpendicular to the display screen”; it means that the emission direction faces the front side rather than being parallel to the display screen. Similarly, “emitted toward the rear of the display screen” means that the emission direction faces the rear side rather than being parallel to the display screen; as in the embodiment described later, this includes the case where light is emitted slightly upward toward the rear of the display screen.

  According to the configuration of (7), when the light emitting means is directed toward the front of the display screen, the light projecting means emits visible light toward the rear of the display screen. Therefore, when the display device is arranged in front of a wall surface, visible light can be projected onto the wall surface behind the display device. When viewed from the front of the display device, the visible light is then projected around the display screen (see, for example, FIG. 15), and a lighting effect can be effectively added to the image on the display screen.
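  The geometric reading of “toward the front” and “toward the rear” given above can be expressed with a dot product. A minimal sketch; the vector conventions are assumptions for illustration:

    def facing(direction, screen_normal=(0.0, 0.0, 1.0)):
        """Classify an emission direction relative to the display screen.

        The screen normal is assumed to point toward the viewer. Any direction
        with a positive component along it faces the front, even if it is far
        from perpendicular to the screen; a negative component faces the rear.
        """
        s = sum(d * n for d, n in zip(direction, screen_normal))
        return "front" if s > 0 else "rear" if s < 0 else "parallel"

    # A rearward beam aimed slightly upward still counts as "toward the rear":
    print(facing((0.0, 0.3, -1.0)))  # -> rear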

(8)
The image display system may further include a first housing and a second housing. The first housing has light emitting means therein. The second housing is configured separately from the first housing, and has light projecting means therein.

  According to the configuration of (8), since the first housing having the light emitting means and the second housing having the light projecting means are configured separately, the light emitting means and the light projecting means can be installed freely. The environment in which the display device (television) is installed varies from home to home, but by increasing the degree of freedom of installation, the light emitting means and the light projecting means can be arranged appropriately according to the environment.

(9)
In (8) above, the second housing may be detachable from the first housing.

  According to the configuration of (9), the casing having the light emitting means and the casing having the light projecting means can be installed as a unit, so that a user-friendly system can be provided.

(10)
In (9) above, the second housing may be mounted on the first housing such that the light projecting means emits visible light in the direction opposite to the direction in which the light emitting means emits infrared light.

  According to the configuration of (10), as in (7) above, when the light emitting means is directed toward the front of the display screen, the light projecting means emits visible light toward the rear of the display screen. Therefore, when the display device is arranged in front of a wall surface, visible light is projected around the display screen as viewed from the front of the display device, as in (7) above, and a lighting effect can be effectively added to the image on the display screen.

(11)
The image display system may further include a housing having a light emitting unit and a light projecting unit therein.

  According to the configuration of (11), since the light emitting means and the light projecting means form an integrated device, the trouble of installing them separately is saved, and a system that is easy for the user to use can be provided.

(12)
In the above (11), the light projecting means may be arranged in the housing such that the light projecting means emits visible light in a direction opposite to the infrared light emitting direction by the light emitting means.

  According to the configuration of (12), as in (7) and (10) above, when the light emitting means is directed toward the front of the display screen, the light projecting means emits visible light toward the rear of the display screen. Therefore, when the display device is arranged in front of a wall surface, an illumination effect can be effectively added to the image on the display screen, as in (7) and (10) above.

(13)
The light projecting means may emit a plurality of visible lights, each having an elongated light cross section.

  Note that “having an elongated light cross section” means that the shape formed when the light strikes a surface perpendicular to its traveling direction is elongated.

  According to the configuration of (13), since the light projecting means projects a plurality of linear lights, the number of projection patterns can be increased compared with projecting only one light, and the variations of the illumination effect can be increased.

(14)
The light projecting means may include first emitting means and second emitting means. The first emitting means emits a plurality of visible lights in different directions. The second emitting means emits visible light having a wider cross section than those visible lights, in a direction overlapping the plurality of visible lights emitted by the first emitting means.

  Here, the “first emitting means” corresponds to the linear light modules 61 to 67 that emit linear light in the embodiment described later, but it is not limited to means that emit linear light; any member that emits a plurality of visible lights may be used. The “second emitting means” is the background light module 68 that emits background light in the embodiment described later, but may be anything that emits visible light having a wider cross section than the visible light emitted by the first emitting means.

  According to the configuration of (14), a plurality of lights from the first emitting means and light from the second emitting means are projected onto the wall surface around the light projecting means. According to this, since two types of light are projected, the number of projection patterns can be increased as compared with the case where one type of light is projected, and variations in illumination effects can be increased.

(15)
The light projecting means may include a light emitting member that can emit light of a plurality of types of colors. At this time, the light projection control means controls at least the color to be emitted by the light emitting member.

  In the embodiment described later, the “light emitting member” is a color LED module that includes a red LED, a green LED, and a blue LED and can emit light of 256 colors, but any light emitting device capable of emitting light of two or more colors may be used.

  According to the configuration of (15), the color of the light projected by the light projecting means can be changed, and a more varied illumination effect can be provided.

(16)
The image display system may further include a communication unit and a power control unit. The communication means communicates with other devices. The power control means can perform power saving control for supplying power to the communication means without supplying power to at least the display control means. At this time, when the power control unit is executing the power saving control, the light projecting control unit controls light projection by the light projecting unit in response to reception of predetermined data by the communication unit.

Here, in the embodiment described later, the “communication means” is the wireless communication module 18 and the antenna 22 for communicating with other game apparatuses and various servers connected to the network. However, the communication means is not limited to this, as long as it has a function of communicating with devices outside the image display system; it may be a device that communicates with other devices by wireless communication or by infrared communication, without going through a network.
In the embodiment described later, the “power control means” is the system LSI 11 that controls the power supply to each member in the game apparatus 3, but it suffices for it to have at least a function of controlling the power supply to the display control means and the communication means; it need not control the power supply to other members.

  According to the configuration of (16), the light projecting unit can notify that data has been received from another device, in addition to adding a visual effect to the display image. According to this, notification can be performed even when the power of the display device is off.

(17)
The image display system may further include brightness detection means for detecting ambient brightness. At this time, the light projection control unit changes the light projection by the light projection unit according to the detection result of the brightness detection unit.

  The “brightness detection means” is, for example, an illuminance sensor, but may be any sensor capable of detecting an indicator of the surrounding brightness.

  According to the configuration of (17), since the image display system can adjust the light emitted by the light projecting means according to the ambient brightness, an illumination effect of an intensity appropriate to the ambient brightness can be produced.
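  One possible adjustment under configuration (17) is sketched below. The sensor range and the choice to drive the projection harder in a brighter room are assumptions; the patent only requires that the intensity suit the ambient brightness.

    def projection_level(illuminance_lux, dark=50.0, bright=500.0):
        """Map an illuminance-sensor reading to a projection level in [0.2, 1.0].

        Below `dark` lux the projected light is dimmed so that it does not
        glare; above `bright` lux it runs at full level so that the lighting
        effect stays visible. All thresholds are illustrative assumptions.
        """
        if illuminance_lux <= dark:
            return 0.2
        if illuminance_lux >= bright:
            return 1.0
        t = (illuminance_lux - dark) / (bright - dark)
        return 0.2 + 0.8 * t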

(18)
The image display system of the present invention may include an imaging unit, a light projecting unit, a display control unit, and a light projecting control unit. The imaging means images the front of the display screen. The light projecting means projects visible light. The display control unit executes predetermined information processing based on the captured image of the imaging unit, and controls display of the image on the display screen. The light projection control unit controls light projection by the light projection unit when an image is displayed on the display screen.

  The “imaging means” is, for example, a camera that is arranged around the television and images the front of the television (described under (An example of another game system) in [Another embodiment] below), but any device that images the front of the display device (display screen) may be used.

(19)
The image display system of the present invention may include a signal transmission unit, a light projecting unit, a display control unit, and a light projection control unit. The signal transmission means transmits a predetermined signal. The light projecting means projects visible light. The display control means executes predetermined information processing based on the detection result of the predetermined signal, and controls display of an image on the display screen. The light projection control unit controls light projection by the light projection unit when an image is displayed on the display screen.

  The “signal transmission means” is, for example, a device that outputs a predetermined signal by radio waves or ultrasonic waves. It is provided so that the predetermined signal can be detected by an input device or the like and the detection result used as an input for information processing.

(20)
Further, the present invention may be provided as a lighting device that is detachably connected to a display control device that displays an image on a display device. The lighting device includes receiving means and light projecting means. The receiving means receives a control instruction from the display control device. The light projecting means projects visible light toward the rear of the display device in accordance with the control instruction received by the receiving means.

  According to the configurations of (18) to (20), an image is displayed on the display screen as in the case of (1), and light (visible light) is projected around by the light projecting unit. Therefore, a lighting effect can be added by the light projected to the surroundings in addition to the image, and a further visual effect can be given to the user in addition to the image.

(21)
In (20) above, the light projecting means may include first emitting means and second emitting means. The first emitting means emits, in different directions, a plurality of visible lights each having an elongated light cross section. The second emitting means emits visible light having a wider cross section than those visible lights, in a direction overlapping the plurality of visible lights emitted by the first emitting means.

  According to the configuration of (21), a plurality of linear lights from the first emitting unit and light from the second emitting unit are projected onto the wall surface around the projecting unit. According to this, since two types of light are projected, the number of projection patterns can be increased as compared with the case where one type of light is projected, and variations in illumination effects can be increased.

  According to the present invention, by providing light projecting means that projects visible light, light (visible light) is projected onto the surroundings in addition to the image displayed on the display screen. A lighting effect can thus be added to the image on the display screen by the projected light, and a further visual effect can be given to the user in addition to the image.

[Brief description of the drawings]
FIG. 1 is an external view of the game system 1.
FIG. 2 is a block diagram showing the connection relationship of the devices included in the game system 1.
FIG. 3 is a block diagram showing the configuration of each device of the game system 1.
FIG. 4 is a perspective view showing the external configuration of the controller 5.
FIG. 5 is a perspective view showing the external configuration of the controller 5.
FIG. 6 is a diagram showing the internal structure of the controller 5.
FIG. 7 is a diagram showing the internal structure of the controller 5.
FIG. 8 is a block diagram showing the configuration of the controller 5.
FIG. 9 is an external view of the lighting device 9.
FIG. 10 is a perspective view showing the main components of the lighting device 9.
FIG. 11 is a diagram showing the internal structure of a linear light module.
FIG. 12 is a diagram showing the light emitted from a linear light module.
FIG. 13 is a diagram showing an example of the linear light projected when the three LEDs 75 to 77 are arranged offset from the axis L.
FIG. 14 is a set of three views showing the arrangement of the optical modules.
FIG. 15 is a diagram showing the linear light and background light projected to the rear of the television 2 by the lighting device 9.
FIG. 16 is a diagram showing the main data stored in the main memory of the game apparatus 3.
FIG. 17 is a main flowchart showing the flow of processing executed in the game apparatus 3.
FIG. 18 is a flowchart showing the data reception processing of the input/output processor 11a in the sleep mode.

[Overall configuration of game system]
Hereinafter, a game system 1 that is an example of an image display system according to an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is an external view of the game system 1. FIG. 2 is a block diagram showing the connection relationship of the devices included in the game system 1. As shown in FIG. 1, the game system 1 includes a television receiver (hereinafter simply referred to as “television”) 2, a game apparatus 3, an optical disc 4, a controller 5, a marker device 6, and a lighting device 9. In the game system 1, game processing is executed in the game apparatus 3 based on game operations using the controller 5, and a game image or the like obtained as a result of the game processing is displayed on the television 2.

  An optical disc 4, which is an example of an information storage medium usable interchangeably with the game apparatus 3, is detachably inserted into the game apparatus 3. The optical disc 4 stores a game program to be executed on the game apparatus 3. An insertion slot for the optical disc 4 is provided on the front surface of the game apparatus 3. The game apparatus 3 executes game processing by reading and executing the game program stored on the optical disc 4 inserted into the insertion slot. The controller 5 is an input device that provides the game apparatus 3 with operation data indicating the content of the operations performed on it. As shown in FIG. 2, the controller 5 and the game apparatus 3 are connected by wireless communication. In the present embodiment, for example, Bluetooth (registered trademark) technology is used for the wireless communication between the controller 5 and the game apparatus 3. In other embodiments, the controller 5 and the game apparatus 3 may be connected by wire.

  As shown in FIGS. 1 and 2, a television 2 (including a speaker 2a), which is an example of a display device, is connected to the game apparatus 3 via a connection cord. The television 2 displays a game image obtained as a result of the game process executed in the game device 3. The television 2 has a speaker 2a, and the speaker 2a outputs game sound obtained as a result of the game processing.

  A marker device 6 is installed around the television 2 (above the screen in FIG. 1). As will be described in detail later, the user can perform game operations by moving the controller 5, and the marker device 6 is used by the game apparatus 3 to detect the movement of the controller 5. The marker device 6 includes two markers 6R and 6L at its two ends. The marker 6R (and likewise the marker 6L) is specifically composed of one or more infrared LEDs (Light Emitting Diodes), and outputs infrared light toward the front of the television 2. As shown in FIG. 2, the marker device 6 is connected to the game apparatus 3, and the lighting of each infrared LED included in the marker device 6 is controlled by the game apparatus 3. Although FIG. 1 shows the marker device 6 installed on the television 2, the position and orientation in which the marker device 6 is installed are arbitrary.

  In addition, a lighting device 9 is installed around the television 2 (upper side of the screen in FIG. 1). The lighting device 9 is a device that outputs visible light for the purpose of giving the user a further visual effect (lighting effect) in addition to the image displayed on the television 2. As shown in FIG. 2, the lighting device 9 is connected to the game device 3, and light emission of the lighting device 9 is controlled by the game device 3.

  The position and orientation in which the lighting device 9 is installed are arbitrary. In the present embodiment, however, it is assumed that the lighting device 9 projects visible light onto the wall surface behind the television 2 (a wall of the house, a curtain, or the like) and that the user is shown the light striking the wall surface (see FIG. 15). Therefore, the lighting device 9 is preferably installed so as to emit visible light toward the rear of the television 2. In FIG. 1, the lighting device 9 is installed on the marker device 6. In another embodiment, the lighting device 9 may be installed directly on the television 2, or may be installed on the pedestal on which the television 2 is placed, at a position behind the television 2. Alternatively, the lighting device 9 may be provided with a member that can be hooked onto the back of the television 2, and may be arranged by hooking it onto the back of the television 2.

  In another embodiment, the lighting device 9 and the marker device 6 may be configured to be attachable to and detachable from each other. In this case, it is preferable that, when the lighting device 9 is mounted on the marker device 6, the lighting device 9 emits visible light in the direction opposite to the direction in which the marker device 6 emits infrared light. In this way, infrared light can be emitted toward the front of the television 2 while visible light is emitted toward the rear of the television 2, which is effective when, as in the present embodiment, the visible light from the lighting device 9 is assumed to be projected onto the wall behind the television 2.

  In the present embodiment, the marker device 6 and the lighting device 9 are separate, but in other embodiments they may be integrated; that is, the infrared LEDs of the marker device 6 and the optical modules of the lighting device 9 may be housed in a single casing. When it is assumed, as in the present embodiment, that the visible light from the lighting device 9 is projected onto the wall surface behind the television 2, it is preferable to mount the marker device 6 and the lighting device 9 in the casing so that the lighting device 9 emits visible light in the direction opposite to the direction in which the marker device 6 emits infrared light. In this way, infrared light can be emitted toward the front of the television 2 and visible light toward the rear of the television 2.

[Internal configuration of game device 3]
Next, the internal configuration of the game apparatus 3 will be described with reference to FIG. FIG. 3 is a block diagram showing the configuration of each device of the game system 1. The game apparatus 3 includes a CPU 10, a system LSI 11, an external main memory 12, a ROM / RTC 13, a disk drive 14, an AV-IC 15 and the like.

  The CPU 10 functions as a game processor, executing game processing by executing the game program stored on the optical disc 4. The CPU 10 is connected to the system LSI 11. In addition to the CPU 10, the external main memory 12, the ROM/RTC 13, the disk drive 14, and the AV-IC 15 are connected to the system LSI 11. The system LSI 11 performs processing such as controlling data transfer between the components connected to it, generating images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 will be described later. The external main memory 12, which is volatile, stores programs such as the game program read from the optical disc 4 or a game program read from the flash memory 17, and stores various data; it is used as a work area and buffer area for the CPU 10. The ROM/RTC 13 includes a ROM (a so-called boot ROM) in which a program for starting the game apparatus 3 is incorporated, and a clock circuit (RTC: Real Time Clock) that counts time. The disk drive 14 reads program data, texture data, and the like from the optical disc 4 and writes the read data to the internal main memory 11e described later or the external main memory 12.

  Further, the system LSI 11 is provided with an input / output processor (I / O processor) 11a, a GPU (Graphics Processor Unit) 11b, a DSP (Digital Signal Processor) 11c, a VRAM 11d, and an internal main memory 11e. Although not shown, these components 11a to 11e are connected to each other by an internal bus.

  The GPU 11b forms part of a drawing unit and generates an image according to a graphics command (drawing command) from the CPU 10. The VRAM 11d stores data (data such as polygon data and texture data) necessary for the GPU 11b to execute the graphics command. When an image is generated, the GPU 11b creates image data using data stored in the VRAM 11d.

  The DSP 11c functions as an audio processor, and generates sound data using sound data and sound waveform (tone color) data stored in the internal main memory 11e and the external main memory 12.

  The image data and audio data generated as described above are read out by the AV-IC 15. The AV-IC 15 outputs the read image data to the television 2 via the AV connector 16, and outputs the read audio data to the speaker 2a built into the television 2. As a result, an image is displayed on the television 2 and sound is output from the speaker 2a.

  The input / output processor 11a performs transmission / reception of data to / from components connected to the input / output processor 11a and downloads data from an external device. The input / output processor 11a is connected to the flash memory 17, the wireless communication module 18, the wireless controller module 19, the expansion connector 20, and the memory card connector 21. An antenna 22 is connected to the wireless communication module 18, and an antenna 23 is connected to the wireless controller module 19.

  The input/output processor 11a is connected to the network via the wireless communication module 18 and the antenna 22, and can communicate with other game apparatuses and various servers connected to the network. The input/output processor 11a periodically accesses the flash memory 17 to detect the presence or absence of data that needs to be transmitted to the network; if there is such data, it transmits the data to the network via the wireless communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from other game apparatuses and data downloaded from a download server via the network, the antenna 22, and the wireless communication module 18, and stores the received data in the flash memory 17. By executing the game program, the CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. The flash memory 17 may store, in addition to data exchanged between the game apparatus 3 and other game apparatuses or various servers, save data (result data or intermediate data) of games played using the game apparatus 3.

  The input / output processor 11a receives operation data transmitted from the controller 5 via the antenna 23 and the wireless controller module 19, and stores (temporarily stores) the data in the buffer area of the internal main memory 11e or the external main memory 12.

  Further, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI; a medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector can be connected to it, and by connecting a wired communication connector, communication with the network can be performed in place of the wireless communication module 18. In the present embodiment, the expansion connector 20 is used to connect the lighting device 9 to the game apparatus 3. The memory card connector 21 is a connector for connecting an external storage medium such as a memory card. For example, the input/output processor 11a can access an external storage medium via the expansion connector 20 or the memory card connector 21 to store data in it or read data from it.

  The game apparatus 3 is provided with a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned on, power is supplied to each component of the game apparatus 3 via an AC adapter (not shown). When the reset button 25 is pressed, the system LSI 11 restarts the boot program for the game apparatus 3. The eject button 26 is connected to the disk drive 14. When the eject button 26 is pressed, the optical disk 4 is ejected from the disk drive 14.

  In the present embodiment, when the power button 24 is turned on, the system LSI 11 supplies power to each component of the game apparatus 3 via an AC adapter (not shown), entering the normal energized state (hereinafter referred to as “normal mode”). On the other hand, when the power button 24 is turned off, the system LSI 11 enters a mode in which power is supplied to only some components of the game apparatus 3 and power saving control is performed to minimize power consumption (hereinafter referred to as “sleep mode”). In the present embodiment, when the sleep mode is set, the system LSI 11 instructs that the power supply be stopped to the components other than the input/output processor 11a, the flash memory 17, the external main memory 12, the ROM/RTC 13, the wireless communication module 18, and the wireless controller module 19. The sleep mode is therefore a mode in which no application is executed by the CPU 10. However, even in the sleep mode, the game apparatus 3 can receive data from outside, and data transmitted from other game apparatuses or download servers is stored in the flash memory 17.

  Note that power is supplied to the system LSI 11 even in the sleep mode. However, in the sleep mode, the system LSI 11 stops supplying clocks to the GPU 11b, the DSP 11c, and the VRAM 11d; by not driving these components, power consumption is reduced. Although not shown, a fan for exhausting the heat of ICs such as the CPU 10 and the system LSI 11 to the outside is provided inside the housing of the game apparatus 3. This fan is also stopped in the sleep mode.

  Further, the game apparatus 3 can be switched between the normal mode and the sleep mode by remote operation, by pressing the power button of the controller 5. When switching by remote operation is not used, power need not be supplied to the wireless controller module 19 in the sleep mode. The game apparatus 3 may also be configurable, according to a user instruction, not to use the sleep mode; when the sleep mode is not used, power supply to all circuits is completely stopped when the power button 24 is turned off.
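  The division of members between the two modes described above can be summarized in a small sketch. The set representation is an assumption for illustration, not the actual control logic of the system LSI 11.

    # Members that keep their power supply in each mode, per the description above.
    POWERED = {
        "normal": {
            "CPU 10", "system LSI 11", "external main memory 12", "ROM/RTC 13",
            "disk drive 14", "AV-IC 15", "flash memory 17",
            "wireless communication module 18", "wireless controller module 19",
            "input/output processor 11a", "fan",
        },
        # The system LSI 11 itself stays powered in sleep mode, but it stops the
        # clocks of the GPU 11b, the DSP 11c, and the VRAM 11d; the fan also stops.
        "sleep": {
            "system LSI 11", "external main memory 12", "ROM/RTC 13",
            "flash memory 17", "wireless communication module 18",
            "wireless controller module 19", "input/output processor 11a",
        },
    }

    def stopped_on_sleep():
        """Members whose power supply stops when entering the sleep mode."""
        return sorted(POWERED["normal"] - POWERED["sleep"])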

  The lighting device 9 is detachably connected to the game apparatus 3 via the expansion connector 20. The lighting device 9 is a device that emits visible light for the purpose of giving the user a further visual effect in addition to the image displayed on the television 2. As shown in FIG. 3, the lighting device 9 includes a connector 27, a microcomputer 28, and the optical modules 29. The connector 27 is detachably connected to the expansion connector 20 of the game apparatus 3 via a cable (not shown). The communication method between the game apparatus 3 and the lighting device 9 may be any method; in other embodiments, communication may be performed wirelessly.

  The microcomputer 28 is connected to the connector 27, and the optical modules 29 are connected to the microcomputer 28. The optical modules 29 are a plurality of light emitting devices that emit visible light. In the present embodiment, the optical modules 29 are composed of a plurality of color LED modules capable of emitting light of a plurality of colors. The detailed configuration of the optical modules 29 will be described later. The microcomputer 28 is a circuit that controls the light emission of the optical modules 29. The game apparatus 3 transmits to the lighting device 9 data indicating the color to be emitted by each optical module 29. The microcomputer 28 acquires the data transmitted from the game apparatus 3 via the connector 27 and controls the light emission of the optical modules 29 according to the data. The lighting device 9 may be supplied with power through an AC adapter (not shown), independently of the game apparatus 3, or may be supplied with power from the game apparatus 3.
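  The control path just described, in which the game apparatus 3 sends data indicating a color for each optical module 29 and the microcomputer 28 applies it, could look like the sketch below. The packet layout (one RGB byte triple per module), the module count, and the driver callback are assumptions for illustration; the patent does not specify a data format.

    import struct

    NUM_MODULES = 8  # assumed: linear light modules 61 to 67 plus background light module 68

    def parse_color_packet(packet):
        """Unpack one (red, green, blue) byte triple per optical module."""
        if len(packet) != 3 * NUM_MODULES:
            raise ValueError("unexpected packet length")
        return [struct.unpack_from("BBB", packet, 3 * i) for i in range(NUM_MODULES)]

    def on_packet(packet, set_module_color):
        """What the microcomputer 28 does with each packet from the game apparatus 3:
        apply the received color to every optical module 29. `set_module_color`
        stands in for the actual LED driver, which the patent does not describe."""
        for index, (r, g, b) in enumerate(parse_color_packet(packet)):
            set_module_color(index, r, g, b)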

[Configuration of controller 5]
Next, the controller 5 will be described with reference to FIGS. 4 to 8. FIGS. 4 and 5 are perspective views showing the external configuration of the controller 5: FIG. 4 shows the controller 5 viewed from the upper rear side, and FIG. 5 shows the controller 5 viewed from the lower front side.

  As shown in FIGS. 4 and 5, the controller 5 has a housing 31 formed, for example, by plastic molding. The housing 31 has a substantially rectangular parallelepiped shape whose longitudinal direction is the front-rear direction (the Z-axis direction shown in FIG. 4), and as a whole is of a size that can be held in one hand by an adult or a child. The user (player) can perform game operations by pressing the buttons provided on the controller 5 and by moving the controller 5 itself to change its position and attitude.

  The housing 31 is provided with a plurality of operation buttons. As shown in FIG. 4, a cross button 32a, a first button 32b, a second button 32c, an A button 32d, a minus button 32e, a home button 32f, a plus button 32g, and a power button 32h are provided on the upper surface of the housing 31. In this specification, the upper surface of the housing 31 on which these buttons 32a to 32h are provided may be referred to as the “button surface”. On the other hand, as shown in FIG. 5, a recess is formed in the lower surface of the housing 31, and a B button 32i is provided on the rear inclined surface of the recess. A function corresponding to the game program executed by the game apparatus 3 is appropriately assigned to each of the operation buttons 32a to 32i. The power button 32h is for remotely turning the main body of the game apparatus 3 on and off. The home button 32f and the power button 32h are recessed into the upper surface of the housing 31, which prevents the player from pressing them by mistake.

  A connector 33 is provided on the rear surface of the housing 31. The connector 33 is used for connecting another device to the controller 5. Further, locking holes 33a are provided on both sides of the connector 33 on the rear surface of the housing 31 in order to prevent the other devices from being easily detached.

  A plurality of LEDs 34a to 34d (four in FIG. 4) are provided toward the rear of the upper surface of the housing 31. A controller type (number) is assigned to the controller 5 to distinguish it from other controllers. The LEDs 34a to 34d are used to notify the player of the controller type currently set for the controller 5 and of the remaining battery level of the controller 5. Specifically, when a game operation is performed using the controller 5, one of the LEDs 34a to 34d is lit according to the controller type.

  Further, the controller 5 has an imaging information calculation unit 35 (FIG. 7). As shown in FIG. 5, a light incident surface 35a of the imaging information calculation unit 35 is provided on the front surface of the housing 31. The light incident surface 35a is made of a material that transmits at least infrared light from the markers 6R and 6L.

  A sound release hole 31a is formed between the first button 32b and the home button 32f on the upper surface of the housing 31 for emitting sound from a speaker 49 (FIG. 6) built in the controller 5 to the outside.

  Next, the internal structure of the controller 5 will be described with reference to FIGS. 6 and 7, which show the internal structure of the controller 5. FIG. 6 is a perspective view showing a state in which the upper casing (a part of the housing 31) of the controller 5 has been removed. FIG. 7 is a perspective view showing a state in which the lower casing (a part of the housing 31) of the controller 5 has been removed, and shows the substrate 30 of FIG. 6 as viewed from its reverse side.

  In FIG. 6, a substrate 30 is fixed inside the housing 31, and the operation buttons 32a to 32h, the LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, a speaker 49, and the like are provided on the upper main surface of the substrate 30. These are connected to a microcomputer 42 (see FIG. 7) by wiring (not shown) formed on the substrate 30 and the like. In the present embodiment, the acceleration sensor 37 is arranged at a position offset from the center of the controller 5 in the X-axis direction. This makes it easier to calculate the movement of the controller 5 when the controller 5 is rotated about the Z-axis. The acceleration sensor 37 is also arranged forward of the center of the controller 5 in the longitudinal direction (Z-axis direction). The controller 5 functions as a wireless controller by means of the wireless module 44 (FIG. 7) and the antenna 45.

  On the other hand, in FIG. 7, an imaging information calculation unit 35 is provided at the front edge on the lower main surface of the substrate 30. The imaging information calculation unit 35 includes an infrared filter 38, a lens 39, an imaging element 40, and an image processing circuit 41 in order from the front of the controller 5. These members 38 to 41 are respectively attached to the lower main surface of the substrate 30.

  Further, the microcomputer 42 and a vibrator 48 are provided on the lower main surface of the substrate 30. The vibrator 48 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 by wiring formed on the substrate 30 and the like. The controller 5 is vibrated by the operation of the vibrator 48 according to instructions from the microcomputer 42; the vibration is transmitted to the hand of the player holding the controller 5, so that a so-called vibration-compatible game can be realized. In the present embodiment, the vibrator 48 is arranged somewhat toward the front of the housing 31. That is, since the vibrator 48 is arranged closer to an end than to the center of the controller 5, the vibration of the vibrator 48 can vibrate the entire controller 5 strongly. The connector 33 is attached to the rear edge of the lower main surface of the substrate 30. In addition to what is shown in FIGS. 6 and 7, the controller 5 includes a quartz oscillator that generates the basic clock of the microcomputer 42, an amplifier that outputs an audio signal to the speaker 49, and the like.

  Note that the shape of the controller 5, the shapes of the operation buttons, and the numbers and installation positions of the acceleration sensors and vibrators shown in FIGS. 4 to 7 are merely examples; the present invention can be realized with other shapes, numbers, and installation positions. In the present embodiment, the imaging direction of the imaging means is the Z-axis positive direction, but the imaging direction may be any direction. That is, the position of the imaging information calculation unit 35 in the controller 5 (the light incident surface 35a of the imaging information calculation unit 35) need not be on the front surface of the housing 31; it may be provided on another surface as long as light can be taken in from outside the housing 31.

  FIG. 8 is a block diagram showing the configuration of the controller 5. The controller 5 includes an operation unit 32 (operation buttons 32a to 32i), a connector 33, an imaging information calculation unit 35, a communication unit 36, and an acceleration sensor 37. The controller 5 transmits data indicating the details of the operation performed on the own device to the game apparatus 3 as operation data.

  The operation unit 32 includes the operation buttons 32a to 32i described above, and outputs operation button data indicating the input state of each of the operation buttons 32a to 32i (whether or not each button is pressed) to the microcomputer 42 of the communication unit 36.

  The imaging information calculation unit 35 is a system for analyzing the image data captured by the imaging unit, discriminating a region having a high luminance in the image data, and calculating a center of gravity position, a size, and the like of the region. Since the imaging information calculation unit 35 has a sampling period of, for example, about 200 frames / second at the maximum, it can track and analyze even a relatively fast movement of the controller 5.

  The imaging information calculation unit 35 includes the infrared filter 38, the lens 39, the image sensor 40, and the image processing circuit 41. The infrared filter 38 passes only infrared light out of the light incident from the front of the controller 5. The lens 39 collects the infrared light that has passed through the infrared filter 38 and makes it incident on the image sensor 40. The image sensor 40 is a solid-state image sensor such as a CMOS sensor or a CCD sensor; it receives the infrared light collected by the lens 39 and outputs an image signal. Here, the markers 6R and 6L of the marker device 6 arranged in the vicinity of the display screen of the television 2 are composed of infrared LEDs that output infrared light toward the front of the television 2. Therefore, by providing the infrared filter 38, the image sensor 40 generates image data from only the infrared light that has passed through the infrared filter 38, so the images of the markers 6R and 6L can be captured more accurately. Hereinafter, an image captured by the image sensor 40 is referred to as a captured image. The image data generated by the image sensor 40 is processed by the image processing circuit 41, which calculates the positions of the imaging targets (the markers 6R and 6L) in the captured image. The image processing circuit 41 outputs coordinates indicating the calculated positions to the microcomputer 42 of the communication unit 36, and the microcomputer 42 transmits the coordinate data to the game apparatus 3 as operation data. Hereinafter, these coordinates are referred to as “marker coordinates”. Since the marker coordinates change according to the orientation (tilt angle) and position of the controller 5 itself, the game apparatus 3 can calculate the orientation and position of the controller 5 using the marker coordinates.
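  A minimal sketch of the calculation the image processing circuit 41 is described as performing: find high-luminance regions in the captured image and output their center-of-gravity coordinates. The threshold, the 4-connectivity, and the minimum region size are assumptions for illustration.

    def marker_coordinates(image, threshold=200, min_pixels=4):
        """Centroids of high-luminance regions in a grayscale captured image.

        `image` is a list of rows of pixel luminances. Bright pixels are grouped
        into 4-connected regions; each sufficiently large region yields one
        (x, y) centroid, e.g. one each for the markers 6R and 6L.
        """
        h, w = len(image), len(image[0])
        seen = [[False] * w for _ in range(h)]
        centroids = []
        for y in range(h):
            for x in range(w):
                if image[y][x] >= threshold and not seen[y][x]:
                    stack, pixels = [(y, x)], []
                    seen[y][x] = True
                    while stack:
                        cy, cx = stack.pop()
                        pixels.append((cx, cy))
                        for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                            if 0 <= ny < h and 0 <= nx < w and image[ny][nx] >= threshold and not seen[ny][nx]:
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    if len(pixels) >= min_pixels:
                        centroids.append((sum(p[0] for p in pixels) / len(pixels),
                                          sum(p[1] for p in pixels) / len(pixels)))
        return centroids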

  In other embodiments, the controller 5 may not include the image processing circuit 41, and the captured image itself may be transmitted from the controller 5 to the game apparatus 3. At this time, the game apparatus 3 may have a circuit or a program having the same function as the image processing circuit 41, and may calculate the marker coordinates.

  The acceleration sensor 37 detects the acceleration of the controller 5 (including gravitational acceleration); that is, it detects the force (including gravity) applied to the controller 5. The acceleration sensor 37 detects the value of the acceleration in the linear direction along its sensing axis (linear acceleration) out of the accelerations applied to its detection unit. For example, in the case of a multi-axis acceleration sensor with two or more axes, the component of acceleration along each axis is detected as the acceleration applied to the detection unit of the sensor. For example, a three-axis or two-axis acceleration sensor of the type available from Analog Devices, Inc. or STMicroelectronics N.V. may be used. The acceleration sensor 37 is, for example, a capacitance type acceleration sensor, but another type of acceleration sensor may be used.

  In the present embodiment, the acceleration sensor 37 detects linear acceleration in each of three axis directions with reference to the controller 5: the up-down direction (the Y-axis direction shown in FIG. 4), the left-right direction (the X-axis direction shown in FIG. 4), and the front-rear direction (the Z-axis direction shown in FIG. 4). Since the acceleration sensor 37 detects acceleration in the linear direction along each axis, its output represents the linear acceleration value of each of the three axes. That is, the detected acceleration is expressed as a three-dimensional vector (ax, ay, az) in an XYZ coordinate system (controller coordinate system) set with the controller 5 as a reference. Hereinafter, the vector whose components are the acceleration values for the three axes detected by the acceleration sensor 37 is referred to as the acceleration vector.

  Data indicating the acceleration detected by the acceleration sensor 37 (acceleration data) is output to the communication unit 36. Since the acceleration detected by the acceleration sensor 37 changes according to the orientation (tilt angle) and movement of the controller 5 itself, the game apparatus 3 can calculate the orientation and movement of the controller 5 using the acceleration data. In the present embodiment, the game apparatus 3 determines the attitude (tilt angle) of the controller 5 based on the acceleration data; that is, the acceleration sensor 37 is used as a sensor that outputs data for determining the tilt angle of the controller 5.

  Those skilled in the art will readily understand from the description of this specification that further information regarding the controller 5 can be estimated or calculated (determined) by having a computer, such as a processor of the game apparatus 3 (for example, the CPU 10) or a processor of the controller 5 (for example, the microcomputer 42), perform processing based on the acceleration signal output from the acceleration sensor 37. For example, suppose the computer-side processing is executed on the assumption that the controller 5 carrying the acceleration sensor 37 is stationary (that is, on the assumption that the acceleration detected by the acceleration sensor is gravitational acceleration only). If the controller 5 is actually stationary, it can then be determined from the detected acceleration whether or not the attitude of the controller 5 is tilted with respect to the direction of gravity, and by how much. Specifically, taking as a reference the state in which the detection axis of the acceleration sensor 37 points vertically downward, whether or not the controller 5 is tilted with respect to the reference can be known from whether or not 1G (gravitational acceleration) is applied along that axis, and the degree of tilt with respect to the reference can be known from the magnitude of the detected acceleration. In the case of the multi-axis acceleration sensor 37, it is possible to know in more detail how much the controller 5 is tilted with respect to the direction of gravity by processing the acceleration signal of each axis. In this case, the processor may calculate the tilt angle of the controller 5 based on the output of the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle. Thus, by using the acceleration sensor 37 in combination with a processor, the tilt angle or attitude of the controller 5 can be determined.
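  As a concrete illustration of the static case: if a stationary controller measures only gravity, the tilt follows from the angle between the measured acceleration vector and the reference axis. The axis convention and the stationarity tolerance below are assumptions for illustration.

    import math

    ONE_G = 9.81  # gravitational acceleration, in m/s^2

    def tilt_angle(ax, ay, az, tolerance=0.05):
        """Tilt of a stationary controller, in degrees, from its acceleration vector.

        Assumes the Y axis points up in the reference attitude, so a level,
        stationary controller measures roughly (0, -ONE_G, 0). Returns None when
        the magnitude is far from 1G, i.e. the controller is probably moving.
        """
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(magnitude - ONE_G) > tolerance * ONE_G:
            return None
        cos_tilt = -ay / magnitude
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_tilt))))

    print(tilt_angle(0.0, -9.81, 0.0))  # -> 0.0  (level)
    print(tilt_angle(9.81, 0.0, 0.0))   # -> 90.0 (lying on its side)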

  On the other hand, when it is assumed that the controller 5 is in a dynamic state (a state in which the controller 5 is being moved), the acceleration sensor 37 detects acceleration corresponding to the movement of the controller 5 in addition to gravitational acceleration. Therefore, the movement direction of the controller 5 can be determined by removing the gravitational acceleration component from the detected acceleration through predetermined processing. Conversely, even when the controller 5 is assumed to be in a dynamic state, the inclination of the controller 5 with respect to the direction of gravity can be determined by removing the acceleration component corresponding to the movement of the sensor from the detected acceleration through predetermined processing. In another embodiment, the acceleration sensor 37 may include an embedded processing device or another type of dedicated processing device for performing predetermined processing on the acceleration signal detected by the built-in acceleration detection means before outputting it to the microcomputer 42. For example, if the acceleration sensor 37 is used to detect static acceleration (for example, gravitational acceleration), the embedded or dedicated processing device may convert the acceleration signal into a tilt angle (or another preferred parameter).

  In the present embodiment, a capacitance type acceleration sensor is used as the sensor that outputs a value that changes according to the movement of the controller, but another type of acceleration sensor or a gyro sensor may be used instead. However, whereas an acceleration sensor detects acceleration in a linear direction along each axis, a gyro sensor detects the angular velocity associated with rotation. In other words, when a gyro sensor is employed in place of the acceleration sensor, the nature of the detected signal differs, so the two cannot simply be interchanged. Therefore, when calculating the attitude (tilt angle) using a gyro sensor instead of the acceleration sensor, the following changes are made, for example. Specifically, the game apparatus 3 initializes the attitude value in the state at the start of detection. Then, the angular velocity data output from the gyro sensor is integrated, and the amount of change in attitude from the initialized value is calculated from the integration result. In this case, the calculated attitude is represented by an angle.
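  The gyro-based alternative just described (initialize, then integrate angular velocity) might look like the following one-axis sketch; the struct and function names are illustrative assumptions, not the patent's actual implementation.

```c
/* Hypothetical one-axis attitude tracker: integrate gyro angular
 * velocity (rad/s), sampled at a fixed interval, to accumulate the
 * change in attitude from an initialized starting angle. */
typedef struct {
    double angle;   /* current attitude, radians */
} Posture;

void posture_init(Posture *p, double initial_angle)
{
    p->angle = initial_angle;   /* attitude value at detection start */
}

void posture_update(Posture *p, double angular_velocity, double dt)
{
    /* Euler integration: attitude change = angular velocity * dt. */
    p->angle += angular_velocity * dt;
}
```

  A drift correction step would normally follow in practice, since pure integration accumulates sensor error; the patent text only describes the integration itself.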

  As described above, when the tilt angle (attitude) is calculated using the acceleration sensor, it is calculated using the acceleration vector. The calculated tilt angle can therefore be expressed as a vector, and an absolute direction can be obtained without performing initialization; in this respect, the case of using an acceleration sensor differs from the case of using a gyro sensor. In addition, since the nature of the value calculated as the tilt angle also differs, being an angle in one case and a vector in the other, a predetermined conversion must also be applied to the tilt angle data when replacing the acceleration sensor with a gyro sensor.

  The communication unit 36 includes a microcomputer 42, a memory 43, a wireless module 44, and an antenna 45. Using the memory 43 as a storage area during processing, the microcomputer 42 controls the wireless module 44, which wirelessly transmits data acquired by the microcomputer 42 to the game apparatus 3. The microcomputer 42 is also connected to the connector 33.

  Data output from the operation unit 32, the imaging information calculation unit 35, and the acceleration sensor 37 to the microcomputer 42 is temporarily stored in the memory 43. These data are transmitted to the game apparatus 3 as operation data. That is, when the transmission timing to the wireless controller module 19 of the game apparatus 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44. The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data using, for example, Bluetooth (registered trademark) technology, and radiates the resulting weak radio signal from the antenna 45. That is, the operation data is modulated by the wireless module 44 into a weak radio signal and transmitted from the controller 5. The weak radio signal is received by the wireless controller module 19 on the game apparatus 3 side, and by demodulating and decoding it, the game apparatus 3 can acquire the operation data. The CPU 10 of the game apparatus 3 then performs game processing based on the acquired operation data and the game program. Note that while the wireless transmission from the communication unit 36 to the wireless controller module 19 is performed sequentially at predetermined intervals, game processing is generally performed in units of 1/60 seconds (one frame time), so it is preferable to perform transmission at a period equal to or shorter than this time. The communication unit 36 of the controller 5 outputs the operation data to the wireless controller module 19 of the game apparatus 3 at a rate of, for example, once every 1/200 seconds.
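  As a rough illustration of this timing relationship, here is a minimal sketch; the packet layout, helper functions, and timer are all assumptions, and the actual controller firmware and Bluetooth packet format are not described at this level in the text.

```c
#include <stdint.h>

/* Hypothetical layout of one operation-data sample. */
typedef struct {
    uint16_t buttons;      /* state of operation buttons 32a-32i      */
    int16_t  marker_x[2];  /* marker coordinates (up to two)          */
    int16_t  marker_y[2];
    int16_t  accel[3];     /* acceleration vector (X, Y, Z)           */
} OperationData;

extern OperationData gather_operation_data(void); /* from memory 43        */
extern void radio_send(const OperationData *d);   /* via wireless module 44 */
extern double now_seconds(void);                  /* platform timer         */

/* Send operation data at 1/200 s, i.e. faster than the 1/60 s frame
 * time, so the game side always has a fresh sample each frame. */
void communication_loop(void)
{
    const double period = 1.0 / 200.0;
    double next = now_seconds();
    for (;;) {
        if (now_seconds() >= next) {
            OperationData d = gather_operation_data();
            radio_send(&d);      /* modulated and radiated as a weak signal */
            next += period;
        }
    }
}
```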

  By using the controller 5, the player can perform an operation of tilting the controller 5 to an arbitrary tilt angle in addition to the conventional general game operation of pressing each operation button. In addition, according to the controller 5, the player can also perform an operation of instructing an arbitrary position on the screen by the controller 5 and an operation of moving the controller 5 itself.

[Configuration of Lighting Device 9]
Next, the configuration of the illumination device 9 will be described with reference to FIGS. 9 and 10. FIG. 9 is an external view of the illumination device 9, and FIG. 10 is a perspective view showing the main internal components of the illumination device 9. As shown in FIG. 9, the illumination device 9 includes a housing 51, a cover 52, a shaft head 53, a support shaft 54, a support member 55, and eight optical modules 61 to 68. The illumination device 9 emits visible light, projecting it onto the wall surface behind the television 2 so as to give the user a further visual effect in addition to the image displayed on the television 2.

  As shown in FIG. 9, the housing 51 has an open top surface (the surface on the positive side in the y-axis direction shown in FIG. 9). Inside the housing 51, the support shaft 54, the support member 55, and the eight optical modules 61 to 68 are installed. The eight optical modules 61 to 68 (each optical module shown in FIG. 3) are attached on the support member 55. Although details will be described later, each of the optical modules 61 to 68 is a member that emits visible light. The transparent cover 52 is attached to the opening above the support member 55 and the optical modules 61 to 68, so the light emitted from each of the optical modules 61 to 68 passes through the cover 52 and exits the housing 51. In another embodiment, the lighting device 9 may be configured without the cover 52, with the emission surfaces of the optical modules 61 to 68 exposed to the outside.

  As shown in FIG. 10, the support member 55 is connected to the support shaft 54 at both of its ends (the ends in the x-axis direction shown in FIG. 10). The support shaft 54 is inserted into holes provided in the side surfaces of the housing 51, so the support member 55 is supported by the housing 51 rotatably about the x-axis. The support shaft 54 is connected to the shaft head 53 on the outside of the housing 51, so the user can rotate the support member 55 (change its inclination) by turning the shaft head 53. Because the lighting device 9 thus has a tilt mechanism composed of the support member 55 and the support shaft 54, the user can change the emission direction of each of the optical modules 61 to 68 by changing the inclination of the support member 55.

  Next, the internal structure of the optical modules will be described. Of the optical modules 61 to 68, the seven optical modules 61 to 67, which are oriented in substantially the same direction (strictly speaking, in slightly different directions), emit light whose cross section has an elongated (linear) shape ("linear light", described later). These seven optical modules 61 to 67 have the same configuration. The optical module 68, on the other hand, emits light having a wider cross section ("background light", described later) than the light from the optical modules 61 to 67. Hereinafter, to distinguish the optical modules 61 to 67 from the optical module 68, the former may be referred to as "linear optical modules" and the latter as the "background light module".

  FIG. 11 is a diagram illustrating the internal structure of a linear optical module. As shown in FIG. 11, the linear optical module 61 includes a housing 71, a color LED module 72, a condenser lens 73, and a diffusion sheet 74. Although FIG. 11 shows the internal structure of the linear optical module 61, the other linear optical modules 62 to 67 have the same internal structure.

  As shown in FIG. 11, the housing 71 has a box shape with an open top. The color LED module 72 is attached to the substrate at the bottom of the housing 71. The color LED module 72 can emit light of a plurality of colors: in the present embodiment it includes a red LED 75, a green LED 76, and a blue LED 77, and the microcomputer 28 (FIG. 2) adjusts the emission intensity of each of the LEDs 75 to 77 as appropriate (for example, in 256 levels), so that light of a desired color can be emitted. The three LEDs 75 to 77 are arranged in a row along the axis L parallel to the a-axis direction shown in FIG. 11.
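  To make the 256-level mixing concrete, here is a minimal sketch. The helper `led_set_duty` and the reuse of the reference numerals as identifiers are assumptions for illustration, not the actual interface between the microcomputer 28 and the color LED module 72.

```c
#include <stdint.h>

/* Hypothetical register-write helper; the real drive mechanism
 * (e.g., PWM duty cycle per LED) is not specified in the text. */
extern void led_set_duty(int led_id, uint8_t level);

enum { RED_LED = 75, GREEN_LED = 76, BLUE_LED = 77 };

/* Drive the three LEDs with 256-level intensities so the mixed
 * output approximates the requested color. */
void color_led_set(uint8_t r, uint8_t g, uint8_t b)
{
    led_set_duty(RED_LED,   r);
    led_set_duty(GREEN_LED, g);
    led_set_duty(BLUE_LED,  b);
}
```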

  The condenser lens 73 is installed above the color LED module 72; in FIG. 11, it is attached to the opening of the housing 71. In the present embodiment, the condenser lens 73 focuses light in only one direction, namely the c-axis direction shown in FIG. 11 (see the arrow in FIG. 11), which is the direction in which the lens has curvature. A Fresnel lens (linear Fresnel lens) is used as the condenser lens 73 in order to reduce the size of the optical module 61. In another embodiment, a convex cylindrical lens may be used as the condenser lens 73 instead of the linear Fresnel lens.

  A diffusion sheet 74 is attached on the upper side of the condenser lens 73. The diffusion sheet 74 diffuses light only in a direction perpendicular to the focusing direction of the condenser lens 73, that is, in the a-axis direction. In other embodiments, a concave cylindrical lens may be used instead of the diffusion sheet 74.

  FIG. 12 is a diagram illustrating light emitted from the linear optical module. As shown in FIG. 11, the light from the color LED module 72 is condensed in the c-axis direction by the condenser lens 73 and diffused in the a-axis direction by the diffusion sheet 74. As a result, the cross-sectional shape of the light emitted from the linear optical module becomes a long shape (linear) that is long in the a-axis direction and short in the c-axis direction, as shown in FIG. Here, the “light cross-sectional shape” refers to a shape when the light is applied to a surface perpendicular to the traveling direction of the light. Hereinafter, light having a cross-sectional shape of a longitudinal shape (linear shape) is referred to as “linear light”.

  In the present embodiment, the focusing direction of the condenser lens 73 (the arrow shown in FIG. 11) is perpendicular to the direction in which the three LEDs 75 to 77 are arranged (the direction of the axis L shown in FIG. 11). That is, the three LEDs 75 to 77 are arranged side by side along the axis L parallel to the a-axis direction, whereas the condenser lens 73 is installed so that its focusing direction is the c-axis direction perpendicular to the a-axis direction. This is so that the three lights from the LEDs 75 to 77 mix without being offset from one another, and the linear light projected on the wall surface appears clearly as a single color. If the three LEDs 75 to 77 were arranged offset with respect to the axis L, the lights emitted from the LEDs 75 to 77 through the condenser lens 73 would be offset in the c-axis direction. FIG. 13 illustrates an example of the linear light projected when the three LEDs 75 to 77 are arranged offset from the axis L. In that case, the three lights from the LEDs 75 to 77 are projected onto the wall surface offset in the c-axis direction, so that, as shown in FIG. 13, the colors of the three lights are not mixed in the region 82 at the end of the linear light in its short direction. In the present embodiment, by contrast, the LEDs 75 to 77 are arranged in the direction perpendicular to the focusing direction of the condenser lens 73 so that the positions of the projected lights coincide in the short direction, and the projected light therefore appears clean.

  In the present embodiment, the condensing lens 73 and the diffusion sheet 74 are used to project linear light onto the wall surface. Here, in other embodiments, the linear light may be generated by any method. For example, the linear light may be created using only the condensing lens 73 without using the diffusion sheet 74, or may be created using a slit instead of the condensing lens 73 and the diffusion sheet 74. Further, the linear light modules 61 to 67 may realize linear light by projecting laser light instead of light from the LED.

  The background light module 68 emits light ("background light") having a wider cross section than the light from the linear optical modules 61 to 67. In the present embodiment, the background light module 68 includes a housing and a color LED module but has no condenser lens or diffusion sheet. Its color LED module also has three LEDs of red, green, and blue, but LEDs of higher luminance and wider emission angle than those used in the linear optical modules 61 to 67 are used. In the present embodiment, the background light module 68 emits light at an intensity lower than that of the linear optical modules 61 to 67, so that the linear light is more conspicuous than the background light. Conversely, if the light intensity of the background light module 68 is made higher than that of the linear optical modules 61 to 67, the background light can be made conspicuous. Alternatively, only one of the background light and the linear light may be projected. In other embodiments, a condenser lens, or a condenser lens and a diffusion sheet, may be used in the background light module 68 as in the linear optical modules 61 to 67; in that case, the focusing direction of the condenser lens is the same as in the linear optical modules 61 to 67, and a diffusion sheet that diffuses light in both the vertical and horizontal directions is used.

  Next, the arrangement of the optical modules 61 to 68 in the illumination device 9 will be described with reference to FIG. 14. FIG. 14 is a three-view drawing showing the arrangement of the optical modules: FIG. 14A shows the optical modules 61 to 68 viewed from the y-axis positive side (above), FIG. 14B shows them viewed from the z-axis negative side (the front), and FIG. 14C shows them viewed from the x-axis negative side. Note that, to make the description easier to follow, FIG. 14 shows a state in which the emission direction of the optical module 64 is vertically upward; in reality, each of the optical modules 61 to 68 is disposed so that light is emitted obliquely upward and rearward from the illumination device 9.

  The linear optical modules 61 to 67 emit their respective linear lights obliquely upward and rearward from the illumination device 9 so that the lights are radial, centered on the illumination device 9. Specifically, as shown in FIGS. 14A and 14B, the linear optical modules 61 to 67 are arranged side by side in the x-axis direction (left-right direction), symmetrically about the yz plane through the center. Further, when viewed from above (the y-axis positive side) as in FIG. 14A, each of the linear optical modules 61 to 67 is arranged so that the longitudinal direction of its emission surface (the longitudinal direction of the linear light) is oriented substantially radially about a predetermined position behind the illumination device 9; that is, viewed from above, the further outward a module is placed, the more its rear end (the end on the z-axis positive side) is inclined inward. Furthermore, when viewed from the front as in FIG. 14B, each of the linear optical modules 61 to 67 is arranged so that its emission direction is oriented substantially radially about a predetermined position below the illumination device 9; that is, viewed from the front, the further outward a module is placed, the more its emission direction is directed outward.

  The background light module 68 emits the background light in a direction overlapping the linear lights emitted from the linear optical modules 61 to 67. Specifically, as illustrated in FIG. 14A, the background light module 68 is disposed approximately at the center of the linear optical modules 61 to 67 with respect to the x-axis direction (left-right direction), and behind them in the z-axis direction (front-rear direction). Furthermore, as illustrated in FIG. 14C, the background light module 68 is inclined so that its emission direction points slightly further downward than that of the linear optical module 64. In other embodiments, the emission direction of the background light module 68 may point further upward than that of the linear optical module 64, or may be substantially the same.

  FIG. 15 is a diagram showing the linear light and background light projected onto the surface behind the television 2 by the lighting device 9. In FIG. 15, the regions 91 to 97 are the regions onto which light from the linear optical modules 61 to 67, respectively, is projected, and the region 98 is the region onto which light from the background light module 68 is projected. As shown in FIG. 15, in the present embodiment light is projected behind the television 2, so that, viewed from the front of the television 2, the light appears to decorate the surroundings of the display screen, and an effective lighting effect (decorative effect) can be imparted to the screen. If, instead, the light from the lighting device 9 were emitted toward the user (toward the front), the user would merely see the lighting device 9 shining at a single point, and the decorative effect on the display screen would be small. In the present embodiment, by contrast, the light of the illumination device 9 is emitted rearward and projected onto the wall surface, so that the user sees the light striking the wall. The user can therefore be shown light over a range wider than the display screen of the television 2, and a high decorative effect can be obtained.

  By arranging the linear optical modules 61 to 67 as shown in FIG. 14, radial linear light can be projected as shown in FIG. 15. Since the illumination device 9 projects a plurality of linear lights, the number of possible projection patterns, and thus the variety of lighting effects, can be increased compared with the case where only a single light is projected.

  In general, the surface behind the television 2 is not always a flat wall; it may be uneven, as when the television 2 is placed in a corner of the room so that the walls meet at a concave angle, or when a curtain hangs behind it. When the surface is uneven in this way, the projected light may be distorted by the unevenness. For example, when horizontally long light (light whose cross section is long in the horizontal direction) is projected onto an uneven surface, the projected light does not appear as a single line. Likewise, even if a plurality of lights are emitted so as to line up horizontally, they will not appear lined up horizontally when the projection surface is uneven. In the present embodiment, by contrast, the illumination device 9 projects the plurality of linear lights radially, so distortion of the linear lights due to unevenness of the projection surface behind the television 2 is inconspicuous, and similar radial linear light can be projected even onto different wall surfaces. For example, even when the plurality of linear lights are emitted toward the corner of a room, only the angle at which the radial lines fan out changes compared with projection onto a flat surface, while the lines themselves are still projected straight, so there is little sense of incongruity. Further, since the surface of a curtain or the like is generally uneven mainly in the horizontal direction and much less so in the vertical direction, the linear light can likewise be projected while remaining close to a straight line in appearance.

  In the present embodiment, arranging the background light module 68 as shown in FIG. 14 allows both linear light and background light to be projected onto the wall surface, as shown in FIG. 15. The illumination device 9 can therefore control not only the color of the linear light but also the color of its background, increasing the number of possible lighting patterns, and thus the variety of lighting effects, compared with controlling the linear light alone. In the present embodiment, the light intensity of the background light module 68 is set lower than that of the linear optical modules 61 to 67, so a region onto which both background light and linear light are projected appears in the color of the linear light. That is, on the wall surface the linear light appears to be superimposed on the background light.

  Note that the arrangement of the optical modules 61 to 68 illustrated in FIG. 14 is an example; the linear optical modules 61 to 67 may be arranged in any manner as long as they project linear light radially, approximately centered on the illumination device 9, onto the wall behind the television 2. In other embodiments, the linear optical modules 61 to 67 may, for example, be arranged so that the linear lights are projected in parallel, or so that they trace a predetermined shape. The background light module 68, for its part, may be arranged in any manner as long as it emits light in a direction overlapping the light of the linear optical modules 61 to 67; in another embodiment, it may be arranged to emit light in a direction overlapping the light of only some of the linear optical modules 61 to 67.

[Processing of Game Device 3 Regarding Control of Lighting Device 9]
Next, the processing performed in the game apparatus 3 will be described, focusing on the processing for controlling the illumination device 9. In the present embodiment, when executing game processing according to the game program stored on the optical disc 4, the game apparatus 3 displays a game image on the television 2 and adds a lighting effect using the lighting device 9. The control of the lighting device 9 during execution of the game processing is described below.

  First, main data used in the processing in the game apparatus 3 will be described with reference to FIG. 16. FIG. 16 is a diagram illustrating main data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game apparatus 3. As shown in FIG. 16, a game program 101, operation data 102, and processing data 106 are stored in the main memory of the game apparatus 3. In addition to the data shown in FIG. 16, the main memory stores other data necessary for the game processing, such as image data of various objects appearing in the game and data indicating various parameters of those objects.

  Part or all of the game program 101 is read from the optical disc 4 and stored in the main memory at an appropriate timing after the game apparatus 3 enters the normal mode described above. The game program 101 includes a program for executing game processing and a program for controlling light emission of the lighting device 9 in accordance with the game processing (step S5 described later).

  The operation data 102 is operation data transmitted from the controller 5 to the game apparatus 3. As described above, since the operation data is transmitted from the controller 5 to the game apparatus 3 at a rate of once every 1/200 second, the operation data 102 stored in the main memory is updated at this rate. The operation data 102 includes operation button data 103, marker coordinate data 104, and acceleration data 105.

  The operation button data 103 is data indicating an input state for each of the operation buttons 32a to 32i. That is, the operation button data 103 indicates whether or not each of the operation buttons 32a to 32i is pressed.

  The marker coordinate data 104 is data indicating coordinates calculated by the image processing circuit 41 of the imaging information calculation unit 35, that is, the marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system for representing a position on a plane corresponding to the captured image. In addition, when the two markers 6R and 6L (infrared light) are imaged by the imaging element 40, two marker coordinates are calculated. On the other hand, if either one of the markers 6R and 6L is not located within the imageable range of the image sensor 40, only one marker is imaged by the image sensor 40, and only one marker coordinate is calculated. Further, when both of the markers 6R and 6L are not located within the image capturing range of the image sensor 40, the marker is not imaged by the image sensor 40, and the marker coordinates are not calculated. Accordingly, the marker coordinate data 104 may indicate two marker coordinates, may indicate one marker coordinate, or may indicate that there is no marker coordinate.
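  Since the marker coordinate data can thus hold two, one, or zero coordinates, a receiving routine has to branch on the count. A purely illustrative representation is sketched below; the actual data layout is not given in the text, and the struct and field names are assumptions.

```c
/* Hypothetical representation of the marker coordinate data 104: the
 * image sensor 40 may yield two, one, or zero marker coordinates
 * depending on how many of the markers 6R and 6L are in view. */
typedef struct {
    int count;        /* 0, 1, or 2 valid entries */
    struct {
        int x, y;     /* position in the captured-image coordinate system */
    } marker[2];
} MarkerCoordData;
```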

  The acceleration data 105 is data indicating the acceleration (acceleration vector) detected by the acceleration sensor 37. Here, the acceleration data 105 indicates a three-dimensional acceleration vector whose components are accelerations in the directions of the three axes of XYZ shown in FIG.

  The processing data 106 is data used in the game processing (FIG. 17) described later. The processing data 106 includes game data 107 and light emission control data 108. The game data 107 is game parameter data that affects the control of the lighting device 9; for example, it may be data indicating parameters of characters appearing in the game, or data indicating the position of a cursor displayed on the screen.

  The light emission control data 108 is data for controlling the light emission of the lighting device 9. In the present embodiment, the light emission control data 108 indicates the color and intensity of light that each optical module 29 of the illumination device 9 should emit. Specifically, for example, the emission intensity of each of red, green, and blue LEDs is shown in 256 levels. Although details will be described later, the light emission control data 108 is transmitted to the lighting device 9 and acquired by the microcomputer 28, and the microcomputer 28 controls each optical module 29 according to the light emission control data 108.
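  For concreteness, the light emission control data might be laid out as below. This is only a sketch built from the "256 levels per red, green, and blue LED per module" description; the struct name, field order, and module count constant are assumptions.

```c
#include <stdint.h>

#define NUM_OPTICAL_MODULES 8  /* linear modules 61-67 plus background module 68 */

/* Hypothetical layout of the light emission control data 108:
 * one 256-level (uint8_t) intensity per red/green/blue LED
 * for each optical module of the illumination device 9. */
typedef struct {
    uint8_t r[NUM_OPTICAL_MODULES];
    uint8_t g[NUM_OPTICAL_MODULES];
    uint8_t b[NUM_OPTICAL_MODULES];
} EmissionControlData;
```

  A fixed-size structure like this would make each transmission over the expansion connector 20 a constant-length message, which fits the per-frame transmission schedule described later.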

  Next, details of the processing performed in the game apparatus 3 will be described with reference to FIG. 17. FIG. 17 is a flowchart showing the flow of processing executed in the game apparatus 3. When an instruction to start a game is input to the game apparatus 3 by the user in the normal mode, the CPU 10 of the game apparatus 3 executes a startup program stored in a boot ROM (not shown), whereby each unit is initialized. The game program stored on the optical disc 4 is then read into the main memory, and the CPU 10 starts executing it. The flowchart shown in FIG. 17 shows the processing performed after these steps are completed.

  First, in step S1, the CPU 10 executes an initialization process relating to the game. In this initialization process, the values of various parameters used in the game process are initialized, a virtual game space is constructed, and player objects and other objects are placed at initial positions in the game space. After step S1, the processing loop of steps S2 to S6 is repeatedly executed while the game is executed. Note that one processing loop is executed once per frame time (for example, 1/60 seconds).
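  The loop structure of steps S1 to S6 can be summarized in C as follows; the function names below are illustrative stand-ins for the steps of FIG. 17, not APIs from the patent.

```c
/* Hypothetical skeleton of the processing loop of FIG. 17. */
extern void initialize_game(void);          /* S1: parameters, game space */
extern void acquire_operation_data(void);   /* S2 */
extern void run_game_process(void);         /* S3 */
extern void draw_game_image(void);          /* S4: image and sound output */
extern void control_lighting_device(void);  /* S5 */
extern int  game_should_end(void);          /* S6 */
extern void wait_for_next_frame(void);      /* one frame = 1/60 s */

void game_main(void)
{
    initialize_game();
    do {
        acquire_operation_data();
        run_game_process();
        draw_game_image();
        control_lighting_device();
        wait_for_next_frame();  /* one loop iteration per frame time */
    } while (!game_should_end());
}
```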

  In step S2, the CPU 10 acquires operation data. That is, operation data transmitted from the controller 5 is received via the wireless controller module 19. Then, the operation button data, marker coordinate data, and acceleration data included in the received operation data are stored in the main memory. Following step S2, the process of step S3 is executed.

  In step S3, the CPU 10 executes game processing based on the operation data acquired in step S2. Specifically, for example, the CPU 10 executes a process of controlling the action of a game character based on the operation data, or a process of calculating the position of the cursor displayed on the screen based on the operation data (in particular, for example, the marker coordinate data 104). Game data 107 obtained as a result of the game processing is stored in the main memory. Following step S3, the process of step S4 is executed.

  In step S4, the CPU 10 displays a game image corresponding to the game process executed in step S3 on the screen of the television 2. Specifically, the CPU 10 (and the GPU 11b) reads the game data 107 from the main memory, generates a game image based on the game data 107 and the like, and displays it on the screen. The game image may be, for example, an image of a game space including a game character, or may include a cursor image superimposed on the image. In step S4, the CPU 10 (and the DSP 11c) generates a game sound corresponding to the game process based on the game data 107 and the like, and outputs it from the speaker 2a. Note that the game sound may be BGM during the game, a sound effect of the game, a voice of a game character, or the like. Following step S4, the process of step S5 is executed.

  In step S5, the CPU 10 controls light emission by the lighting device 9. That is, the CPU 10 reads the game data 107 from the main memory and generates, based on it, light emission control data 108 indicating the color and intensity of light that each optical module 29 should emit. The generated light emission control data 108 is stored in the main memory and transmitted to the lighting device 9 via the expansion connector 20 by the input / output processor 11a. The microcomputer 28 of the lighting device 9, having received the light emission control data 108, controls the light emission of each optical module 29 accordingly. The time interval at which the light emission control data 108 is transmitted to the lighting device 9 may be set arbitrarily, but in the present embodiment it is the same as the game image update interval (1/60 seconds). By repeatedly transmitting the light emission control data 108 in this way, the game apparatus 3 can change the light projection by the lighting device 9 according to, for example, the game situation, the game image, the game sound, or the game operation, and can thereby give the user a lighting effect linked to the game. Following step S5, the process of step S6 is executed.

In step S5, the CPU 10 determines the light emission state of the illumination device 9 according to the game processing in step S3; for example, it determines the light emission state based on the game situation, the game image, the game sound, the game operation, and the like. Specific examples of the control of the lighting device 9 include the following.
(1) Control example according to the game image (game situation): The CPU 10 changes the light emission state of the lighting device 9 (light intensity, color, light emission pattern, etc.) according to changes in the game image (game situation) on the screen. For example, in a fighting game the light emission state of the lighting device 9 may be changed in response to a character's attack hitting another character, and in a shooting game it may be changed in response to a bullet hitting the target. Adding a lighting effect in step with changes in the game image in this way yields an effective overall presentation.
(2) Control example according to the game operation: The CPU 10 changes the light emission state of the lighting device 9 according to the user's game operation. Specifically, the CPU 10 may cause each of the optical modules 61 to 68 to emit light, or change the color of the light, in response to the user pressing an operation button on the controller 5.
Further, when the position of a cursor on the screen is controlled according to the operation of the controller 5, the CPU 10 may change the light emission state of each of the optical modules 61 to 68 according to the position of the cursor on the screen. For example, the screen may be divided horizontally into seven regions corresponding one-to-one to the linear optical modules 61 to 67, and only the linear optical module corresponding to the region containing the cursor position may be made to emit light (see the sketch after this list).
(3) Control example according to the game sound: The CPU 10 changes the light emission state of the lighting device 9 according to music such as the BGM played during the game. Specifically, the optical modules 61 to 68 may be blinked in time with the rhythm of the music, or the light emission pattern and/or color may be changed according to the pitch of the output sound. Further, in a performance game in which music is played according to the user's game operations, the lighting device 9 may be controlled in accordance with the music being played. The user can then enjoy the music produced by his or her own operations not only aurally but also visually.
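Here is the sketch referenced in example (2) above: a minimal, illustrative mapping from cursor position to the seven linear optical modules. The screen width and the helper `module_set_intensity` are assumptions, not interfaces from the patent.

```c
/* Hypothetical sketch: divide the screen into seven horizontal
 * regions, one per linear optical module 61-67, and light only the
 * module whose region contains the cursor. */
#define SCREEN_WIDTH       640   /* assumed resolution */
#define NUM_LINEAR_MODULES 7

extern void module_set_intensity(int module_index, int level /* 0-255 */);

void light_module_under_cursor(int cursor_x)
{
    int region = cursor_x * NUM_LINEAR_MODULES / SCREEN_WIDTH;
    if (region < 0) region = 0;
    if (region >= NUM_LINEAR_MODULES) region = NUM_LINEAR_MODULES - 1;

    for (int m = 0; m < NUM_LINEAR_MODULES; m++)
        module_set_intensity(m, m == region ? 255 : 0);
}
```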

  In step S6, the CPU 10 determines whether or not to end the game. The determination in step S6 is made based on, for example, whether the game has been cleared, whether the game is over, and whether the user has given an instruction to stop the game. If the determination result of step S6 is negative, the process of step S2 is executed again, and thereafter the processing loop of steps S2 to S6 is repeated until it is determined in step S6 that the game should end. If the determination result of step S6 is affirmative, the CPU 10 ends the game processing shown in FIG. 17. This concludes the description of the game processing.

  As described above, according to the present embodiment, a game image is displayed on the television 2, and a lighting effect is added to the wall surface behind the television 2 by the light projected by the lighting device 9. Since a further visual effect can thus be given to the user in addition to the game image, the impact and sense of presence of the game can be enhanced.

[Other embodiments]
The above-described embodiment is an example for carrying out the present invention. In other embodiments, the present invention can be implemented with, for example, the configuration described below.

(Modification in which the lighting device 9 is driven according to an image other than the game image)
In the above-described embodiment, the case where the lighting device 9 adds a lighting effect while a game image is displayed on the screen of the television 2 has been described. However, the image displayed on the screen of the television 2 is not limited to a game image; it may be a moving image or a still image other than a game image. For example, the game apparatus 3 may acquire an image from another external game apparatus or a server via a network and add the lighting effect of the lighting device 9 when displaying that image on the television 2. In this case, the game apparatus 3 may acquire the data for controlling the light emission of the lighting device 9 (the light emission control data 108) together with the image, or may acquire it from a device or server different from the source of the image. Further, the game apparatus 3 may automatically create the light emission control data from the acquired image according to a predetermined algorithm.

(Modification in which the lighting device 9 emits light for notification to the user)
In addition to adding a further visual effect to the image displayed on the television 2, the lighting device 9 may be used for notifying the user. In the present embodiment, the game apparatus 3 can operate in the sleep mode described above. Since the CPU 10 does not operate in the sleep mode, game processing and game image display are not executed, but communication with external devices (other game apparatuses or server devices) is executed via the network by the input / output processor 11a. For example, in the sleep mode the game apparatus 3 exchanges messages created by the user (in an e-mail format) with other game apparatuses, and receives game programs and video data from the server device. In this modified example, when the game apparatus 3 receives a message from another game apparatus in the sleep mode, it causes the lighting device 9 to emit light to notify the user of the reception. Details are described below.

  FIG. 18 is a flowchart showing the data reception processing of the input / output processor 11a in the sleep mode. In the sleep mode, the input / output processor 11a executes the processing shown in FIG. 18 at predetermined times (for example, at fixed intervals). Although FIG. 18 shows processing in the sleep mode, the input / output processor 11a communicates with external devices in the normal mode as well.

  In step S11, the input / output processor 11a accesses a mail server that stores and manages messages, and checks whether there is a message addressed to its own device (the game apparatus 3). In the subsequent step S12, the input / output processor 11a determines, from the result of the check in step S11, whether there is a received message. If there is, the process of step S13 is executed; if not, the input / output processor 11a ends the data reception processing shown in FIG. 18.

  In step S13, the input / output processor 11a receives a message from the mail server. In other words, the input / output processor 11a accesses the mail server and stores the received message in the flash memory 17. Following step S13, the process of step S14 is executed.

  In step S14, the input / output processor 11a starts light emission by the lighting device 9. Specifically, the input / output processor 11a creates the light emission control data and transmits it to the lighting device 9 via the expansion connector 20. The microcomputer 28 of the lighting device 9, having received the light emission control data, controls the light emission of each optical module 29 accordingly. Thereafter, the input / output processor 11a keeps the lighting device 9 emitting light by repeatedly transmitting the light emission control data. The user can therefore tell from the light emission of the lighting device 9 that a message has arrived. After step S14, the input / output processor ends the data reception processing shown in FIG. 18.
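  Steps S11 to S14 can be summarized as follows. This is an illustrative sketch only; the mail-server helper functions are assumptions rather than actual system APIs.

```c
#include <stdbool.h>

/* Hypothetical sketch of the data reception process of FIG. 18. */
extern bool mail_server_has_message(void);      /* S11/S12: check for mail  */
extern void mail_server_fetch_to_flash(void);   /* S13: store in flash 17   */
extern void lighting_start_notification(void);  /* S14: begin light emission */

void data_reception_process(void)
{
    if (!mail_server_has_message())   /* no received message: done */
        return;
    mail_server_fetch_to_flash();     /* receive and store the message */
    lighting_start_notification();    /* notify the user by light */
}
```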

  In step S14, the input / output processor 11a may change the light emission state (light intensity, color, light emission pattern, etc.) of the illumination device 9 in accordance with the data reception state. For example, the light emission state of the lighting device 9 may be changed according to the number of received messages, the source of the message, and the type of received data (whether it is a message or data related to a game). According to this, the user can know information of received data, such as what kind of data is received, from the light emission state of the lighting device 9.

  As described above, according to this modified example, the game apparatus 3 can notify the user of message reception by causing the lighting device 9 to emit light in response to receiving a message. Although in the above modification the game apparatus 3 causes the lighting device 9 to emit light in response to receiving a message, in another embodiment the lighting device 9 may emit light in response to the reception of other data (for example, a game program or video data transmitted from the server), in addition to or instead of message data.

  In the above modification, the game apparatus 3 performs the light emission operation of the lighting device 9 in response to receiving data only in the sleep mode; in other embodiments, however, the light emission operation may also be executed in the normal mode.

(Modification in which control of lighting device 9 is changed according to brightness)
In other embodiments, the game system 1 may change the light emission state of the lighting device 9 in accordance with the ambient brightness. Specifically, the game system 1 may include, in addition to the configuration illustrated in FIG. 1, a sensor that detects ambient brightness (for example, an illuminance sensor), and change the light emission state of the lighting device 9 according to the detection result of the sensor. The sensor is preferably configured separately from the game apparatus 3, able to communicate with the game apparatus 3 by wire or wirelessly, and installed around the television 2. For example, when the detection result of the sensor indicates that the surroundings of the game system 1 are relatively bright, the game apparatus 3 may control the light emission state of the lighting device 9 to be relatively bright, and when the surroundings are relatively dark, control it to be relatively dark. In this way, the game system 1 can prevent the light projected on the wall surface by the lighting device 9 from being hard to see when the surroundings are bright, and from being too bright when the surroundings are dark.
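As one way to realize this, the emission intensity could be scaled with the measured illuminance. The mapping below is a rough illustrative sketch; the sensor-read and gain helpers, threshold, and scaling are assumptions, not anything specified in the text.

```c
#include <stdint.h>

/* Hypothetical brightness-adaptive control: scale the overall emission
 * intensity with the illuminance reported by an ambient light sensor. */
extern int  read_illuminance_lux(void);                /* external sensor    */
extern void lighting_set_master_level(uint8_t level);  /* 0-255 overall gain */

void adapt_lighting_to_room(void)
{
    int lux = read_illuminance_lux();
    /* Brighter room -> brighter projection; dark room -> dimmer, so the
     * projected light is neither washed out nor glaring. */
    int level = 64 + lux / 4;   /* simple linear mapping, clamped below */
    if (level > 255) level = 255;
    lighting_set_master_level((uint8_t)level);
}
```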

(Examples of other game systems)
In the above embodiment, the game system 1 includes the marker device 6 as an accessory device installed around the television 2. In other embodiments, however, the accessory device may be, for example, a camera that images the area in front of the television 2. In that case, the game apparatus 3 performs predetermined information processing using the image captured by the camera. Examples of such predetermined information processing include a process of controlling the game according to the captured user or the position of an input device, a process of processing and displaying the captured image, and a game process that uses part of the captured image (for example, the part showing the user's face) as a game image.

  Further, the accessory device may be a device that outputs a predetermined signal by radio waves or ultrasonic waves, instead of the marker device 6 that outputs infrared light. In that case, the controller 5 as the input device detects the predetermined signal, and the game apparatus 3 executes predetermined information processing based on the detection result; for example, the game apparatus 3 calculates the position of the controller 5 or the like from the detection result and performs game processing using the calculated position as a game input. Conversely, the input device may emit the predetermined signal, and the accessory device may be provided with a sensor that detects it.

  In other embodiments, the lighting device 9 may further include an audio output device such as a speaker. In this case, in addition to the light emission control data, the game apparatus 3 further transmits data instructing a sound to be output by the audio output device. The data may indicate the same sound as the sound (game sound) output from the speaker 2a of the television 2, or may indicate a different sound. The microcomputer 28 of the illuminating device 9 controls the audio output device according to the received data to output sound. According to this, in addition to the visual effect by the illuminating device 9, an auditory effect can be given to the user.

  As described above, the present invention can be used in, for example, a game system that displays a game image for the purpose of giving the user a further visual effect in addition to the image displayed on the screen.

DESCRIPTION OF SYMBOLS 1 Game system 2 Television 2a Speaker 3 Game apparatus 4 Optical disk 5 Controller 6 Marker apparatus 9 Illumination apparatus 10 CPU
11a Input / output processor 28 Microcomputer 29 Each optical module 40 Imaging device 61-67 Linear optical module 68 Background light module 71 Case 72 Color LED module 73 Condensing lens 74 Diffusion sheet 75-77 LED

Claims (20)

  1. An image display system for displaying an image on a display screen,
    A light emitting means for emitting infrared light;
    A light projecting means for projecting visible light;
    Display control means for executing predetermined information processing based on the detection result of the infrared light and controlling display of an image on the display screen;
    Projection control means for controlling the projection by the projection means;
    A first housing having the light emitting means therein;
    A second housing that is detachable from the first housing and has the light projecting means therein;
    wherein the second housing is mounted on the first housing such that the light projecting means emits visible light in a direction opposite to the direction of the infrared light emitted by the light emitting means.
  2. An image display system for displaying an image on a display screen,
    A light emitting means for emitting infrared light;
    A light projecting means for projecting visible light;
    Display control means for executing predetermined information processing based on the detection result of the infrared light and controlling display of an image on the display screen;
    Projection control means for controlling the projection by the projection means;
    A housing having the light emitting means and the light projecting means therein;
    wherein the light projecting means is arranged in the housing so as to emit visible light in a direction opposite to the direction in which the light emitting means emits infrared light.
  3. An input device including an imaging unit capable of detecting infrared light;
    The image display system according to claim 1, wherein the display control unit executes the predetermined information processing based on a position of the infrared light captured by the imaging unit.
  4. The display control means displays a game image obtained as a result of executing game processing as the predetermined information processing on the display screen,
    The image display system according to claim 1, wherein the light projection control unit controls light projection by the light projection unit according to the game process.
  5. The image display system according to claim 1, wherein the light projection control unit changes the light projection by the light projection unit according to an image on the display screen.
  6. It further comprises operation accepting means for accepting a user operation,
    The image display system according to claim 1, wherein the light projection control unit changes the light projection by the light projection unit in accordance with a user operation received by the operation reception unit.
  7. Voice output means for outputting sound based on the predetermined information processing;
    The image display system according to claim 1, wherein the light projection control unit changes the light projection by the light projection unit according to the sound output by the sound output unit.
  8. The light emitting means is installed to emit infrared light toward the front of the display screen,
    The image display system according to claim 1, wherein the light projecting unit is installed so as to emit visible light toward the rear of the display screen.
  9. The image display system according to any one of claims 1 to 8, wherein the light projecting unit emits a plurality of visible lights each having an elongated light cross section.
  10. The light projecting means includes:
    First emitting means for emitting a plurality of visible lights in different directions; and
    Second emitting means for emitting, in a direction overlapping the plurality of visible lights emitted by the first emitting means, visible light having a wider cross section than those visible lights. The image display system according to claim 1.
  11. The light projecting means has a light emitting member capable of emitting light of a plurality of types of colors,
    The image display system according to claim 1, wherein the light projection control unit controls at least a color to be emitted by the light emitting member.
  12. Communication means for communicating with other devices;
    A power control means capable of power saving control for supplying power to the communication means without supplying power to at least the display control means,
    The light projection control unit controls light projection by the light projection unit in response to reception of predetermined data by the communication unit when the power control unit is executing the power saving control. The image display system according to any one of claims 1 to 11.
  13. It further includes a brightness detection means for detecting ambient brightness,
    The image display system according to any one of claims 1 to 12, wherein the light projection control unit changes the light projection by the light projection unit according to a detection result of the brightness detection unit.
  14. A game system for displaying an image on a display screen,
    Imaging means for imaging a user in front of the display screen;
    A light projecting means for projecting visible light to at least the back of the display screen;
    Display control means for executing predetermined game processing based on a user image captured by the imaging means, and controlling display of an image on the display screen based on the predetermined game processing;
    A game system comprising: a light projecting control unit configured to control light projection by the light projecting unit at least rearward of the display screen in accordance with an image based on the game process displayed on the display screen.
  15. A game system for displaying an image on a display screen,
    Signal transmitting means for transmitting a predetermined signal by radio waves or ultrasonic waves to a user in front of the display screen;
    A light projecting means for projecting visible light to at least the back of the display screen;
    Display control means for executing predetermined game processing based on the detection result of the predetermined signal, and controlling display of an image on the display screen based on the predetermined game processing;
    A game system comprising: a light projecting control unit configured to control light projection by the light projecting unit at least rearward of the display screen in accordance with an image based on the game process displayed on the display screen.
  16. A lighting device connected to a display control device for displaying an image on a display device,
    Receiving means for receiving a control instruction from the display control device;
    A light projecting means for projecting visible light in accordance with a control instruction received by the receiving means;
    A light emitting means for emitting infrared light;
    A first housing having the light emitting means therein;
    A second housing that is detachable from the first housing and has the light projecting means therein;
    wherein the second housing is mounted on the first housing so that the light projecting means emits visible light in a direction opposite to the direction of the infrared light emitted by the light emitting means.
  17. The light projecting means includes:
    First emitting means for emitting, in mutually different directions, a plurality of visible lights each having an elongated cross section; and
    Second emitting means for emitting, in a direction overlapping the plurality of visible lights emitted by the first emitting means, visible light having a wider cross section than those visible lights. The lighting device according to claim 16.
  18. A lighting device connected to a display control device for displaying an image on a display device,
    Receiving means for receiving a control instruction from the display control device;
    A light projecting means for projecting visible light in accordance with a control instruction received by the receiving means;
    A light emitting means for emitting infrared light;
    A housing having the light emitting means and the light projecting means therein;
    wherein the light projecting means is disposed in the housing so as to emit visible light in a direction opposite to the direction in which the light emitting means emits infrared light.
  19. A game device for displaying an image on a display screen,
    Display control means for executing a predetermined game process based on a captured image of an imaging means for capturing a user in front of the display screen, and controlling display of an image on the display screen based on the predetermined game process;
    A game apparatus comprising: a light projecting control unit that controls light projection by a light projecting unit that projects visible light at least rearward of the display screen in accordance with an image based on the game process displayed on the display screen.
  20. A game device for displaying an image on a display screen,
    Display control means for executing predetermined game processing based on a detection result of a predetermined signal transmitted by a signal transmitting means by radio waves or ultrasonic waves toward a user in front of the display screen, and controlling display of an image on the display screen based on the predetermined game processing;
    A game apparatus comprising: a light projecting control unit that controls light projection by a light projecting unit that projects visible light at least rearward of the display screen in accordance with an image based on the game process displayed on the display screen. .
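Finally, for claim 20's radio or ultrasonic signal, one concrete (and again hypothetical) detection result is the user's distance estimated from an ultrasonic echo's round-trip time, which the game processing could then consume.

```python
# Hypothetical ultrasonic ranging; the numbers are illustrative only.
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees C

def distance_from_echo(round_trip_s):
    """Convert a measured echo round-trip time (seconds) into the one-way
    distance (meters) between the transmitter and the user."""
    return round_trip_s * SPEED_OF_SOUND_M_PER_S / 2.0

# An echo arriving 11.7 ms after transmission puts the user about 2 m away.
print(round(distance_from_echo(0.0117), 2))  # 2.01
```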
JP2009209319A 2009-09-10 2009-09-10 Image display system and lighting device Active JP5622372B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009209319A JP5622372B2 (en) 2009-09-10 2009-09-10 Image display system and lighting device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2009209319A JP5622372B2 (en) 2009-09-10 2009-09-10 Image display system and lighting device
US12/877,547 US8602891B2 (en) 2009-09-10 2010-09-08 Image display system and illumination device
US12/877,651 US8777741B2 (en) 2009-09-10 2010-09-08 Illumination device
US12/877,478 US8647198B2 (en) 2009-09-10 2010-09-08 Image display system, illumination system, information processing device, and storage medium having control program stored therein

Publications (2)

Publication Number Publication Date
JP2011056061A (en) 2011-03-24
JP5622372B2 (en) 2014-11-12

Family

Family ID: 43944413

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009209319A Active JP5622372B2 (en) 2009-09-10 2009-09-10 Image display system and lighting device

Country Status (1)

Country Link
JP (1) JP5622372B2 (en)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4399087B2 * 2000-05-31 2010-01-13 Panasonic Corporation Lighting system, video display device, and lighting control method
JP2006330708A (en) * 2001-06-28 2006-12-07 Matsushita Electric Ind Co Ltd Reproducing device
JP2004354583A (en) * 2003-05-28 2004-12-16 Sony Corp Device and method to generate music
JP4651307B2 * 2004-05-26 2011-03-16 Sharp Corporation Illumination environment reproduction apparatus and method, and video reproduction apparatus
JP2006310414A * 2005-04-27 Shimatec KK LED lighting apparatus
JP2006331735A (en) * 2005-05-24 2006-12-07 Sharp Corp Audio-visual environment control method, audio-visual environment control device, and image display device
JP2007220651A (en) * 2006-01-20 2007-08-30 Toshiba Lighting & Technology Corp Illumination device, and illumination system for image device
EP2005732A1 (en) * 2006-03-31 2008-12-24 Philips Electronics N.V. Adaptive rendering of video content based on additional frames of content
JP5689574B2 * 2006-11-17 2015-03-25 Nintendo Co., Ltd. Game device, game program, game system, and game control method
JP2011501981A * 2007-09-07 AMBX UK Ltd Method for generating effect scripts corresponding to game play events

Also Published As

Publication number Publication date
JP2011056061A (en) 2011-03-24

Similar Documents

Publication Publication Date Title
US8870655B2 (en) Wireless game controllers
TWI555561B Head-mounted display and method of rendering a game on the screen of a head-mounted display
US9358457B2 (en) Game system, controller device, and game method
JP5705568B2 (en) Game operating device and game system
JP4989105B2 (en) Game controller
US9199166B2 (en) Game system with virtual camera controlled by pointing device
US7833100B2 (en) Video game program and video game system
JP5294442B2 (en) Game device and game program
US20070211027A1 (en) Image processing apparatus and storage medium storing image processing program
US8702514B2 (en) Controller device and controller system
US8882596B2 (en) Game program and game apparatus
JP6243586B2 (en) Game system, game device, game program, and game processing method
US7867089B2 (en) Expanding operating device and operating system
JP2007061271A (en) Game system and game program
JP4907129B2 (en) Information processing system and program
EP2422854A2 (en) Game system, game device, storage medium storing game program, and game process method
JP4703509B2 (en) Game operating device and game system
ES2527047T3 (en) Video game controller and video game system
KR101179020B1 (en) Information processing program
US9132346B2 (en) Connecting video objects and physical objects for handheld projectors
US7831064B2 (en) Position calculation apparatus, storage medium storing position calculation program, game apparatus, and storage medium storing game program
JP5051822B2 (en) Game device with general-purpose remote control function
JP2009092408A (en) Load detection program and load detection device
JP4689585B2 (en) Information processing apparatus and information processing program
WO2004002593A1 (en) Information processor having input system using stroboscope

Legal Events

Date Code Title Description
2011-10-19 RD04 Notification of resignation of power of attorney (Free format text: JAPANESE INTERMEDIATE CODE: A7424)
2011-11-04 RD02 Notification of acceptance of power of attorney (Free format text: JAPANESE INTERMEDIATE CODE: A7422)
2012-07-20 A621 Written request for application examination (Free format text: JAPANESE INTERMEDIATE CODE: A621)
2013-11-21 A131 Notification of reasons for refusal (Free format text: JAPANESE INTERMEDIATE CODE: A131)
2014-01-20 A521 Written amendment (Free format text: JAPANESE INTERMEDIATE CODE: A523)
2014-05-09 A131 Notification of reasons for refusal (Free format text: JAPANESE INTERMEDIATE CODE: A131)
2014-07-08 A521 Written amendment (Free format text: JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
2014-09-12 A01 Written decision to grant a patent or to grant a registration (utility model) (Free format text: JAPANESE INTERMEDIATE CODE: A01)
2014-09-22 A61 First payment of annual fees (during grant procedure) (Free format text: JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 5622372; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R150)
R250 Receipt of annual fees (Free format text: JAPANESE INTERMEDIATE CODE: R250)
R250 Receipt of annual fees (Free format text: JAPANESE INTERMEDIATE CODE: R250)
R250 Receipt of annual fees (Free format text: JAPANESE INTERMEDIATE CODE: R250)